Investigating models from workload patterns in enterprise systems for web performance and capacity

  • Nitin Khosla

    Student thesis: Doctoral Thesis

    Abstract

    Many websites become slow when large numbers of users interact with them simultaneously, for example when seeking crucial information promptly in emergencies such as floods or fire rescues. Slow page loads are a serious problem because internet users are highly sensitive to latency: users are very likely to abandon a page if its loading time exceeds 5 seconds. For large enterprises and businesses this leads to significant losses in revenue and performance, and may adversely affect their reputation or brand image. An Enterprise System Architecture (ESA) comprises large numbers of lightly utilised, managed servers and computers that incur a high cost of ownership, including rent, power for computing, licensing, and the cost of human resources and management activities. Developing computational algorithms and a practical methodology to model and predict the behaviour of internet users has challenged researchers in website performance modelling, especially under high-load, stress and bursty web-traffic conditions. The intrinsic challenge of representing user dynamism and behaviour, which generates multiple transactions and corresponding workload patterns at the server and affects IT system performance, motivated this research to investigate transactional workload patterns and to develop a novel methodology for extracting insights into the performance of web applications under peak load, stress load and volatile web-traffic conditions, including bursts in web traffic. The investigation covers modelling with semi-supervised learning; memory leaks under extended periods of stress that degrade website response times; the impact of wide area network (WAN) parameters on transactional response times; and the capacity of IT resources under high, geographically distributed web traffic affecting performance on desktop and mobile (smartphone) devices. Key performance metrics were evaluated, and a mapping from test data in the ESA test environment to web applications in the live environment was developed and evaluated for these load conditions. The main goals of this research are (i) extracting insights into user behaviour from real-world transactional workload data, (ii) identifying critical factors affecting the performance of web applications, (iii) analysing the impact of web traffic generated from distributed locations, (iv) identifying and predicting user load, (v) developing a machine-learning-based model for predictive analytics of web application performance and system capacity towards optimisation, (vi) evaluating the impact of WAN parameters on transactional responses, and (vii) developing a mapping between the volume of the test data sample used for performance measurement in the ESA test environment and the response times of web applications in the live environment. The following objectives are achieved in this thesis using machine learning models, where a semi-supervised learning model shows promising results for estimating the performance of web applications with a dynamic, data-driven performance testing approach.
    (a) A commonly used enterprise system architecture (ESA) for a large enterprise IT system connecting 129 web applications is characterised and used in the investigation to model improved performance and evaluate its effectiveness towards optimisation.
    (b) A novel model, the Data-Driven Performance Model (DDPM), is developed to investigate a practical data-driven approach to performance and load testing and to capacity modelling in the test environment of the ESA system. The approach enables dynamic tuning of performance parameters based on the web-traffic variations observed in the real world while tuning the dN factor, supports early detection of performance issues using semi-supervised learning, and yields insights for optimisation (an illustrative sketch follows the list below). Performance experiments were performed on more than 380 business transactions across 23 enterprise applications in a test environment configured like a real-world web infrastructure.
    (c) The motivating research environment was simulated for the ESA in a wide area network configuration (WESA) using a WAN emulator (WANSim) to generate web traffic replicating real scenarios originating from globally distributed locations. Performance testing experiments were conducted to identify the WAN factors affecting the performance of web applications and their impact on IT system resources.
    (d) A new approach called “envelopes”, based on semi-supervised learning, was developed and implemented within the ESA to identify memory leak problems in a large enterprise IT system (an illustrative sketch follows the list below). In the past three years, a large number of websites serving millions of people crashed suddenly at the very time people were desperate to find vital information related to the COVID-19 pandemic, and memory leakage was one of the key reasons contributing to these system failures.
    (e) Based on our observations of the heaviest usage of key business transactions in real-time enterprise systems and their critical impact on service responses, the wide area network and distributed load simulation experiments covered seven business-critical transactions: Payments, Search, eLodgements, File Transfers, Storage, Risk Evaluation and Emails. The results identified the four parameters that most affect the performance of web applications: bandwidth, latency, packet loss and background noise (a first-order sketch of their combined effect follows the list below). Transactional response times observed from 11 different countries were analysed, and web-traffic patterns were simulated with thousands of concurrent virtual users, enabling performance models of key transactions that mitigate the risks of website crashes and IT system failures.
    (f) We investigated and proposed a new performance metric, the Volume Data Loading (VDL) factor, denoted ∂v, which enables business owners and IT architects to estimate the performance of key business transactions before an application is released to the real world (a hypothetical reading is sketched after the list below). Using scrambled test data with sample sizes ranging from 1% to 30% of the production data, our experiments found that the new metric provides an objective criterion for defining and quantifying the performance of websites on standard computing devices and mobile phones, including smartphones.
    (g) The performance of web applications, on both desktop and mobile devices (smartphones), was evaluated against the evaluation metrics to demonstrate improvements in the ESA through a data-driven analytics model.
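    A minimal sketch of the semi-supervised idea in item (b): a handful of labelled transaction observations ("normal" vs "degraded") are propagated across unlabelled samples. The feature set (response time, CPU utilisation, concurrent virtual users), the synthetic data and the use of scikit-learn's LabelSpreading are assumptions for illustration, not the thesis's actual DDPM.
        # Semi-supervised tagging of transaction samples as normal (0) vs degraded (1)
        # when only a small fraction of observations is labelled.
        # Hypothetical features: response time (s), CPU utilisation (%), concurrent virtual users.
        import numpy as np
        from sklearn.semi_supervised import LabelSpreading

        rng = np.random.default_rng(0)
        normal = rng.normal([1.5, 40, 200], [0.3, 5, 30], size=(300, 3))
        degraded = rng.normal([6.0, 85, 900], [1.0, 5, 80], size=(60, 3))
        X = np.vstack([normal, degraded])

        y = np.full(len(X), -1)   # -1 marks unlabelled samples
        y[:10] = 0                # a few observations known to be normal
        y[-5:] = 1                # a few observations known to be degraded

        model = LabelSpreading(kernel="knn", n_neighbors=7)
        model.fit(X, y)
        inferred = model.transduction_   # label inferred for every sample
        print(f"{(inferred == 1).sum()} of {len(X)} samples flagged as degraded")
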
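    A minimal sketch of the envelope idea in item (d), under the assumption that an envelope is a per-timestep band (mean plus or minus k standard deviations) learned from leak-free baseline runs, and that a leak is suspected when observed memory stays above the upper band for a sustained window; the band construction and thresholds are assumptions, not the thesis's published method.
        # Envelope-style memory-leak check: learn an upper bound from leak-free
        # baseline runs, then flag a run whose memory usage stays above that
        # bound for a sustained number of consecutive samples.
        import numpy as np

        def build_envelope(baseline_runs, k=3.0):
            """Per-timestep mean +/- k * std across leak-free baseline runs."""
            runs = np.asarray(baseline_runs)          # shape: (n_runs, n_timesteps)
            mean, std = runs.mean(axis=0), runs.std(axis=0)
            return mean - k * std, mean + k * std

        def leak_suspected(observed, upper, sustained=30):
            """True if memory exceeds the upper envelope for `sustained` consecutive samples."""
            streak = 0
            for above in np.asarray(observed) > upper:
                streak = streak + 1 if above else 0
                if streak >= sustained:
                    return True
            return False

        # Synthetic usage: baseline memory plateaus, the suspect run keeps growing.
        t = np.arange(600)
        baseline = [1000 + 200 * (1 - np.exp(-t / 50)) + np.random.normal(0, 5, t.size) for _ in range(5)]
        lower, upper = build_envelope(baseline)
        suspect = 1000 + 0.9 * t + np.random.normal(0, 5, t.size)   # steady upward drift in MB
        print("memory leak suspected:", leak_suspected(suspect, upper))
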
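    A first-order sketch of how the four WAN factors named in item (e) (bandwidth, latency, packet loss and background noise) could be combined into a rough estimate of a transaction's network time; the model form, coefficients and the example payload are illustrative assumptions, not the thesis's fitted results.
        # Rough network-time estimate for one business transaction under given WAN conditions.
        def transaction_network_time(payload_kb, bandwidth_mbps, latency_ms,
                                     loss_rate, background_share, round_trips=6):
            usable_mbps = bandwidth_mbps * (1.0 - background_share)  # background traffic steals bandwidth
            transfer_s = (payload_kb * 8 / 1000) / usable_mbps       # serialisation time for the payload
            chatter_s = round_trips * 2 * latency_ms / 1000          # request/response round trips
            retransmit_s = transfer_s * loss_rate * 3                # crude retransmission penalty
            return transfer_s + chatter_s + retransmit_s

        # Example: a 500 KB "Payments" page fetched over a congested 10 Mbps link from a distant office.
        print(round(transaction_network_time(500, 10, 150, 0.02, 0.4), 2), "seconds")
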
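    A hypothetical reading of the Volume Data Loading factor from item (f): the abstract gives only the symbol ∂v and the 1% to 30% sampling range, so both the definition used here (test data volume divided by production volume) and the sub-linear scaling used to project live response times are illustrative assumptions, not the thesis's published mapping.
        # Hypothetical VDL factor and projection of live response time from a test-environment measurement.
        def vdl_factor(test_rows, production_rows):
            return test_rows / production_rows        # e.g. 0.01 .. 0.30 for the experiments' samples

        def projected_live_response(test_response_s, dv, exponent=0.35):
            # Assumed: response time grows sub-linearly as data volume approaches the full production set.
            return test_response_s * (1.0 / dv) ** exponent

        dv = vdl_factor(test_rows=2_000_000, production_rows=20_000_000)   # a 10% scrambled sample
        print(f"dv = {dv:.2f}, projected live response ~ {projected_live_response(1.8, dv):.1f} s")
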
    The thesis concludes with a summary of research outcomes against each identified research question. It also identifies future extensions, including similar workload analyses to model and predict web-traffic patterns associated with potential cyber-attacks. The outcomes of this research can further be extended to other ESAs and to complex problems such as medical investigations, analysing electrocardiogram patterns to predict heart attacks, modelling performance from web-traffic insights over Low Earth Orbit (LEO) satellite channels, and web-traffic patterns related to cyber-attacks.
    Date of Award: 2023
    Original language: English
    Supervisor: Dharmendra Sharma AM PhD (Supervisor) & Dat TRAN (Supervisor)
