The product is a customer-facing, revenue-generating application that requires high availability and a highly responsive UI. The application followed a conventional CI/CD approach, with built-in unit tests and functional tests as part of the testing procedure.
Need for the solution
An automated performance test needed to be built into the CI/CD pipeline to improve and expedite the existing testing procedure, so that application performance under a predetermined load could be measured and any performance bugs or bottlenecks detected proactively.
The customer expected the following features to be part of the performance pipeline for it to be an effective solution.
Implementation of the solution
Automatically triggering performance tests in dev and QA in a Jenkins Kubernetes environment without using the Jenkins Performance Plugin
LoadRunner Enterprise 2020 is used for the design and execution of the performance test scenarios. The performance pipeline code is implemented as part of a Jenkins shared library, with GitHub as the SCM. The pipeline code has a set of steps for the end-to-end test run and post-run activities.

A Docker image for running the performance test is defined with all dependencies pre-installed, and a single-container Kubernetes pod is defined to pull the performance image and run it as the container. The performance test is triggered from inside the container by first calling the LoadRunner Enterprise authentication API. During the test run, the status of the performance run is monitored programmatically by calling the LoadRunner Enterprise status-check APIs. After test completion, a trend report is generated along with an HTML test summary report.
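The status-monitoring loop described above can be sketched as follows. The state names and the `fetch_status` stub are illustrative assumptions, not the exact LoadRunner Enterprise API; in the real pipeline, `fetch_status` would call the LoadRunner Enterprise status-check endpoint with an authenticated session.

```python
import time
from typing import Callable

# Run states that end the polling loop (assumed names, not the
# exact LoadRunner Enterprise state strings).
TERMINAL_STATES = {"Finished", "Failed", "Aborted"}

def poll_run_status(fetch_status: Callable[[], str],
                    interval_sec: float = 30.0,
                    timeout_sec: float = 7200.0) -> str:
    """Poll the performance run status until it reaches a terminal state.

    fetch_status is expected to call the LoadRunner Enterprise
    status-check API (after authenticating) and return the current
    run state as a string.
    """
    deadline = time.monotonic() + timeout_sec
    while time.monotonic() < deadline:
        state = fetch_status()
        if state in TERMINAL_STATES:
            return state
        time.sleep(interval_sec)
    raise TimeoutError("performance run did not finish in time")

if __name__ == "__main__":
    # Stub fetcher standing in for the real API call.
    states = iter(["Initializing", "Running", "Running", "Finished"])
    print(poll_run_status(lambda: next(states), interval_sec=0.01))
```

Injecting the fetcher keeps the polling logic testable without a live LoadRunner Enterprise instance.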
Provide users with a pass/fail result
After test completion, a test summary report is generated by LoadRunner Enterprise. The pipeline code downloads the summary report document and parses it to extract the relevant transaction information: transaction name, average response time, minimum response time, maximum response time, 90th-percentile response time, and the counts of passed and failed transactions. The failure rate is calculated and compared with the error SLA, and the 90th-percentile response time is compared with the response-time SLA. Based on the success or failure of the transactions, the build status is determined.
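The SLA comparison above can be sketched as a small function. The field names and the `Transaction` shape are assumptions mirroring the values parsed from the summary report, not an actual pipeline API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Transaction:
    # Fields mirror the values parsed from the summary report
    # (assumed names for illustration).
    name: str
    avg_rt: float    # average response time, seconds
    pct90_rt: float  # 90th-percentile response time, seconds
    passed: int      # count of passed transaction instances
    failed: int      # count of failed transaction instances

def build_status(transactions: List[Transaction],
                 error_sla_pct: float,
                 rt_sla_sec: float) -> str:
    """Return 'SUCCESS' or 'FAILURE' by checking each transaction's
    failure rate against the error SLA and its 90th-percentile
    response time against the response-time SLA."""
    for t in transactions:
        total = t.passed + t.failed
        failure_rate = (t.failed / total * 100) if total else 0.0
        if failure_rate > error_sla_pct or t.pct90_rt > rt_sla_sec:
            return "FAILURE"
    return "SUCCESS"
```

The returned string can then be mapped directly to the Jenkins build result.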
Creating Trend Report
The trend report is generated and downloaded by making the relevant LoadRunner Enterprise API calls after completion of the test.
Display results in Report Portal
The performance pipeline is also integrated with Report Portal, a test automation analytics platform. The transaction details extracted from the HTML test summary report are mapped into JUnit XML format and pushed to Report Portal using its API. The S3 bucket and build path locations are also published in Report Portal.
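The JUnit XML mapping can be sketched with the standard library; the dictionary keys and suite name are assumed shapes for illustration, mirroring the values parsed from the summary report.

```python
import xml.etree.ElementTree as ET

def transactions_to_junit_xml(transactions):
    """Map parsed transaction results to a JUnit XML document.

    Each transaction becomes a <testcase>; transactions with failed
    instances get a <failure> child so Report Portal marks them as
    failed. `transactions` is a list of dicts with the keys used
    below (an assumed shape, not the pipeline's real data model).
    """
    suite = ET.Element("testsuite", name="performance-tests",
                       tests=str(len(transactions)))
    for t in transactions:
        case = ET.SubElement(suite, "testcase",
                             name=t["name"], time=str(t["avg_rt"]))
        if t["failed"] > 0:
            ET.SubElement(case, "failure",
                          message=f'{t["failed"]} failed instances')
    return ET.tostring(suite, encoding="unicode")
```

The resulting XML string can be posted to the Report Portal import endpoint as a regular JUnit result file.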
Storing result in AWS S3 bucket
Both the trend report and the summary report are downloaded and then uploaded to an AWS S3 bucket using the AWS CLI. The pod is configured with the IAM role required for the upload.
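The upload step can be sketched as a thin wrapper around the AWS CLI. The bucket and prefix values are placeholders; credentials come from the pod's IAM role, so none are passed explicitly.

```python
import subprocess

def upload_report(local_path: str, bucket: str, key_prefix: str,
                  dry_run: bool = False):
    """Upload a report file to S3 via `aws s3 cp`.

    With dry_run=True the command list is returned without being
    executed, which keeps the function testable offline.
    """
    filename = local_path.rsplit("/", 1)[-1]
    cmd = ["aws", "s3", "cp", local_path,
           f"s3://{bucket}/{key_prefix}/{filename}"]
    if not dry_run:
        subprocess.run(cmd, check=True)  # raises on a failed upload
    return cmd
```

In the pipeline, the same call would be made once for the trend report and once for the summary report, with the build path as the key prefix.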
Architecture Diagram of the solution