Blog by: Nikhil Ratnaparkhi

10 May 2023

Adopting a microservices architecture requires an overhaul of the performance testing approach. The traditional approach was to run a batch of performance tests against 'production-inspired' workloads in a 'production-like' test environment. This approach introduces many complexities that go against the grain of DevOps and microservices-driven development. Several pieces must come together before a batch test can be performed in an integrated environment. Multiple teams must be on stand-by while the test is conducted in a lower environment. The cycle time to review results, fix bugs and retest is too high. The infrastructure and engineering bandwidth cost of the whole exercise outweighs the benefits derived.

With containerization and auto-scaling infrastructure, it is now possible to shift performance tests left. Performance tests can be executed for individual services at the container level. We can build early confidence with low-traffic tests and then scale the deployment for production volumes. Modern deployment strategies, such as Blue-Green deployment, allow a final assessment of how new code handles transaction volumes during go-live.

How Lowe’s is Transforming Performance Testing

Performance testing keeps a check on the behaviour of an application across different situations. A system may work effectively with a specific number of concurrent users, yet degrade or fail when thousands more arrive during peak traffic. Performance tests thus help establish the scalability, speed and stability of software applications. Different types of performance tests simulate different user scenarios to reveal how the application behaves under each.

With rising competition in the digital space and the need to rank at the top of the category, performance testing has become critical for enterprises. It is essential for ensuring the speed, stability, dependability and scalability of an application. Every application is built against specific expectations, and performance testing verifies that it meets them.

The biggest challenges to early performance testing are tooling that can generate custom loads quickly and telemetry that can gather key metrics such as CPU, memory and network utilization. At Lowe's, these challenges are addressed through a performance test tool built in-house as part of the Engineering Services within the broader PaaS platform. With this tool at hand, development engineers are not dependent on a separate team to run early performance tests. The tool is powered by K6 – a developer-friendly, API-driven and lightweight load testing solution from Grafana.
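To give a feel for the kind of test a developer can run on their own, here is a minimal K6 sketch (written as TypeScript, which recent k6 releases can run directly; older builds need it compiled to JavaScript first). The target URL, virtual-user count and latency check are illustrative assumptions, not defaults of the Lowe's tool:

```typescript
import http from 'k6/http';
import { check, sleep } from 'k6';

// Low-traffic test against a single microservice endpoint.
// The URL and load figures below are placeholders for illustration.
export const options = {
  vus: 10,         // 10 concurrent virtual users
  duration: '2m',  // hold the load for two minutes
};

export default function () {
  const res = http.get('https://my-service.example.internal/api/v1/items');

  // Basic correctness and latency checks recorded on every iteration.
  check(res, {
    'status is 200': (r) => r.status === 200,
    'response under 500ms': (r) => r.timings.duration < 500,
  });

  sleep(1); // think time between iterations for each virtual user
}
```

A script like this can be run locally or from a container with `k6 run script.ts`, which is what makes it practical to exercise one service at a time well before an integrated environment exists.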

The performance testing tool comes integrated with the PaaS Console, which helps with quick onboarding of any service hosted on the hybrid cloud platform. The ability to quickly configure performance tests, spawn a container and run a load test against a target service allows engineers to frequently assess the ability of each microservice to handle traffic while meeting performance, resiliency and availability SLOs. The tool's GUI makes it easy for engineers to set test run parameters such as protocol, rate of ramp-up, target environment and test duration. The tool produces integrated reports detailing error rates, response times and resource utilization, and users can compare performance metrics between runs or, for live services, against production. Beyond load testing, microservice-level load generation also supports chaos engineering experiments that require simulating peak-volume production scenarios.
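As a rough sketch of how such test run parameters (ramp-up and duration) and SLO-style checks (error rate, response time) map onto a K6 script, the staged profile, thresholds and TARGET_URL environment variable below are assumptions for illustration, not the platform's actual configuration:

```typescript
import http from 'k6/http';
import { check } from 'k6';

// Staged ramp-up with SLO-style thresholds. All values here are illustrative.
export const options = {
  stages: [
    { duration: '2m', target: 50 },  // ramp up to 50 virtual users
    { duration: '5m', target: 50 },  // hold peak load
    { duration: '1m', target: 0 },   // ramp down
  ],
  thresholds: {
    http_req_failed: ['rate<0.01'],    // error rate below 1%
    http_req_duration: ['p(95)<300'],  // 95th percentile latency under 300 ms
  },
};

export default function () {
  const res = http.get(`${__ENV.TARGET_URL}/health`);
  check(res, { 'status is 200': (r) => r.status === 200 });
}
```

If a threshold is breached, the run fails, which gives each run a clear pass/fail verdict against the service's SLOs rather than leaving the result to manual interpretation.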

As we accelerate service development and deployment, adopting performance test accelerators gives engineers early clarity into the production readiness and scalability of their code.

Bottom Line

Microservice-based architecture is constantly evolving, and software developers are embracing the change. New-age business needs and rising competition demand a best-in-class product, which has made performance testing of microservices-based applications more important than ever. Each microservice must be thoroughly performance tested and validated to establish the stability and elasticity of the application. This goes a long way toward ensuring that each microservice can handle large volumes of traffic, qualifying the application as a great product.