Putting SLAs on the Agile board

Author: Guest Contributors
Date: Jul. 20, 2021

By Deb Cobb, Enterprise Product Manager

When it comes to application testing, many project managers and test leads do not routinely conduct performance and load testing early in the development lifecycle. Instead, they undertake performance and load testing after the application is complete, at the point where functional testing is applied. In fact, in many organizations the performance test is frequently the last step, almost an afterthought, conducted right before the application goes into production.

This approach creates a classic problem: late-stage testing. Whenever testers identify issues, developers must modify long-finalized code to fix them, and those code changes can break other parts of the application. Addressing problems after the fact is time-consuming and expensive. Furthermore, any delay in releasing a new feature or a new app can directly impact revenue, competitive position, brand, and adoption.

Even in organizations running an Agile development process, the performance test may not be conducted in a genuinely Agile way. Delaying a performance test to the “final sprint” as a pre-release task treats application testing like a waterfall development step, with all the cost and risk that entails. This post discusses how to make load testing a regular “early and often” exercise by elevating SLAs to the Agile task board.

In a DevOps world, late-stage testing is not sustainable

In today’s app economy, where organizations must prioritize both revenue attainment and the customer experience, late-stage testing is no longer a sustainable strategy. For brands to survive in this ever-evolving and competitive marketplace, they need to balance the speed at which they create code with the quality of the code they release.

To cope with these competitive pressures, many organizations have implemented DevOps alongside Agile and Agile-like development methodologies. As a result, development teams are creating more code at a faster pace — code that needs to be thoroughly tested at an accelerated pace. All these pressures combine to create more Agile testing environments where testers often own the entire, end-to-end testing process, including automated, unit, regression, and load and performance testing. In these environments, testers must keep up with the speed of development while also meeting heightened expectations of quality.

To ensure success in their testing endeavors, QA teams and performance testers are increasingly relying on SLAs and including them on their Agile task boards.

What is a performance SLA?

A service level agreement (SLA) is a contract between a service provider (whether in-house or an external firm) and the client. An SLA defines the level of service the provider must deliver to ensure a satisfactory customer experience, including the attributes depicting how the service will be delivered and the thresholds that define acceptable performance. For example, a large company that outsources its help desk operations may require all incoming customer support emails to undergo a triage that triggers a receipt-confirmation email within one hour. The agreement allows the client to build its business and brand, knowing that the desired level of service will be executed according to operational specifications.

Defining SLAs

Testing teams can factor performance and load testing into their continuous integration (CI) process by focusing on performance SLAs. Performance testers should think of their SLAs in terms of performance parameters, because each SLA is an implicit contract with end users. SLAs will vary by organization, but every one will describe a specific benchmark or standard that the application must meet. From a testing perspective, the SLA details a measurable requirement that can be tested and marked as “pass” or “fail,” and usually relates to system availability and response time. For example, an SLA may state that all pages should load within 2 seconds or that the first page of search results be displayed within 4 seconds. Though they are not hard-and-fast legal contracts, SLAs set the desired performance benchmarks.
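
To make this concrete, here is a minimal sketch of what such a pass/fail SLA check might look like as an automated test. We use Python with pytest and the requests library; the URL, thresholds, and test names are illustrative assumptions, not values from any particular SLA.

```python
# sla_test.py -- a minimal sketch of pass/fail performance SLA checks.
# The URL and thresholds below are hypothetical examples.
import time

import requests

PAGE_LOAD_SLA = 2.0  # hypothetical: "all pages should load within 2 seconds"
SEARCH_SLA = 4.0     # hypothetical: "first page of search results within 4 seconds"


def test_home_page_meets_sla():
    start = time.monotonic()
    response = requests.get("https://example.com/", timeout=10)
    elapsed = time.monotonic() - start

    assert response.status_code == 200
    # The SLA turns a timing into a measurable pass/fail requirement.
    assert elapsed <= PAGE_LOAD_SLA, (
        f"Home page took {elapsed:.2f}s (SLA: {PAGE_LOAD_SLA}s)"
    )


def test_search_results_meet_sla():
    start = time.monotonic()
    response = requests.get(
        "https://example.com/search", params={"q": "widgets"}, timeout=10
    )
    elapsed = time.monotonic() - start

    assert response.status_code == 200
    assert elapsed <= SEARCH_SLA, f"Search took {elapsed:.2f}s (SLA: {SEARCH_SLA}s)"
```

Because each check is a plain assertion, it can run in CI and report “pass” or “fail” exactly like any functional test.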

When testers specify SLA items during the application design process, they can turn them into requirements that are included on the Agile task board. That way, developers will take performance into account as they develop the application, rather than approaching it as an afterthought.

The performance metrics an SLA should address

SLAs can address all sorts of website metrics. The goal for performance testers is to have the key parameters specified up front. When this step becomes ingrained in organizational behavior, developers can code for performance from the beginning of development cycles.

As a first step, we recommend that QA teams specify enough detail in their performance SLAs to support automated smoke testing. Consider what happens in a smoke test: once built, the application is deployed and a series of tests run to make sure everything works. One of those tests can be as simple as setting up a user and executing a user login. Adding a performance SLA to the smoke test might then require that the login process complete within a given timeframe.
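
As an illustration, a login smoke test with an SLA attached might look something like the sketch below. The endpoint, credentials, and the 3-second budget are all hypothetical.

```python
# A sketch of a smoke-test step with a performance SLA attached.
# Endpoint, payload, and the 3-second budget are assumptions for illustration.
import time

import requests

LOGIN_SLA_SECONDS = 3.0  # hypothetical budget for the login round trip


def test_login_completes_within_sla():
    start = time.monotonic()
    response = requests.post(
        "https://example.com/api/login",
        json={"username": "smoke-user", "password": "smoke-pass"},
        timeout=10,
    )
    elapsed = time.monotonic() - start

    # Functional smoke check: can a user log in at all?
    assert response.status_code == 200
    # Performance SLA: the same step must also finish within its budget.
    assert elapsed <= LOGIN_SLA_SECONDS, (
        f"Login took {elapsed:.2f}s, SLA is {LOGIN_SLA_SECONDS}s"
    )
```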

When teams add that requirement to the Agile task board, suddenly they are automating performance testing at the beginning of the development process. As more of the application develops, performance testers can add more requirements that relate to the modules and features created.

Although every application will have its own critical performance metrics, it’s a good idea to identify a set of essential pages to test regularly. These pages can include the home page, shopping cart, contact form, search results, chat window, etc. The SLAs that teams create should define requirements for a set of metrics (see the measurement sketch after this list), such as:

  • DNS resolution time
  • Time-to-display
  • Time-to-last-byte
  • System uptime
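
As a rough illustration, the sketch below measures two of these metrics for a single URL in Python. The host, URL, and measurement approach (a fresh getaddrinfo call for DNS timing, a streamed request consumed to the last byte for TTLB) are illustrative simplifications, not a production-grade harness.

```python
# A sketch of measuring DNS resolution time and time-to-last-byte for one URL.
import socket
import time

import requests


def measure(host: str, url: str) -> dict:
    # DNS resolution time: how long a lookup of the host takes.
    # (The OS resolver may cache results, so treat this as approximate.)
    t0 = time.monotonic()
    socket.getaddrinfo(host, 443)
    dns_time = time.monotonic() - t0

    # Time-to-last-byte: request the page and consume the full body.
    t1 = time.monotonic()
    response = requests.get(url, timeout=10, stream=True)
    _ = response.content  # reading .content pulls the body to the last byte
    ttlb = time.monotonic() - t1

    return {"dns_resolution_s": dns_time, "time_to_last_byte_s": ttlb}


if __name__ == "__main__":
    print(measure("example.com", "https://example.com/"))
```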

Finally, determine a base level of network types (e.g., Wi-Fi, 4G, 5G), platforms, operating systems, and browsers that each SLA will address. The number of device and variable combinations can be daunting. To avoid frustrating developers, set realistic expectations that apply to the majority of application end users. This will keep everyone happy!
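
One way to keep that matrix manageable is to encode the agreed base level as data and parametrize a single test over it. In this hypothetical sketch, the profiles, budgets, and the load_home_page helper are all assumptions for illustration.

```python
# A sketch of driving one SLA test across a (network, browser) matrix.
import pytest

# (network, browser) -> page-load budget in seconds; values are illustrative.
SLA_MATRIX = {
    ("wifi", "chrome"): 2.0,
    ("5g", "chrome"): 2.5,
    ("4g", "chrome"): 3.5,
    ("4g", "safari"): 3.5,
}


def load_home_page(network: str, browser: str) -> float:
    # Placeholder hook: a real implementation would drive a browser under
    # network throttling and return the measured load time in seconds.
    return 1.0  # stubbed measurement so the sketch runs end to end


@pytest.mark.parametrize(("network", "browser"), sorted(SLA_MATRIX))
def test_home_page_meets_profile_budget(network, browser):
    budget = SLA_MATRIX[(network, browser)]
    elapsed = load_home_page(network, browser)
    assert elapsed <= budget, f"{network}/{browser}: {elapsed:.2f}s > {budget}s"
```

Keeping the matrix as data makes the agreed base level visible in one place and easy to extend as new profiles are added.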

The frequency of load and performance testing

Once SLAs are in place, how often should the team conduct performance and load tests? Remember that testing happens at different scales throughout the Agile development process. In many Agile shops, the automated build process includes some tests that are executed whenever a build compiles. At that point, there may be additional testing that occurs on a more thorough, though less frequent, basis. In many cases, test scripts execute whenever code is checked in.

Work to align automated performance testing with the stage of the development process. It’s unreasonable to run a full load test every time the application is smoke tested, but something simple can match the smoke test’s scope, intent, and turnaround time. These more limited tests will fit in nicely with what’s already happening. If QA teams apply their best judgment and align SLAs with the work in progress, performance goals have a much better chance of being achieved.
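
One lightweight way to express this alignment, assuming a pytest-based suite, is to tag tests by tier and let each pipeline stage select only the tier it can afford. The marker names and test cases below are hypothetical.

```python
# A sketch of scoping performance checks to the pipeline stage via markers.
import pytest


@pytest.mark.smoke
def test_login_quick_timing_budget():
    """Fast single-user timing check; cheap enough to run on every build."""
    # ... issue one login request and assert it beats the SLA budget ...


@pytest.mark.load
def test_checkout_under_heavy_concurrency():
    """Expensive multi-user load test; run nightly or before a release."""
    # ... drive many concurrent sessions and assert SLA percentiles hold ...
```

A build job might then run pytest -m smoke on every compile, while a nightly job runs pytest -m load; registering the markers in pytest.ini keeps pytest from warning about unknown marks.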

The SLA: The trade secret for quick and efficient load testing

Every application must meet minimum performance SLAs. Because Agile teams are chartered to add more features and functionality to applications, optimizing application performance can become an afterthought. Further, user stories tend to be written from a functional perspective; they seldom specify application performance requirements. If Agile teams are to consider performance with the gravitas it deserves, then it needs to appear prominently on Agile task boards.

Baking SLAs into the creation of Agile task boards promotes collaboration between development and testing teams and enhances visibility into actual application performance. When QA teams adopt this strategy, they see several benefits. As their applications and processes mature, they’ll end up with a rich library of SLA requirements across their portfolio of test structures. They will also have a collection of automated tests that can validate those SLAs within the right testing constructs: smoke tests, comprehensive automated tests, unit tests, and others. Securing these economies of scale will prove valuable to performance testers as they race to keep pace with aggressive development and release cycles. Ultimately, teams can test code faster and with greater efficiency.

Deb Cobb’s profile on LinkedIn

This blog was originally published in October 2018 and was refreshed in July 2021.
