How Telia is modernizing performance testing with Tricentis NeoLoad
Hear from Telia, one of Sweden’s largest telecommunications companies, about how it uses NeoLoad for performance testing.
For some years now, Tricentis NeoLoad has been championing continuous performance validation (CPV), a process that enables teams to produce faster applications, deliver new features and enhancements in less time, and simplify interactions across Dev, QA, Ops, and business stakeholders.
Whenever a company introduces a new concept to the industry, you might be inclined to dismiss it as the next batch of marketing mumbo jumbo designed to sell you software you don’t need. While it’s wise to remain skeptical of burgeoning initiatives, it’s also wise to, at the very least, explore concepts for yourself to determine whether they could improve your testing activities and application performance.
While some of you may have already looked into continuous performance validation, we’d like to take this opportunity to discuss the concept in more detail and explain why it works for organizations using processes and practices like Agile, DevOps, continuous integration, continuous delivery, and automated testing.
Let’s start by breaking down the phrase: what is performance validation, and why should you do it continuously?
Performance validation means performance testing at every stage to prove that performance is at the level you desire, as defined by your predetermined service level agreements (SLAs). Validation requires that performance meet or exceed the levels you previously set.
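The core idea, checking measured results against predetermined SLA thresholds, can be sketched in a few lines. This is an illustrative example, not a NeoLoad feature; the metric names and limits are hypothetical placeholders for whatever SLAs your team has defined.

```python
# Hypothetical SLA thresholds; lower measured values are better.
SLAS = {"avg_response_ms": 250, "p95_response_ms": 800, "error_rate": 0.01}

def validate_slas(measured: dict, slas: dict = SLAS) -> dict:
    """Return a pass/fail verdict for each SLA metric."""
    return {name: measured[name] <= limit for name, limit in slas.items()}

# A run whose 95th-percentile latency exceeds its limit fails that
# metric while the others pass:
verdict = validate_slas(
    {"avg_response_ms": 180, "p95_response_ms": 950, "error_rate": 0.002}
)
```

A run "validates" only when every metric passes; any single failing metric is grounds to reject the build or raise an alert.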
Continuous performance validation requires that teams test, monitor, and improve performance at every stage of the application development lifecycle, from development to production, utilizing automated and collaborative tooling.
User expectations of application performance have never been higher. As a result, it is absolutely essential for organizations to validate performance continuously throughout the SDLC. Doing so will allow for the early identification and rectification of coding errors and other performance issues that can become much more expensive and bothersome to resolve later in the cycle.
At the foundation of continuous performance validation lies a set of building blocks: performance testing scenarios that sit alongside more traditional functional unit tests. Starting with component tests at the API level and progressing to system-wide business use cases that mimic how users actually use your app, these testing scenarios can be utilized throughout the development process so you can evaluate performance early and often, all along the way.
In development, test individual components. Use a performance testing tool like NeoLoad to test APIs, web services, and microservices, even if your app doesn’t have a GUI yet. NeoLoad integrates with your CI server, enabling the execution of performance unit tests on every build. This will not only allow you to spot performance regressions early, it will also ensure that service level agreements (SLAs) are met in each build.
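A performance unit test of this kind can be sketched generically: time repeated calls to a component and fail the build when a percentile latency breaches the SLA. This is a minimal stand-in for what a dedicated tool does, not NeoLoad’s actual API; `call_api` and the thresholds are assumptions for illustration.

```python
import statistics
import time

def call_api() -> None:
    # Stand-in for a real call to the API, web service, or microservice
    # under test; replace with an actual HTTP request in practice.
    time.sleep(0.001)

def perf_unit_test(call, iterations: int = 50, p95_limit_ms: float = 500.0) -> bool:
    """Time repeated calls and check the 95th percentile against an SLA."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        call()
        samples.append((time.perf_counter() - start) * 1000.0)
    p95 = statistics.quantiles(samples, n=100)[94]  # 95th-percentile latency
    return p95 <= p95_limit_ms

# In a CI pipeline, a breached SLA fails the build:
assert perf_unit_test(call_api), "performance SLA violated in this build"
```

Because the check is just another assertion, the CI server treats a performance regression exactly like a failing functional test.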
In the test phase, again using NeoLoad, conduct system-wide testing under realistic conditions. You can leverage your unit test library to build more complex scenarios that include network virtualization and device/browser simulation. With the NeoLoad hybrid cloud load generation platform, you also have the ability to add a layer of geographic realism to your tests.
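The idea of composing API-level building blocks into a system-wide scenario can be illustrated as a user journey: the same step functions exercised individually earlier are chained in order, with think time between steps to mimic real user behavior. The step names and the `run_journey` helper are hypothetical, not part of any NeoLoad API.

```python
import time

# Illustrative steps, each a reusable building block from the unit
# test library; in practice each would drive a real transaction.
def login(): return "logged-in"
def browse_catalog(): return "catalog"
def checkout(): return "order-confirmed"

def run_journey(steps, think_time_s: float = 0.0) -> list:
    """Execute steps in order, pausing between them like a real user."""
    results = []
    for step in steps:
        results.append(step())
        time.sleep(think_time_s)  # simulate user think time
    return results

journey = run_journey([login, browse_catalog, checkout])
```

Running many such journeys concurrently, across simulated networks, devices, and geographies, is what turns component tests into a realistic system-wide load scenario.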
In production, you’ll want to validate performance for live users. Proactively monitor service level agreements 24×7, run synthetic users alongside real users, and monitor the same test cases you used in pre-production.
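Synthetic monitoring of this kind boils down to replaying a known check against the live system on a schedule and collecting SLA breaches for alerting. The sketch below shows the pattern; the probe and threshold are hypothetical placeholders, and a real probe would perform an actual transaction against production.

```python
import time

def probe_homepage() -> float:
    """Stand-in for a synthetic user transaction; returns latency in ms."""
    start = time.perf_counter()
    # A real HTTP call against the production system would go here.
    return (time.perf_counter() - start) * 1000.0

def monitor(probe, sla_ms: float, checks: int, interval_s: float = 0.0) -> list:
    """Run the probe repeatedly; collect SLA breaches for alerting."""
    breaches = []
    for _ in range(checks):
        latency = probe()
        if latency > sla_ms:
            breaches.append(latency)
        time.sleep(interval_s)  # pacing between synthetic checks
    return breaches

# No alerts are raised when every probe stays within its SLA:
alerts = monitor(probe_homepage, sla_ms=1000.0, checks=3)
```

Because the synthetic checks reuse the same scenarios validated in pre-production, a breach in production points directly at a regression rather than at a gap in test coverage.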
With these load testing and performance monitoring tools, teams can collaborate throughout the entire software development lifecycle, sharing tests and results across Dev, QA, Ops, and business stakeholders.
Continuous performance validation is a powerful process for web and mobile applications that need to perform. If you want some expert advice on getting started, check out our white paper, “A practical guide to continuous performance testing.”
This post was originally published in 2016 and was most recently updated in July 2021.