ITERGO extends continuous performance testing to Citrix-virtualized applications

Overview

ERGO is one of the major insurance groups in Germany and Europe. The Group operates in around 30 countries worldwide, focusing mainly on Europe and Asia. In its home market of Germany, ERGO ranks among the leading providers across all segments. ITERGO is part of the ERGO Group’s technology and services management structure, focused on delivering future- and customer-oriented IT solutions to the Group. Triscon is a Vienna-based service partner specializing in all aspects of performance testing. 

Like every large enterprise, ITERGO has a wide range of performance testing requirements: monolithic enterprise applications (such as SAP and Citrix-delivered apps) as well as microservice-based architectures, end-to-end testing as well as API testing, and more. ITERGO realized that its highly manual approach to test design and maintenance could not keep pace with the accelerating volume, velocity, and variety of software releases.  

NeoLoad-certified service partner Triscon was brought in to help ITERGO transform and modernize its performance testing practice. Triscon leveraged NeoLoad’s automation capabilities and core integrations to systematically and steadily accelerate performance testing cycles, increase test coverage, and improve root-cause analysis, even for complex testing situations like Citrix-virtualized applications. Triscon’s goal at ITERGO is fully automated, “self-service” performance testing. 

With Tricentis NeoLoad, ITERGO has progressively accelerated performance testing cycles while expanding test coverage (now including automated tests for Citrix-virtualized apps) and is well on the road to “self-service” performance testing. 

Challenges

  • High degree of manual trial-and-error scripting for Citrix-virtualized applications 
  • Needed to ensure that an upgrade to a critical business application did not degrade performance 
  • No existing baseline performance metrics for the critical Citrix application 
  • Performance testing could not keep pace with the increasing speed of releases 
  • Limited test coverage introduced business risk 

Solution

“The first thing we did was switch to a load testing tool that actually helps us spend less time on script maintenance and test design, thanks to its auto-correlation features and frameworks. Then we automated test design by reusing existing functional tests and having fewer breakable scripts,” said Roman Ferstl of Triscon.  

They moved from Visual Studio to NeoLoad, which immediately reduced test design time by 30-50%. Then they used the NeoLoad-Selenium integration to convert functional tests to performance tests with just a click, further cutting script maintenance by 40-90%. “With these two steps, we have been able to go from 15-20 tests per year for about 5 applications to 200-250 tests per year for more than 40 applications. And we did this by adding only two more people to run 10X more tests and cover 8X more applications.” 
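In practice, that conversion means wrapping an existing Selenium script so that NeoLoad records it as a User Path. The sketch below illustrates the pattern using the class names of the publicly documented NeoLoad Selenium Java integration (NLWebDriverFactory, NLWebDriver); treat those names, along with the placeholder URL and element IDs, as assumptions to verify against your NeoLoad version.

    import org.openqa.selenium.By;
    import org.openqa.selenium.chrome.ChromeDriver;
    import com.neotys.selenium.proxies.NLWebDriver;
    import com.neotys.selenium.proxies.NLWebDriverFactory;

    public class LoginJourney {
        public static void main(String[] args) {
            // Wrap the ordinary functional driver; "LoginJourney" becomes the
            // name of the User Path recorded in NeoLoad.
            NLWebDriver driver =
                    NLWebDriverFactory.newNLWebDriver(new ChromeDriver(), "LoginJourney");
            try {
                // Each startTransaction call ends the previous one, mapping the
                // functional steps onto measurable performance transactions.
                driver.startTransaction("open_home_page");
                driver.get("https://portal.example.internal/"); // placeholder URL
                driver.startTransaction("log_in");
                driver.findElement(By.id("username")).sendKeys("demo-user"); // placeholder IDs
                driver.findElement(By.id("password")).sendKeys("demo-pass");
                driver.findElement(By.id("submit")).click();
                // ...remaining functional steps unchanged...
            } finally {
                driver.quit(); // finalizes the recording
            }
        }
    }

Because the wrapped driver behaves like a normal Selenium WebDriver, the functional test itself stays unchanged; only the transaction markers are added, which is what keeps maintenance low.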

“We use the NeoLoad-Dynatrace integration to analyze our results in the most efficient way, since we don’t have to re-run tests to do extra profiling,” continued Roman. The Dynatrace-NeoLoad integration is bi-directional: whatever monitoring metrics Dynatrace captures are visible in NeoLoad, and whatever performance test metrics NeoLoad captures are visible in Dynatrace. Nobody has to jump from screen to screen, cobbling together data from different tools. Root-cause analysis is fast and accurate. 
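The built-in integration handles that correlation automatically; purely as an illustration of what it involves, the sketch below marks the start of a test window by pushing a custom event through Dynatrace’s public Events API v2. The environment URL, token variable, and property names are placeholders, not ITERGO’s actual configuration.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class AnnotateLoadTest {
        public static void main(String[] args) throws Exception {
            // Marks the start of a load test in Dynatrace so monitoring data can
            // later be correlated with the test window.
            String body = """
                {
                  "eventType": "CUSTOM_INFO",
                  "title": "NeoLoad test started: checkout_baseline",
                  "properties": { "tool": "NeoLoad", "scenario": "checkout_baseline" }
                }""";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://YOUR-ENV.live.dynatrace.com/api/v2/events/ingest"))
                    .header("Authorization", "Api-Token " + System.getenv("DT_API_TOKEN"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }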

Taking automated performance testing to the next level: Citrix 

Testing Citrix-virtualized applications is notoriously difficult with hand-coded scripting tools; you often have to rewrite a script three or four times before you get it right. NeoLoad, by contrast, uses a visual test design approach: you can see exactly what the test looks like in real time. It’s one and done: when you’ve finished designing the test, you know it’s accurate and ready to run load against.  

“Now, we knew there would be major changes from introducing a new system, so basically what we did was baseline-test ‘before’ and ‘after’ to see whether it performed better (or worse). And in addition to comparing the performance of the new version against the previous one, one of the key things we wanted to look at was the impact on infrastructure: whether ITERGO needed more resources,” Roman said. 
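Stripped of tooling, such a before/after comparison reduces to computing the same statistic over both runs and checking the delta against a tolerance. The following tool-agnostic sketch uses invented sample data and a hypothetical 10% regression threshold.

    import java.util.Arrays;

    public class BaselineComparison {
        // 90th-percentile (nearest-rank) of raw response-time samples in ms.
        static double p90(double[] samples) {
            double[] s = samples.clone();
            Arrays.sort(s);
            return s[(int) Math.ceil(0.9 * s.length) - 1];
        }

        public static void main(String[] args) {
            // Invented sample data standing in for "before" and "after" runs.
            double[] before = {420, 380, 510, 460, 395, 430, 405, 475, 390, 450};
            double[] after  = {455, 400, 560, 500, 410, 470, 440, 520, 415, 480};
            double delta = (p90(after) - p90(before)) / p90(before);
            System.out.printf("p90 before=%.0f ms, after=%.0f ms, delta=%+.1f%%%n",
                    p90(before), p90(after), delta * 100);
            // Hypothetical tolerance: flag a regression if p90 degrades by >10%.
            System.out.println(delta > 0.10 ? "FAIL: regression beyond tolerance" : "PASS");
        }
    }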


“Resource consumption of Citrix applications is an important topic for many customers: whether there’s still enough headroom in the infrastructure when rolling out new releases. By running such tests, you avoid getting blindsided: there are no surprises; you know about problems in advance and can prepare for them.” 


And beyond . . . self-service performance tests 

“Our goal at ITERGO is to automate performance tests by implementing automated quality gates,” said Roman. “We’ve already done this on a small scale, using NeoLoad, Dynatrace, and Keptn to automatically evaluate performance tests and enable them to run in pipelines. Right now, that’s about 10-20 microservices, but we plan to scale up automated quality gates as part of onboarding new microservices, adding another 240-480 fully automated tests per year. With all the ingredients [NeoLoad, Dynatrace, and Keptn] already present, it just makes sense to combine them. Once we have everything set up for a team [2-3 days per service], all they have to do is hit a button to start the test themselves. This makes them completely autonomous, able to check things on their own.” 
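Conceptually, a quality gate is an automated pass/fail evaluation of test metrics against service-level objectives (SLOs); in the setup Roman describes, Keptn performs that evaluation declaratively and Dynatrace supplies the metrics. The sketch below only illustrates the underlying logic, with invented metric names and thresholds.

    import java.util.Map;

    public class QualityGate {
        // One service-level objective: a metric, a threshold, and a direction.
        record Slo(String metric, double threshold, boolean lowerIsBetter) {}

        // Pass only if every SLO is met. In the real setup Keptn performs this
        // evaluation from declarative SLO files, with Dynatrace supplying metrics.
        static boolean evaluate(Map<String, Double> metrics, Slo... slos) {
            boolean pass = true;
            for (Slo slo : slos) {
                double value = metrics.getOrDefault(slo.metric(), Double.NaN);
                boolean ok = slo.lowerIsBetter() ? value <= slo.threshold()
                                                 : value >= slo.threshold();
                System.out.printf("%-18s %8.2f  %s%n", slo.metric(), value, ok ? "pass" : "FAIL");
                pass &= ok;
            }
            return pass;
        }

        public static void main(String[] args) {
            // Invented metric names and thresholds, purely for illustration.
            Map<String, Double> results = Map.of(
                    "p95_response_ms", 180.0,
                    "error_rate_pct", 0.4,
                    "throughput_rps", 95.0);
            boolean pass = evaluate(results,
                    new Slo("p95_response_ms", 200.0, true),
                    new Slo("error_rate_pct", 1.0, true),
                    new Slo("throughput_rps", 90.0, false));
            System.exit(pass ? 0 : 1); // non-zero exit fails the pipeline stage
        }
    }

Wiring the gate’s exit code into the pipeline is what makes the tests “self-service”: a developer pushes a button, the test runs, and the build passes or fails without a performance engineer in the loop.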

“We have been able to go from 15-20 tests per year for about 5 applications to 200-250 tests per year for more than 40 applications.”

— Roman Ferstl, Triscon founder

Results

  • Went from 15-20 tests/year for ~5 applications to 200-250 tests/year for 40+ applications 
  • Reduced test design time by 30-50% 
  • Cut script maintenance by 40-90% 
  • Implemented fully automated “self-service” performance tests of OpenShift microservices 
  • Eliminated complex manual trial-and-error performance testing for Citrix-virtualized apps