Load and performance testing trends from ‘The State of Open Source Testing’ survey

By Ivan Vanderbyl

Note: This post was originally published on the Flood blog.

Tricentis, in collaboration with Flood, SpecFlow, and TestProject, recently conducted our annual survey, “The State Of Open Source Testing,” and compiled results from nearly 2,000 responses.

In this post, we’ll explore the findings relating to performance and load software testing tools for 2020.

Our biggest surprise this year was the combination of load testing and continuous integration, which has seen a considerable increase from previous years despite the consensus that it was a niche workflow.

Not so much of a surprise is the continued adoption of open source tools across organizations of all sizes, including large enterprises that have historically shunned open source products due to existing commercial vendor agreements. However, as we’ll see, security and training remain roadblocks for around a sixth of companies surveyed.

In terms of geographies and demographics, Asia leads the adoption of open source software testing tools, making up 61% of respondents. Europe leads the western market at 16%, followed by the United States at 14%, with Australia trailing at 3%.

India and Vietnam are growing faster than any other region, and I expect we’ll see substantial growth here in the coming years.

The main questions we want to answer from this survey are: Is the role of performance testing changing? And is open source testing here for the long haul?

Major roles in testing

It comes as no surprise that QA teams are still responsible for about half of testing. Engineers do the remainder, with broad functional responsibilities across operations, development, security, and performance.

We’ve seen a steady increase in our customers’ technical skills over the last five years due to the technical nature of load testing. Still, we’re now seeing this spread into broader testing disciplines, driven by the adoption of open source tools.

Additionally, 33% of companies surveyed had specifically tasked performance engineers, or dedicated teams that run performance tests regularly, with load testing, which means QA is doing slightly less load testing overall than these specialized teams.

And yet, around 10% of companies surveyed said that nobody was explicitly responsible for load testing.

Experience level

Due to the rise of automated testing over the last 20 years, we haven’t seen any breakout group in terms of years of experience. 

There is perhaps a slight increase in the number of people who started in the field around 2010 to 2015 – those who now have five to 10 years under their belts.

Top programming languages

If you’re not doing model-based testing, you’re likely writing tests in some form of programming language. 

It’s no surprise that Java is still the most popular choice, no doubt driven by Selenium/TestNG. On the other hand, 40% of customers use JavaScript, which compares well with the adoption of JavaScript in all fields, not just testing or web apps. 

“Any application that can be written in JavaScript, will eventually be written in JavaScript.”

— Jeff Atwood, cofounder of Stack Overflow

Biggest roadblocks to adoption

While a wide range of organizations have adopted open source tools, these tools still face roadblocks in some enterprises.

We’ll break these down as support and training, security, and technical capabilities.

Support and training

Support and training remain a roadblock to adoption across all open source projects, with very few investing the time required to write proper documentation, tutorials, and training material. Training your team is a critical challenge when adopting any new tool.

For open source, this has spawned multiple companies that focus specifically on supporting the implementation of open source software testing tools within organizations that would usually get training directly from a vendor.

Another risk is how often a project is updated, as most open-source projects don’t follow a regular release cadence or provide an easy upgrade path between major versions – a challenge with all software, whether you pay for it or not.

As more and more testing tools move to the cloud, updates are becoming a thing of the past.


Security

Interestingly, 15% of organizations still say that security and perceived vulnerabilities are a roadblock. However, when you talk to security experts, they say open source provides greater assurance of security because it increases the pool of eyes looking for vulnerabilities.

However, whether maintainers can implement fixes promptly determines whether this transparency increases security or leaves projects open to exploitation. But at least the vulnerabilities are known.

In recent years, an entire ecosystem of code analysis tools has emerged that helps automate the discovery and repair of vulnerabilities in open source projects, so it will be interesting to see how security trends as a concern next year.

Technical capabilities

Support for more esoteric protocols within a chosen tool, setting up a test environment, maintaining test scripts, and managing test data were all viewed as technical impediments to regularly conducting performance tests.

Testing systems built on SAP or Citrix that don’t expose accessible testing interfaces compound the problems faced by open source tools in the enterprise.

With SAP, organizations spend millions on the software and its internal adoption, but what they’re ultimately buying is the transformation from one legacy system to something a few generations newer.

The CEO of one of the world’s largest pharmaceutical companies once said it would be a competitive advantage to implement SAP for less money and time than their competitors.

These types of enterprises remain the holdouts for open source adoption.

Biggest benefits

While 16% of respondents identified customer support as a roadblock, a similarly sized cohort prefers the community-based support offered by open source tools.

The number one driver of open source adoption is still cost, which is no surprise as commercial testing tools have historically run into the hundreds of thousands of dollars. (Though as noted earlier, many cite the lack of support and training as a major roadblock, which is compounded when organizations are not willing or able to dedicate resources to writing proper documentation, tutorials, and training material.)

Performance testing tools have a more storied history than a lot of modern QA testing tools, which is why performance testers view cost as less of a motivator for adoption than technical capabilities.

Lastly, tooling flexibility and ease of customization is still a strong driver for most surveyed companies. (Unlike some other commercial tool vendors which reject the importance of open source, Tricentis fully embraces open source as a crucial part of our product strategy.)

The role of load testing

Automated testing has well and truly unseated manual load testing, and load testing makes up nearly a third of testing overall.

At Flood, we’ve seen a strong trend to browser-based load testing in the last two years, as it now makes up 35% of all tests run on Flood. As a result of this, we hired three more engineers for the Element core team this year.

An interesting finding is that performance testing has moved from a reactive to a proactive testing discipline, with nearly two-thirds of customers conducting performance tests within a typical sprint cycle.

Thanks to easy integrations between Flood and most continuous deployment services, you can add load testing to your release process and catch performance regressions before they reach production.
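As one illustration of gating a release on load test results, here is a minimal sketch of a check a CI step might run. It parses a JMeter CSV results log (a JTL file) and flags the build when the error rate crosses a threshold. The `SAMPLE_JTL` data, the `error_rate` helper, and the 5% threshold are all hypothetical; the column names follow JMeter’s default CSV output.

```python
import csv
import io

# Illustrative sample of a JMeter CSV results log (JTL). The column names
# follow JMeter's default CSV output; the data itself is made up.
SAMPLE_JTL = """\
timeStamp,elapsed,label,responseCode,success
1620000000000,120,Home,200,true
1620000000100,95,Home,200,true
1620000000200,310,Checkout,500,false
1620000000300,101,Home,200,true
"""

def error_rate(jtl_text):
    """Return the fraction of samples in a JMeter CSV result log that failed."""
    rows = list(csv.DictReader(io.StringIO(jtl_text)))
    if not rows:
        return 0.0
    failures = sum(1 for row in rows if row["success"].lower() != "true")
    return failures / len(rows)

if __name__ == "__main__":
    THRESHOLD = 0.05  # hypothetical gate: tolerate at most 5% failed samples
    rate = error_rate(SAMPLE_JTL)
    print(f"error rate: {rate:.1%}")
    # In a real pipeline you would exit non-zero here to fail the build,
    # e.g. sys.exit(1 if rate > THRESHOLD else 0)
    print("gate:", "FAIL" if rate > THRESHOLD else "PASS")
```

In practice, a CI step might first produce the results file with JMeter’s non-GUI mode (`jmeter -n -t plan.jmx -l results.jtl`) and then feed `results.jtl` to a gate like this.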


For performance testers, open source tooling is critical to the job, which shows a significant move away from vendor apps of the past decades.

JMeter is still the most popular, with a little over half the market. Developer-focused tools such as Element, k6, Locust, and Gatling have seen an increase in adoption and now make up most of the backfield.

Final thoughts

Is the role of performance testing changing? I think there is strong evidence that more generalists are conducting performance testing in development teams.

Still curious about the results? Check out the survey results infographic.