If you’re looking for yet another survey on the state of software testing, Wipro’s latest State of Quality Report is NOT for you. But, if you want to read some exclusive quality engineering research based on real data from top global organizations, you’re in for a treat.
According to Arun Kumar Melokote, Wipro Global Head, Application Engineering and DevOps:
The uniqueness of this report is its fact- and ground-data-based approach. This report analyses data from 1,500+ QA projects across 400+ global organizations, data from 1,000+ RFPs, consulting assignments, and customer, analyst, and expert interactions spread across industries. We have also included data from five leading partners in the industry, our acquisitions and the larger ecosystem.
We strongly recommend that you download and read the complete (60-page) report. Here’s a quick recap of some of the more surprising findings:
Extreme quality automation impacts innovation even more than it impacts quality
As expected, high-severity production defects decreased as automation increased. But what was startling was how dramatically innovation increased in response to greater quality automation. Note the sharp rise in both release count and Agile velocity (the number of story points delivered in an Agile iteration) in the following graphic:
Industries that aren’t traditionally perceived as “digital disruptors” are making significant test automation gains
Financial services achieved the greatest increase in test automation (40.62% over the past year), but manufacturing and high tech (35.71%), healthcare (34.61%), and energy, natural resources, and utilities (33.33%) aren't far behind.
There’s also a pretty interesting industry-to-industry discrepancy in the demand for quality engineering services:
Big data testing presents a tremendous challenge
If you were among the 1,000+ attendees of our recent webinar Data Warehouse Testing: The Next Opportunity for QA Leaders, you're probably already aware that big data testing is a big opportunity. But you might be surprised by how many organizations are already struggling with it. According to the report, over 80% of organizations involved in big data cite two main challenges: 1) creating end-to-end big data automation is difficult given the diversity of the technology and application landscape, and 2) on the skills front, big data testers have only a limited understanding of the components of the ecosystem.
The exponential growth of the technology landscape (big data tools now account for 35% of total tool demand) makes it harder to find testers with the right skill set for big data testing.
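To make the end-to-end automation challenge concrete, one of the most common big data tests is source-to-target reconciliation: verifying that row counts and key aggregates survive the pipeline intact. Here's a minimal sketch in plain Python, using in-memory sqlite3 databases as stand-ins for real source and warehouse systems; the `orders` table and `amount` column are hypothetical, and a real pipeline would point the same check at Hive, Spark, or warehouse connections instead:

```python
import sqlite3

def reconcile(src_conn, tgt_conn, table, sum_col):
    """Compare row count and a column checksum between source and target.

    Returns an empty dict when the tables reconcile, otherwise a dict
    showing the (row_count, column_sum) seen on each side.
    Note: table/column names are interpolated directly for brevity;
    a production version should validate them against a whitelist.
    """
    totals = {}
    for name, conn in (("source", src_conn), ("target", tgt_conn)):
        cur = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({sum_col}), 0) FROM {table}"
        )
        totals[name] = cur.fetchone()
    if totals["source"] != totals["target"]:
        return totals
    return {}

# Demo: identical "source" and "target" tables should reconcile cleanly.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
src.commit()
tgt.commit()

print(reconcile(src, tgt, "orders", "amount"))  # {} means the tables match
```

The diversity problem the report describes shows up exactly here: each source and target technology needs its own connection logic, which is why stitching these checks into one automated end-to-end suite is hard.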
Test Data Management isn’t progressing as fast as we expected
With GDPR coming into effect in May 2018, we expected this would be the impetus most organizations needed to finally reinvent their painful test data management processes. The report shows that there was some progress, but not much:
Given this tepid "progress," it's not surprising that 86% of organizations still report that test data management is a struggle, and only 5% can get the test data they need within 1-3 days.
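One common GDPR-driven technique in test data management is deterministic pseudonymization: replacing PII fields with stable hashed tokens so that test data keeps its referential integrity (the same customer maps to the same token in every table) without exposing real identities. A minimal sketch, assuming hypothetical field names; real implementations would also handle format-preserving masking and key rotation:

```python
import hashlib

def pseudonymize(record, pii_fields, salt="test-env-salt"):
    """Return a copy of record with PII fields replaced by stable hashes.

    The same input value always maps to the same token, so joins across
    tables still line up in the test environment. Field names and the
    salt are illustrative; a real setup would keep the salt secret.
    """
    masked = dict(record)
    for field in pii_fields:
        if field in masked and masked[field] is not None:
            digest = hashlib.sha256(
                (salt + str(masked[field])).encode()
            ).hexdigest()
            masked[field] = digest[:12]  # short, stable token
    return masked

customer = {"id": 42, "email": "jane@example.com",
            "name": "Jane Doe", "tier": "gold"}
masked = pseudonymize(customer, ["email", "name"])
print(masked["tier"])  # non-PII fields pass through untouched
```

The appeal of this approach is that it can be bolted onto an existing data-refresh pipeline, which is one reason it tends to be the first step organizations take when regulation finally forces the issue.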