The Software Fail Watch: 2016 in Review is a sobering reminder of how even a single software bug can cripple an enterprise. With 4.4 billion people and $1.1 trillion in assets impacted by software failures in 2016, it’s hard to argue that “more of the same” is the best path forward for software development professionals.
As the demand for the latest and greatest in technology and convenience grows, so does the need for software testers to protect their users and their brand from the potential influx of software failures. Our goal at Tricentis is to help testers succeed in this role—enabling fast, efficient, comprehensive testing that’s designed to support Continuous Testing, Agile, and DevOps.
Ultimately, we want to ensure that the inevitable software bugs are found by your testers, not your customers.
Wolfgang Platz, Tricentis Co-Founder and CPO
The Tricentis Software Fail Watch is a collection of software bugs found in a year’s worth of English language news articles. To find the stories, we set up a Google account with an alert for phrases such as “software glitch” and “software bug”. Then we manually sorted through each of the alerts, picking out promising headlines, reading the articles for relevance, and noting down any specific details of interest. Here’s a quick overview of what we found:
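The tallying behind the charts that follow can be sketched in a few lines. This is a hypothetical illustration, not the actual tooling used: the record list, industry names, and month labels below are assumptions standing in for the manually triaged alert data described above.

```python
from collections import Counter

# Hypothetical records distilled from the manual alert triage:
# one (industry, month) tuple per confirmed software fail.
articles = [
    ("Government", "May"),
    ("Government", "May"),
    ("Services", "May"),
    ("Retail", "Nov"),
]

# Total fails per industry across the year.
by_industry = Counter(industry for industry, _ in articles)

# Fails per (industry, month) bucket, for spotting seasonal peaks.
by_bucket = Counter(articles)

print(by_industry["Government"])       # fails recorded for Government
print(by_bucket[("Services", "May")])  # Services fails recorded in May
```

With real data, averaging each industry's monthly buckets over twelve months would yield the per-month figures quoted in the analysis.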
A high-level view reveals clear patterns in where and how these software fails occur. Government-related software fails dominate the charts, with an average of 15 fails per month. Retail and Transportation are tied for second place, both clocking in at an average of 9 fails per month.
Many trends observed in last year’s Software Fail Watch continued this year. For example, Transportation’s software fails peaked in late spring, while Retail’s software fails rose steadily in the months leading up to the Christmas holidays. The Finance and Entertainment industries kept a fairly low profile over the course of the year, both averaging just 2 software fails per month.
The wild card of 2016 was the Services industry, which covers internet, electricity, telecom, and similar providers. Its number of software fails jumped erratically from month to month, peaking in May with 11 recorded software fails.
Comparing the 2016 data against 2015’s throws the picture into even sharper relief.
If anything, the need for better software testing is only growing. Today’s software testing processes are still, on average, 80% manual. Based on our software fail results, these traditional testing processes aren’t successfully identifying high-risk software releases in today’s highly accelerated development cycles.
What can be done about this disconnect? Read what Gartner, Forrester, and Tricentis think about where software testing stands today and what the future holds.
Stay tuned for more blogs on the Software Fail Watch findings… or download the complete Software Fail Watch report now.