Software Quality Days 2017, one of Europe’s largest software conferences, recently hosted its annual “Best Quality Tool” challenge. According to the official description, the challenge is “a competition that allows you to present a live solution of a practice-related task using your tools to the audience. The smartest solution will be identified together with the audience and a special jury.” This year’s challenge focused on testing microservice architectures.
Team Tricentis, composed of Robert Wagner and Martin Thaller, was invited to compete. They faced opponents from software testing tool vendors including CA Technologies and Microsoft.
Team Tricentis not only solved the challenge in about an hour (the fastest time); they were also the only team to find the defect that was planted in the microservice! Their secret weapon? Tricentis Tosca, with integrated API Testing, Service Virtualization, and Test Data Management.
Robert Wagner from Team Tricentis explained…
We were asked to complete two challenges. The first was to create working API test cases before the system under test—a Speedometer service—was implemented. The Speedometer service was defined in a Swagger description, so we used Tricentis Tosca to scan the Swagger, then automatically create API test cases and service virtualization assets based on the defined behavior. We deployed the virtual Speedometer service, configured the API test cases to use the provided data, and then executed the data-driven API test cases against the service virtualization assets. It was simple! The implementation looked like this (OSV stands for Orchestrated Service Virtualization):
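Tosca generates these assets from the Swagger description without any coding. For readers who want to see the underlying pattern, here is a minimal Python sketch of the same idea: derive a virtualized service from a Swagger-style document and run data-driven test cases against it. This is not Tosca itself; the endpoint, fields, and test data are illustrative assumptions, not the actual challenge artifacts.

```python
# Illustrative Swagger-style fragment for a hypothetical /speed endpoint.
SWAGGER = {
    "paths": {
        "/speed": {
            "get": {"responses": {"200": {
                "schema": {"properties": {"speedKmh": {"type": "number"}}}
            }}}
        }
    }
}

def build_virtual_service(swagger, canned_responses):
    """Build a stand-in (virtualized) service from the Swagger paths."""
    def service(path, method="get"):
        operations = swagger["paths"].get(path, {})
        if method not in operations:
            return 404, None  # path/method not in the service definition
        return 200, canned_responses[path]
    return service

def run_data_driven_tests(service, cases):
    """Execute each data-driven case against the service; record pass/fail."""
    results = []
    for case in cases:
        status, body = service(case["path"])
        results.append((case["name"], status == 200 and body == case["expected"]))
    return results

virtual_speedometer = build_virtual_service(SWAGGER, {"/speed": {"speedKmh": 42.0}})
cases = [{"name": "known speed", "path": "/speed", "expected": {"speedKmh": 42.0}}]
print(run_data_driven_tests(virtual_speedometer, cases))  # [('known speed', True)]
```

The key point the challenge rewarded is visible even in this toy version: the test cases and the virtual service are both derived from the same service definition, so tests can run before the real implementation exists.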
The goal of the next round was to find the defect planted in the actual Speedometer service, which was by then implemented and ready for testing. The Speedometer service relied on a downstream Speedsensor service that was not yet available, so we had to mock the Speedsensor service. We needed to configure that mock to return a different RPM for each test case and have each test case’s verification vary accordingly.
For this challenge, we used the given Speedsensor request/response pair and deployed the mock service. To configure the Speedsensor’s service virtualization behavior on the fly, we added a Speedsensor mock configuration step to our API test cases and linked it to our test data source. When we ran our API test cases, the Speedsensor service virtualization asset responded with variable data (as required)—and this exposed the defect. It was immediately clear that the defect was a matter of fractional digits: the system under test returned more fractional digits than the specification allowed. Our ability to use the same data source across API tests and service virtualization/mocks was the secret to identifying this problem.
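The pattern that exposed the defect—one test-data table driving both the mock’s response and each case’s verification—can be sketched in a few lines of Python. This is a conceptual illustration only, not Tosca OSV: the RPM values, the speed-conversion factor, and the buggy formatting are invented assumptions standing in for the real services.

```python
# One shared data table drives both the Speedsensor mock and the checks.
TEST_DATA = [
    {"rpm": 1000, "expected_speed": "120.5"},
    {"rpm": 2000, "expected_speed": "241.0"},
]

def make_speedsensor_mock(rpm):
    """Mock configured on the fly: returns the RPM for the current case."""
    return lambda: {"rpm": rpm}

def speedometer(sensor):
    """Stand-in for the system under test. This deliberately buggy version
    emits four fractional digits where the spec allows only one."""
    speed = sensor()["rpm"] * 0.1205  # illustrative conversion factor
    return f"{speed:.4f}"

def verify(actual, expected):
    """Flag extra fractional digits before comparing values."""
    actual_frac = actual.split(".")[1] if "." in actual else ""
    expected_frac = expected.split(".")[1] if "." in expected else ""
    if len(actual_frac) > len(expected_frac):
        return f"FAIL: {actual} has too many fractional digits"
    return "PASS" if actual == expected else f"FAIL: {actual} != {expected}"

for case in TEST_DATA:
    mock = make_speedsensor_mock(case["rpm"])  # mock reconfigured per case
    print(case["rpm"], verify(speedometer(mock), case["expected_speed"]))
```

Because the mock and the verification read from the same rows, every test case flags the surplus digits—exactly the kind of defect that stays hidden when mocks return one hard-coded response.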
Here’s a quick diagram of our solution to this second testing challenge:
The team accomplished the whole challenge without any coding or scripting. They performed all the required API testing and service virtualization by taking the assets that Tricentis Tosca generated and rearranging them via drag-and-drop and copy/paste.
As the testing challenge demonstrates, testing microservices requires heavy orchestration. The Tricentis API testing bundle, which includes Orchestrated Service Virtualization (OSV), is designed to address the complexities associated with testing microservice architectures…as well as any kind of enterprise Message-Oriented Middleware.