We were honored to have Jamie White—Senior Director, Head of Global Application Operations at GlaxoSmithKline (GSK)—recently share his team’s SAP test automation strategy with the SAP community.
Jamie started by outlining GSK’s ultimate goal: reducing testing time and costs while maintaining a high level of quality to mitigate risks. He then reviewed the challenges they faced, including a manual change assessment process and test automation tools that weren’t fit for Agile. To close, he detailed GSK’s journey to a Continuous Testing strategy focused on “Automation as a Service.” Along the way, he highlighted the expert guidance that Wipro and Tricentis provided at every stage of the journey.
Here’s the full presentation, followed by the transcript.
I’m excited to talk to you about some of the great work that we’ve done with Tricentis. I’ve been in the pharmaceutical industry for around 20 years now. I’m the head of our global application platform, so every global platform that is in operation at GSK today falls within my accountability.
I took over the testing function three years ago. Testing at GSK serves two purposes. First is software quality: making sure we’re protecting our business operations. Second is computer system validation—which is really about ensuring product quality, patient safety, and satisfying the needs of our external regulatory authorities. Each is equally important. But that has led GSK down a path where we were spending a significant amount of money over the years, and it was taking us a long time to test our application platform.
We had a number of challenges when I first took accountability for the team. One, our business was shifting: we’re becoming a data-driven, product-aligned organization. We knew Agile was coming in, and we had moved away from those very large, waterfall deployment programs that cost hundreds of millions of dollars to product teams managing change on a rapid basis. We knew we also needed to maintain quality, and we knew we needed to reduce the cycle time to deploy changes into production. But we always had to maintain that quality. And when I say quality, I’m really talking about both aspects: validation quality as well as software quality.
Now historically, our teams have a very large SAP estate. We’ve got a very large number of consultants that work in the environment, and the majority of our testing was done manually. We would raise changes, the consultants would identify what they thought needed to be tested, and those test cases were developed manually. Then, they were executed by the same pool of functional consultants. That cost us a lot of money. These are highly paid, highly experienced consultants. We really needed to rethink everything we’d done in the past and how we were going to work going forward. We knew we had to maintain the quality, and we knew we had to do it faster. To do it faster, we had to be far more efficient. But we had to do it at a significantly reduced cost.
So those were all of the challenges. That led us down the path of a very intense assessment: looking at what tools were on the marketplace and how we could modify the way we work today. The end result was that the Tricentis toolset was clearly the market leader and the tool that would enable us to achieve the business outcomes we needed to meet the needs of the organization. From test automation to impact analysis, all of those tools combined were there to help us achieve that efficiency at the right cost.
Now we can have a pool of testing resources that we don’t have to pay as much as an SAP functional consultant that’s been working 20 years in the industry. But we can also make sure that we’re managing our risk to our production operations or business operations. We can make sure that we’re meeting our regulatory expectations, and really transforming the way we operate as an organization.
We tried automation in the past with UFT. The outcome was always the same: the effort required to maintain those scripts was significant. The net result was the functional teams just saying, “It’s easier for me to test that on my own than it is to try to automate and maintain those automation scripts.” Now, having gone down the path with Tricentis, we’re easily able to maintain the scripts. We’ve got direct alignment into our product teams. And we’re able to manage these efficiently and effectively as we deploy into production. It was really a journey from what was almost entirely manual.
We had a significant number of documentation sets that were built in SAP Solution Manager. We knew we then had to do a manual review of test cases that we had to update and manage in HP ALM. We would try to automate those in UFT, and that just kind of fell apart. We maintained ALM because that was our system of record from a validation standpoint, but there was very little automation. We never achieved the ambition we had set for ourselves over many years, even though we tried time and time again and invested a significant amount of money to get there.
So, we made the decision to move to Tricentis. We’re now operating an environment where we still have SAP Solution Manager. We use Rev-Trac on top of that to manage change and release management for all of our software releases across the SAP estate. We’ve also got LiveCompare helping to do the impact assessment on those changes. That helps us identify what test cases we need to execute using the Tricentis toolset.
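As a purely illustrative aside: the flow described here (changes tracked through Rev-Trac, LiveCompare assessing their impact, and only the affected test cases executed in the Tricentis toolset) can be thought of as a change-driven test selection pipeline. The short Python sketch below is hypothetical; the names and data structures are invented for illustration and do not model any Rev-Trac, LiveCompare, or Tosca API.

```python
# Hypothetical sketch of a change-driven test selection pipeline.
# None of these names reflect a real Rev-Trac, LiveCompare, or Tosca API;
# they only illustrate the "change -> impact analysis -> targeted tests" idea.

from dataclasses import dataclass


@dataclass
class Change:
    transport_id: str         # e.g. an SAP transport tracked for release
    touched_objects: set      # programs/tables the transport modifies


# Invented mapping from SAP objects to the automated tests that cover them.
TEST_COVERAGE = {
    "SAPMV45A": ["OrderToCash_CreateSalesOrder"],
    "VBRK": ["OrderToCash_CreateBillingDocument"],
}


def impacted_tests(changes):
    """Return only the test cases whose covered objects are touched by the changes."""
    selected = set()
    for change in changes:
        for obj in change.touched_objects:
            selected.update(TEST_COVERAGE.get(obj, []))
    return selected


if __name__ == "__main__":
    release = [Change("GSKK900123", {"VBRK"})]
    print(impacted_tests(release))  # {'OrderToCash_CreateBillingDocument'}
```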
The effort required for automation in Tricentis is significantly less than we were experiencing with UFT. That really has led us down the path where our functional teams have gotten a lot more comfortable using the tool, and they’re also confident in the results that it produces. Again, this is about them having confidence, so we can start to shift that accountability into our centralized testing function. Now we’ve simplified the estate, and we’ve got the tooling in place that we need to achieve the outcomes we need to deliver for the business. Again, my primary expectation is that we maintain our business operations. We’ve got to test this functionality properly. But we need to test it at the right price point.
I’ve got a couple of benefit cases here to give you an indication of what we’ve achieved. We not only need to deliver the releases, with regression testing as part of a weekly or monthly release; we also want to make sure we have the ability to build confidence that there is no defect leakage across the estate.
At any one point in time at GSK, there’s a significant number of changes moving across our SAP systems. We wanted to identify the top critical processes that, regardless of anything else, we need to keep working so we can continue to ship product to our customers. That was really the foundation for our smoke test pack. The question for those critical processes was: how do we make sure we’re continuously testing them, not just as part of a release, so that we’re confident we’re not going to impact our business negatively?
So we identified all those as part of the smoke pack. We’ve fully automated that. The volume of testing that would previously have taken us three weeks to execute now takes just a little over a day—and we continue to improve on that. We’re now up to 85 end-to-end scenarios, and we’re going to continue to expand that to the point where we get full coverage—not for regression testing, but really focused on those critical end-to-end scenarios that enable GSK to continue to operate effectively.
We’ve achieved a huge amount of benefit with this. It gives us confidence: we trap defects that would have otherwise gone into production. That happens regularly, and we monitor that as a key metric. And it really gives IT leaders the confidence we need, while also giving our business customers confidence that we are managing this proactively and effectively going forward.
We also manage individual incremental changes in the product areas. Although this is at a smaller scale, the intent is really to show that within each of the individual product teams, there are many use cases you’ll find for the Tricentis toolset. This is one where we had a labeling project. Labeling for GSK is very important. Again, there’s high impact on regulatory expectations if we make mistakes on these labels. It’s really important that the data is accurate, and it’s really important that this process works effectively. These labeling machines are right on our production line; if they go down, we stop an entire manufacturing plant from operating.
Here, software testing that would have previously taken us a week is now down to four hours. We can quickly generate label changes, we can quickly test those within a day, and get them deployed into production safely and effectively. And again, this is not only a huge benefit to us, but also to our customers. They may have a new product, a new label change that needs to be made. That would have taken us quite a while in the past. Now, we can do that rapidly, and we can also do that safely.
Again, this is just another example where we’ve seen tremendous value. And there’s many of these across each of our product areas in the SAP estate.
Now, we have established this as a service. Not only are we delivering test automation in SAP, we’ve got a standardized service that we can leverage across the entire tech function in GSK to deliver consultancy on how to automate, where to automate, and how to utilize the toolset. We can scale that service completely—not just within the SAP estate, but across multiple applications.
We can now test end-to-end scenarios across multiple systems in our ecosystem. You can take the Order to Cash process and say, “Let’s test a web application accepting orders…those orders coming down to SAP…those SAP orders then generating the billing documents and going down to our warehouse systems.” That is fully automated. We can test that entire end-to-end process leveraging these modular test assets that we built.
From an automation standpoint, we weren’t able to do that in UFT. And this way, we can do this at a great cost saving. We no longer need to invest in these highly specialized, very expensive resources, not only in SAP, but across the estate. We can scale this service at the right cost to meet the needs of the organization. That might be temporary for a project, or it might be ongoing, embedded into product teams across the organization. It’s a huge value to GSK.
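To make the idea of modular test assets concrete, here is a minimal conceptual sketch of how reusable, per-system steps for the Order to Cash flow described above could be chained into one end-to-end check. It is written in plain Python purely for illustration, not in Tosca’s model-based format, and every function name is invented.

```python
# Conceptual illustration only: composing reusable per-system steps into one
# end-to-end Order to Cash scenario. All functions are invented placeholders.

def place_web_order(order_id):
    # Step 1: an order is accepted by the customer-facing web application.
    print(f"web: order {order_id} accepted")
    return order_id


def create_sap_sales_order(order_id):
    # Step 2: the order arrives in SAP and becomes a sales order.
    sales_order = f"SO-{order_id}"
    print(f"sap: sales order {sales_order} created")
    return sales_order


def generate_billing_document(sales_order):
    # Step 3: SAP generates the billing document for the sales order.
    print(f"sap: billing document created for {sales_order}")


def confirm_warehouse_delivery(sales_order):
    # Step 4: the downstream warehouse system confirms the delivery.
    print(f"wms: delivery confirmed for {sales_order}")


def order_to_cash_scenario(order_id):
    """Chain the reusable steps into a single end-to-end check."""
    sales_order = create_sap_sales_order(place_web_order(order_id))
    generate_billing_document(sales_order)
    confirm_warehouse_delivery(sales_order)


if __name__ == "__main__":
    order_to_cash_scenario("10001")
```

Because each step stands on its own, the same modules can be reused in a smoke pack, a regression run, or a project-specific test, which is what lets the approach scale across applications.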
It’s been a great partnership with Tricentis. The account team there has been fantastic. We really partner together well—not only to reach where we’re at today but looking at products going forward and how we’re going to manage test data, some of the newer offerings from the Tricentis team.
How do you handle computer system validation for 21 CFR Part 11 compliance?
Good question. That’s a key part of what we do. The majority of our regulatory reporting comes out of ALM. We generate the requirements trace matrix out of ALM today. We are looking at qTest from Tricentis as a potential long-term replacement, but we’re not doing that yet. Overall, the change process in Rev-Trac is the start from a regulatory standpoint, but testing requirements evidence comes out of ALM.
How do you work with test data management?
We’ve got multiple test systems in SAP that we use to maintain a consistent data set. We use client copies effectively to manage that data set from an SAP standpoint. We’re looking at the DI platform, though, because now we’re thinking about how we extend that data management piece out beyond the SAP estate. We’re running these end-to-end scenarios, so we need to make sure we’ve got consistent test data. That’s a bigger challenge, and I think we’re really starting to explore that. We’ll continue to drive that strategy.
But specifically in SAP today, we have static data sets that we tend to use for testing. Where those don’t work, we build the data setup into our test automation scripts. So the test team has scripts and processes in place to generate that test data prior to execution, and they execute those today to maintain it. But we are exploring the DI tool as a potential replacement for that going forward.
What other tools were considered by GSK?
If you go back to that Gartner Magic Quadrant, you can take almost every tool that was on there; in one way or another, GSK has had those tools implemented on smaller scales. We looked at Eggplant, we looked at Worksoft, we looked at every single platform that was out there and said, “Which of these is actually going to meet the needs of this organization?” We didn’t limit it; we had about 10 different software components we were looking at initially. We explored: which of these can actually meet our needs? Which of these are best in class? Which of these do we believe are going to be sustainable in the long term and best able to deliver the ambition that we set out?
It was a wide search. We looked at everything in that Magic Quadrant and then we narrowed that down to the top three. We said, “Okay, which of these is really going to meet the needs?” Tricentis came out clearly as the top contender.
What were some of the concerns in prioritizing the selection of Tricentis? And, given the outcome you’ve achieved, do you feel that selecting Tricentis was a well-thought-out decision?
I think I probably touched on this in many ways, right? It was all around what can we do at the best price, delivering the right quality and delivering fast. Those were the ultimate criteria. Also, can it meet our regulatory expectations? Is the platform validated? Can we satisfy the needs of our regulatory authorities? Can we make sure that we’re protecting our business operations? And can we do it quickly?
I think those were really the lenses that my team and I were applying to it. Tricentis was definitely the right decision for GSK. I wouldn’t hesitate to make that decision again. I think it’s helped us achieve that ambition. We’ve proven it through what we’ve achieved over the last 24 months: the smoke pack, automated regression testing on individual projects, and the automation artifacts we built for those projects.
We’re exploring many other opportunities outside the SAP space as well, where we’re able to achieve similar benefits to those we’ve achieved in the SAP space. This isn’t specifically for SAP. This is something that we’ve scaled across the enterprise, and it’s delivering significant savings to the GSK organization as a whole.
Does this tool make sense for a smaller company with an IT department of 30 people across the entire business?
I guess that’s probably more of a commercial question than a capability question. The example I’ve given was a small product team. The skillset required to develop automation with Tosca specifically is much easier to pick up than what you see with UFT or any other script-based automation tool. So, you can have an IT consultant do some training on Tosca, and they’ll be automating quickly.
Commercials aside, I think as long as you’ve got a team of people that you can leverage to learn the tool, then you’ll absolutely be able to achieve automation within a smaller organization. And you’ll be able to do that with the resources that you have in play today.