
Rethinking QA at Equifax: Digital transformation's impact on QA

Author:

Tricentis Staff

Various contributors

Date: Sep. 25, 2020

This blog is part of a series featuring IT leaders who have driven successful quality transformations across organizations such as AGL, AXA, McDonald’s, and Zurich. These first-hand accounts are excerpted from the Tricentis-sponsored Capgemini report, Reimagine the future of quality assurance.

Raoul Hamilton-Smith, General Manager Product Architecture & CTO NZ Equifax

With about 1,000 employees, the Australian and New Zealand (ANZ) region represents about ten percent of the greater Equifax business, a data, analytics, and technology company. The ANZ region is part of the International division, which includes the UK, Canada, LATAM, India, Russia, and Emerging Markets. ANZ represents about a third of the International division.

We are in a new era at Equifax. We have a new Group CTO, Bryson Koehler, who is pushing an ambitious transformation program to migrate all of our products and services to the cloud and, where there is a fit, move them onto global platforms. This will eventually lead to the closure of all our data centers. As part of the transformation, we will build full, secure continuous integration/continuous delivery (CI/CD) pipelines, including automated testing.

To support the transformation, we have an Engineering Handbook. It’s a Confluence site with hundreds of pages of information, and each discipline has its own chapter, including Quality Assurance (QA). The Handbook provides us with the guard rails to deliver applications. It details which tools to use and which methods to follow, including what to automate and so on. As with other disciplines, we have instituted a global QA guild, with our local Head of Quality Assurance as a member. At the guild, QA practice is discussed across the regions and with representatives from the Head Office in Atlanta.

As an example of how things are changing, we had a meeting with some of our project managers who are leading various initiatives to build or enhance our systems. The question came up (as it has done for many years), ‘If we build out the automation, that’s going to be more expensive and take more time, isn’t it?’ The answer is, ‘Of course it will initially, but you have to, because this is how we do things now.’ I think the penny has finally dropped that, moving forward, every component that’s built will come with a set of automated tests around it.

Ensuring quality is very important for our customers. We need to ensure that all of our deliverables are of top quality because we’re dealing with information that’s used in decision-making by financial institutions and other organizations. The quality of the data that is delivered is paramount, as is the stability of our systems.

There are many things that can potentially go wrong. The major risk we face is supplying reports with incorrect or missing information. There have been situations in the past where we had data integrity issues. This can be very serious for us and for the financial institutions. These are the situations we must avoid.

And the way we do that is by relentless scenario testing across many thousands of permutations. It is not always a perfect solution, though, so if artificial intelligence could help us solve that problem, we would grab it with both hands.
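To make the idea of permutation-driven scenario testing concrete, here is a minimal sketch in Python using pytest. It is an illustration only, not Equifax’s test suite: the scenario dimensions and the generate_report() function are hypothetical placeholders, and the point is simply how a few dimensions multiply into many test cases.

```python
# A hypothetical sketch of permutation-driven scenario testing with pytest.
import itertools
import pytest

# Dimensions of a report scenario we might want to cover exhaustively
# (placeholder values, not real Equifax categories).
ACCOUNT_TYPES = ["credit_card", "mortgage", "personal_loan"]
PAYMENT_HISTORIES = ["clean", "late_30", "late_90", "default"]
FILE_STATES = ["new_file", "thin_file", "full_file"]

# Every permutation of the three dimensions: 3 * 4 * 3 = 36 scenarios here;
# with more dimensions this quickly grows into the thousands.
SCENARIOS = list(itertools.product(ACCOUNT_TYPES, PAYMENT_HISTORIES, FILE_STATES))


def generate_report(account_type, payment_history, file_state):
    """Placeholder for the system under test."""
    return {
        "account_type": account_type,
        "payment_history": payment_history,
        "file_state": file_state,
    }


@pytest.mark.parametrize("account_type,payment_history,file_state", SCENARIOS)
def test_report_has_no_missing_fields(account_type, payment_history, file_state):
    report = generate_report(account_type, payment_history, file_state)
    # The key risk called out above: reports with incorrect or missing information.
    assert all(value is not None for value in report.values())
```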

Always On

We aspire to provide ‘Always On’ services, which is a constant challenge. We are not perfect by any stretch, but we do focus much of our QA effort on non-functional testing. When I joined the organization around eight years ago, we had only ever performance-tested one application. Now we performance-test all our customer-facing systems. In terms of automation at that time, there were only some limited IBM robot scripts for a few test cases. The world has changed considerably since then.

Our QA organization is in a transition stage, as it has been for a while. We want to have all testing automated, but we’re not there yet. It’s a cultural shift as much as a technology shift to make this happen.

The challenge is to change the mindset to recognize there’s value in investing up front in automation rather than making it an optional piece of our project delivery. As a result of the increased work we’ve put into improving our security posture over the last 18 months, we’ve found ourselves releasing software for every application much more frequently than we ever have before. That is a real challenge without automation.

We have found that the lack of up-front investment in automated testing is now causing pain, because we need to react faster than ever to any sort of vulnerability. All the applications have been through a variety of tests to bring them up to a higher standard than before, which has entailed a significant effort to get those things tested and out the door.

Pivoting to automation

The organization has been set up for Agile delivery for quite some time, including a move to the Scaled Agile Framework around 18 months ago. A standard Agile team (or Squad) consists of a Product Owner, some Application Engineers, some QA analysts and a Scrum Master. As far as line management is concerned, there is a QA tower. However, the QA people are embedded in the Agile teams so their day-to-day leadership is via their Scrum Master/Project Manager.

What we have not been so good at is being very clear about the demand to automate testing. We probably haven’t shown everyone how that can be achieved, and some areas of delivery are better at it than others.

This is the challenge that we’re facing now: we have people with automation skills who have been testing manually and haven’t really had the opportunity to build out the automation. So right now, we are at the pivot point where automation becomes the norm.

We call out the non-functional requirements up front when we start creating, enhancing or migrating an application. Included in those requirements are ones relating to data security and the security of the application, much of which will already be built into the framework of the cloud offering that will be provisioned. There is an alliance in the US that is called the Infrastructure as a Service Alliance. They are building an entire secure framework within AWS, GCP and Azure. We will work within that framework to provision our new applications, so much of the work required comes ‘out of the box’.

In addition to our normal functional and non-functional testing, we run our systems through penetration tests, and our code through security tests. Moreover, we are very careful with the data that is present in the cloud. Bear in mind that we’re at the beginning of this journey, not the end of it, but my understanding is that private data will be tokenized within the cloud. We also must be especially careful about what ends up in logs and what data ends up in system-transacting type databases.
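To illustrate the tokenization idea mentioned above, here is a minimal, hypothetical sketch in Python. It shows the general concept (sensitive values are swapped for opaque tokens before data reaches logs or transactional stores, while the real values live only in a separately secured vault); it is not a description of Equifax’s actual implementation, and the TokenVault class is an invented stand-in.

```python
# A hypothetical sketch of tokenization, illustrating the concept only.
import secrets


class TokenVault:
    """In-memory stand-in for a separately secured token vault."""

    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("1970-01-01")  # e.g. a date of birth

# Only the opaque token ever appears in logs or transactional databases.
print(f"processing record for subject {token}")

# An authorized service can resolve the token back when genuinely needed.
assert vault.detokenize(token) == "1970-01-01"
```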

Advantages of the cloud

Being able to utilize the full breadth of the cloud will provide us with several advantages. A current disadvantage is that on-premises environments are expensive. Let’s say we have a record for every adult in Australia, which is around 18 million people. Replicating that in another environment is a big investment because we have a mainframe that sits at the core. Now, with cloud, things change.

We will have the opportunity to spin up environments and spin them back down again. Provided we can construct a clean dataset, we can preserve that and continue to reuse it into the future. Demographics and name structures, for example, are unlikely to change that much. However, the primary dataset could always be tweaked as and when things change.

Once we crack that original dataset, the power of the cloud will let us spin up environments for things like performance testing or mass testing, populate them with this synthetic data, run the tests, take the results out, and then spin them back down again.
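As a rough illustration of what a reusable synthetic dataset for an ephemeral test environment could look like, here is a short Python sketch. Everything in it is hypothetical: the field names, the value lists, and the file name are placeholders, and a fixed random seed is used so the same clean dataset can be regenerated whenever an environment is spun up.

```python
# A hypothetical sketch of generating a reproducible synthetic dataset
# that an ephemeral test environment could load before running tests.
import csv
import random

random.seed(42)  # fixed seed: the same dataset every time it is regenerated

FIRST_NAMES = ["Alice", "Bob", "Chen", "Dana", "Eli", "Fatima"]
LAST_NAMES = ["Smith", "Nguyen", "Patel", "Brown", "Wilson", "Taylor"]
STATES = ["NSW", "VIC", "QLD", "WA", "SA", "TAS", "ACT", "NT"]


def synthetic_record(record_id: int) -> dict:
    """One entirely fictitious person record."""
    return {
        "id": record_id,
        "name": f"{random.choice(FIRST_NAMES)} {random.choice(LAST_NAMES)}",
        "birth_year": random.randint(1940, 2004),
        "state": random.choice(STATES),
    }


def write_dataset(path: str, size: int) -> None:
    """Write `size` synthetic records to a CSV the test environment can load."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "name", "birth_year", "state"])
        writer.writeheader()
        for i in range(size):
            writer.writerow(synthetic_record(i))


if __name__ == "__main__":
    # Small here; in practice this could be scaled to millions of records.
    write_dataset("synthetic_population.csv", size=1000)
```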

Moreover, we’ll have the opportunity to have many different developments occurring in parallel, whereas now we only have one shared environment, which isn’t quite sized to production.

As you can see, many of the constraints we face will disappear in the future. Artificial intelligence will no doubt help us in some way, shape or form once that technology matures.

***

For additional insights from quality leaders, read the complete 100+ page Capgemini report, Reimagine the future of quality assurance.

