This blog is part of a series featuring IT leaders who have driven successful quality transformations across organizations such as AGL, AXA, Equifax, McDonald’s, and Zurich. These first-hand accounts are excerpted from the Tricentis-sponsored Capgemini report, Reimagine the future of quality assurance.
Head of Technology Risk Compliance and Controls for Infrastructure & Operations at a leading Australian Bank
Outcomes from Australia’s Royal Commission have had a significant impact on the financial services industry. The Royal Commission comes on the back of the Banking Executive Accountability Regime (BEAR), which places more obligations on executives and holds them accountable for the actions of their organization. Industry-wide, there is a lot more interest from executives in the detail of what is happening inside the organization.
As a result, the role of Quality Assurance (QA) is expanding: its responsibility is not only to deliver quality products to market that are fit for purpose, but also to ensure that all of the compliance and regulatory obligations have been met. Lastly, there is a need for increased reporting to demonstrate back to the executives that everything has been executed as per instructions and obligations. These new services need to be introduced into operating models, which is something that we have never had to address as an industry before.
Getting it right the first time
QA in the open banking and open integration environments around financial services is another big area. When all services were contained within your organization, if things were not necessarily ideal you had the internal opportunity to self-rectify. In a future world, where everything is public and the data is provided by one organization to another on behalf of the customer, errors in those transactions or advertisements can affect the customer’s interaction, either positively or negatively. As such, the importance of getting it right immediately is increasing with these new ecosystems.
The Royal Commission has basically directed that the obligations of organizations and executives extend into these ecosystems, and there is little tolerance for getting it wrong for the customer. The expectation is that you have to get it right the first time and you have to support your customers in the custody of their data and service.
Even controls like the European legislation around data, the General Data Protection Regulation (GDPR), are focused at the individual level. Was one individual impacted as a result of something that your organization did? And how are you assuring that those individuals are not impacted by the changes your organization, or the ecosystem in which you participate, makes? Providing assurance over that, ensuring that everybody is doing the right thing, is very challenging to implement.
Tools, techniques and testing in a digital transformation world
There is still a lot of manual testing, and the breadth of coverage required is a challenge that is becoming even more critical. Due to the complicated networks that large financial services companies run, and the interdependency of everything from open banking to Software-as-a-Service (SaaS), you do not always know where things are coming from. The ability to ensure things are happening at the right times in the right places means your test coverage needs to expand continuously. That creates a very significant challenge for people sitting on legacy technology, because a lot of the modern tools do not lend themselves easily to a legacy platform.
It is a delicate balance that everybody has to run — balancing the risks you take versus the breadth you have versus the amount of money you invest into bringing up to speed your legacy technology, compared with what you need to get into market and what you are going to do when you get there. There has been some interesting rationalization. If you look at the financial services sector, since the Royal Commission there has been a significant reduction in product offerings. Almost all of the top tier one and two financial service organizations have exited products like lending to self-managed super funds and they have tightened up on personal and property lending.
They have removed a swag of products from the market to simplify the product offering and they are trying to then get better at the ones they are offering. Some of those decisions are because they are no longer commercially viable given the new market or because of the new rules. Those products may no longer be relevant, but also it is about having to make sure that we can deliver that consistent experience on the products that we have, so they are tackling the problem in multiple ways.
Blurring the lines between security, compliance, testing, and performance risk
There is definitely a desire to move faster. Every business has been crying out for that forever. It doesn’t matter what industry you’re in, everybody wants it happening faster. There is a certain rate of change that particular organizations or industries can run at irrespective of their desire. But it is also vital to get the decisions up the chain quickly and concisely to support rapid decision-making. As a result of the Royal Commission and BEAR, executives are being held to account for those decisions.
So the ability to do things faster, while gaining greater visibility so you can report back, is becoming the key. It is not just about how fast we can move through our development life cycle. The problem is that, within financial services organizations following the Royal Commission, getting the executives on board and comfortable with allowing the process to move into the next phase, or to be released to market, is becoming the limiting speed factor. The way the tools can capture that information and present it in a consumable format is the key.
What we see today is a lot of disparate systems: testing systems, automation systems, performance systems, compliance systems, risk systems, security systems. They are all happening independently. So you take a piece of technology from conception through to production, and you have to pass through so many gates. It’s not just about writing the piece of code and making sure it does what it is supposed to do. There are a number of other pieces to that process now.
All these pieces used to be independent. So you get your reports from the security team, ‘Yup, it’s now compliant’. You get the functional people saying ‘it does what it’s supposed to do’. The performance people, the risk people, all these people are assessing this through the life cycle. They would compile these things individually, project managers would bring them to steering committees, and people would read through weighty tomes of documents. At the end of all that, some summary goes up.
That’s got to happen almost in real time and must be integrated and those reports need to go up more in dashboard styles, asking, ‘Can I proceed to the next phase of my sprint?’ Boom, boom, boom, yes, yes, yes. That’s the challenge you’re facing to get things working today if you want to work fast.
How the merging of disparate skill sets will define the QA of the future
Some time ago, there were a lot of conversations about the magic that comes together when disparate skillsets meet. I believe that is definitely playing out in technology today. The days of saying ‘I’m just a developer of this’ or ‘I’m just a tester of that’ or ‘I’m just a technology compliance person, I only need to understand compliance’ are gone. The need to understand the life cycle and how things work in order for you to fulfill your part of the jigsaw puzzle is key.
So what will become the skillset of a tester in the future? I think it will be very different to what it is today. It is pretty much the same question facing people in every technology role going forward.
And we see that changing all the time. You constantly see the reports asking, ‘What’s the changing role of the CIO? What’s the changing role of the tester? What’s the changing role of the developer?’ They will morph into other things.
They will morph into other things due to all of the obligations on everybody. It used to be you would just have to be a technologist and manage an asset. Now they need to be semi-legal because they need to understand the legality around the contracts they’re supporting with the people they are leveraging.
They need to understand their APRA obligations in terms of what they are supposed to be delivering from what they are held accountable for. They need to understand all of the nuances around security vulnerabilities and how they’re going to maintain their assets to ensure that they meet those conditions.
And they also have to understand application development, integration testing, and performance. Everybody’s world is becoming far more complicated and far more integrated as a result of where we’re trending, where we’re going.
QA will definitely continue to exist, but it’s going to become more complicated because that team won’t just be providing the assurance over the functional aspects of it. It will include the non-functional aspects and all the other dimensions that have to be accounted for. They will be looking to the assurance people and saying, ‘Prove to me that we have assurance done on everything’. It’s not satisfactory to say, ‘Part of this is good.’
The leader of that assurance team has to account for the whole lot, and they are going to tell the team to go out there and assure that all obligations are being met — not just the traditional paths it used to assure.
For additional insights from quality leaders, read the complete 100+ page Capgemini report, Reimagine the future of quality assurance.