
Q&A: Data analytics leader on skills that will outlast the AI revolution, breaking into the field, and what it takes to succeed

A Tricentis data leader explains how AI is reshaping analytics careers—and why SQL, judgment, and business context matter more than ever.

Apr. 07, 2026
Author: Annie Millerbernd

Key takeaways:

  • Effective communication and stakeholder engagement are crucial skills to succeed as an analyst.
  • Even amid AI advancements, technical fundamentals remain the backbone of data professions.
  • While AI can accelerate routine and repetitive tasks, it cannot replace critical thinking and business context.

Data analytics and science professions have undergone a dramatic transformation in the last decade. Demand for data talent has risen steadily, but the skills required to succeed in the field have evolved right alongside it.

And now AI is a defining variable shaping what the next decade of analytics work will look like. It speeds up routine tasks, gives data access to more business users, and raises new questions about what skills will matter most in the not-so-distant future.

For some, that prospect is exciting; for others, it’s unsettling. But as Tricentis’ senior data analytics and science manager Tiarnán Stacke explains, the impact of AI is more nuanced than the headlines suggest. Far from erasing the need for technical fundamentals, he says, AI actually makes them more important.

Stacke’s years of experience across go‑to‑market analytics, data science, and business intelligence have equipped him with a strong perspective on the future of data analytics. We sat down to talk about how his own career evolved, what AI has and has not changed, and what people who want to break into the field should prioritize. Spoiler: using Claude probably won’t be enough to get you hired anytime soon if you can’t write SQL.

Answers have been edited for clarity and length.

Question: Tell me about your journey into and through the profession.

Answer: I started my career in tech support, so not something that’s traditionally data‑focused. It was hands‑on technical work — scripting, log files, understanding SQL back ends of what was driving our applications. It was enterprise software, so primarily dealing with IT professionals in customer companies. That gave me a deep technical understanding of system architectures and how end users interact with tools.

From there, a strong leader in business intelligence gave me the opportunity to move into the data space as a business intelligence analyst. That opened the door for me to work with data across the business — support, IT, and eventually the full go‑to‑market funnel: sales, renewals, marketing.

Today at Tricentis, I cover strategic go‑to‑market analytics and data science. My team sits at the intersection of marketing, sales, and customer success. In combination with our business intelligence team, we build models, perform analysis, and build insight frameworks that help the business make better decisions across the full revenue cycle.

Q: What has kept you in data all this time?

A: It keeps being interesting. I like being close to the business, understanding what truly has impact, and being able to identify when data points might be misleading because they’re entered incorrectly or because there are hygiene issues. Understanding what’s correct but not insightful, and what’s decision‑worthy — that unpacking and building of relationships is what has kept me in the space.

Q: How does your career look different now compared to what you expected?

A: I spend a lot less time worried about technical delivery than I used to. Experience builds familiarity with systems and architectures and how data is generated. That really stops being the hard part. Early on, you handle things transactionally — someone needs something, you figure out where the data lives, you build it, and you deliver it.

The role shifted to being more critical and proactive. Now I spend a lot of time looking at the business, seeing gaps, knowing when the data isn’t showing the full picture, and identifying what questions aren’t being asked that should be. At the start, I thought the job was answering questions, but it’s actually more valuable knowing what questions are worth asking.

Q: Let me ask you the question everyone seems to be asking each other right now: Are you worried AI will take your job, or do away with entry‑level analytics jobs?

A: Personally, I’m not super concerned. Everyone has brief existential crises with AI because the outputs are convincing, but the output really does depend on the context you give it, and in analytics the context is the hard part.

That’s not even a knock on AI; it’s just that the person asking the question needs to know what a good answer looks like. When a stakeholder asks for data, we’re here to understand what they’re looking for and what question they’re actually trying to answer.

I think hiring will definitely slow down for entry-level roles, and we’ll end up with a shortage of people with shorter tenures. Early in your career, a lot of the work is foundational, and there is absolutely a risk of more senior contributors shifting that work to AI.

From my perspective, knowing how to interpret AI outputs and being able to read and understand what it’s doing is the key bit of context. As in, it could generate a query that runs, but do I understand exactly what it’s doing? How is it tying the tables together? Why is it doing that? I think that will be really, really important as a junior to make sure you have that grounding and that understanding.
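A concrete illustration of why reading a generated query matters (a hypothetical example, not from the interview): a query can run cleanly and still be wrong if a join fans out rows before an aggregate. A minimal sketch using an in-memory SQLite database, with invented `accounts` and `contacts` tables:

```python
import sqlite3

# Hypothetical schema: one account with two contacts; revenue lives on accounts.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, revenue REAL);
    CREATE TABLE contacts (id INTEGER PRIMARY KEY, account_id INTEGER);
    INSERT INTO accounts VALUES (1, 1000.0);
    INSERT INTO contacts VALUES (10, 1), (11, 1);
""")

# A query that "runs fine" but joins before aggregating:
# each contact row duplicates the account's revenue.
naive = con.execute("""
    SELECT SUM(a.revenue)
    FROM accounts a
    JOIN contacts c ON c.account_id = a.id
""").fetchone()[0]

# Aggregating at the right grain gives the true figure.
correct = con.execute("SELECT SUM(revenue) FROM accounts").fetchone()[0]

print(naive, correct)  # 2000.0 1000.0
```

Both queries execute without error; only the second is right. That gap between "runs" and "understood" is exactly the grounding Stacke is describing.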


Q: How has AI changed your day-to-day work?

A: It has compressed the low end of the work. Things that took hours and were repetitive or monotonous can now be accelerated. Whether that’s generating queries or pulling together high‑level wireframes, that stuff is so much quicker to get out of your brain and into something that’s 80% of the way there.

But it also highlights the importance of understanding how your data is actually generated in the first place. AI is really good at producing something convincing, but convincing and correct aren’t the same thing.

Ad‑hoc dashboard‑style analysis is changing. More and more business users will continue to try to use things like ChatGPT or Claude to get those answers, which is good and interesting. But analysts need to understand how we can make that process better. That could mean making it clear how users get to the data, or giving the agents enough context that they can actually provide that data accurately.

With that, though, the pitfalls absolutely multiply. Poor integrity of source data, unfiltered data, metrics pulled without context: these become more and more of an issue as users are able to query an LLM ad hoc. That translation absolutely isn’t perfect. It’s always getting better, but I still think it has a ways to go. Understanding where those gaps are is a critical part of what analysts do.


Q: Does the thought that most of your coworkers will be able to query the data in natural language concern you?

A: It’s a serious concern that we’re trying to get ahead of. Users may not know what they’re actually asking or how to interpret the answer. They can be led down the wrong path if they feed data to the model without understanding its context, or if it came from a dashboard that pulls metrics a certain way. Eventually, someone will notice something looks wrong, and it will come back to analysts to fix.

I think there will be a pain period while users figure out what AI can and can’t do, because it does produce such a convincing output. I think analytics and data infrastructure teams are currently trying to figure out how to enable context-heavy AI to provide the right answers, because people aren’t going to stop using them.

Q: What skills will always be essential in the future where business users can query data directly?

A: Judgment about what actually matters. Knowing the difference between where precision matters and where directional accuracy matters. AI is really good at surfacing patterns and accelerating delivery. You throw it files, and it will find trends that it would take an analyst a long time to identify, but it’s not going to tell you whether a metric is worth caring about and how that’s strategically relevant to the business.

There is a real skill in knowing when that exactness is required. Take something like financial reporting: You need to tie that out to the cent. It has to be precise. But if you’re looking at something like a marketing attribution framework, it’s never going to be 100% accurate; it never has been. As an analyst, you need to understand the business well enough to recognize the type of problem you’re solving, and that’s a human call, ultimately.

Q: What problems is AI reliably good at solving today, in your opinion?

A: First‑pass analysis, summarizing, structuring, and organizing. I find being able to brain‑dump a stream of consciousness really helpful. As in, I can have a lot of thoughts on a topic and keep a single chat open where I just dump notes, then at some point ask AI to help me format everything into distinct topics. That saves a ton of time.

You can also pressure‑test arguments by asking AI to play the role of a stakeholder and ask likely questions for an upcoming presentation, for example.

Q: If you were breaking into analytics today, what would you do differently?

A: I would have pushed harder and shared my opinion earlier in my career. Analysts often think the data will speak for itself, but not voicing an opinion can be a disservice. Building fundamentals is important, but once you have stakeholder trust, it’s okay to recommend something.

The difference between producing data or analytics and driving insights is offering a point of view.

Q: For people who want to ramp up a career in analytics, what hard technical skills should they prioritize developing?

A: SQL is still the building block of any analyst. It gets you close to raw data, lets you interrogate it, reshape it, validate it, and catch when something doesn’t look right. Understanding data modeling — how systems relate and why a number is wrong or correct — is essential.

Another technical skill, though it doesn’t feel technical, is understanding how your data is actually created. Sit with accountants. Shadow sales reps. Spend time with campaign marketers. The strongest analysts understand exactly how humans generate the data that ends up in systems. Systems never tell the whole story.


Q: What non‑technical skills matter most?

A: Knowing your audience is always going to be a key non‑technical skill for an analyst. Always lead with the “so what” rather than the methodology. Early on, most analysts present findings in the order they discovered them. If you’re presenting to an executive, for example, you want to give them the conclusion first and provide the evidence if they ask for it. Then tell them why they should care: what business decisions this will influence and what impact it’ll have on the business.

A trap we can fall into as analysts is that we spend weeks working on a single topic, weaving together all these disparate threads, until the outcome starts to feel obvious to us. But remember that not everyone you’re presenting to is living in the minutiae all the time.

Q: Are there skills analysts needed today that didn’t used to be as important?

A: Proactively surfacing insights and having the conversations that make those possible. As AI gets better, users are less likely to come to analysts. That reduces informal conversations where pain points come up. Analysts must deliberately maintain those touch‑points.

AI literacy — knowing how these tools work, how to validate outputs, and how to use them without introducing noise — is increasingly important.

Understanding how to apply automation to test and validate increasingly large data sets is also a valuable skill. AI can introduce what I’d call silent transformation errors, which are hard to detect precisely because the output still looks plausible. Building testing into your pipelines is one of the best safeguards against that, and it becomes more critical with AI as you spend less time in the minutiae of the data.
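A minimal sketch of building such checks into a pipeline. The transformation step and the invariants it is checked against (row counts, preserved totals) are illustrative assumptions, not Tricentis code:

```python
# Sketch: wrap a transformation step in invariant checks so a silent
# transformation error fails loudly instead of looking plausible.

def transform(rows):
    """Hypothetical step: drop test accounts and convert cents to dollars."""
    return [
        {"account": r["account"], "revenue": r["revenue_cents"] / 100}
        for r in rows
        if not r["account"].startswith("test_")
    ]

def validate(before, after):
    # Row count should shrink only by the filtered test accounts.
    expected = sum(1 for r in before if not r["account"].startswith("test_"))
    assert len(after) == expected, "unexpected row loss or duplication"
    # Total revenue should survive the unit conversion.
    kept_cents = sum(r["revenue_cents"] for r in before
                     if not r["account"].startswith("test_"))
    total_after = sum(r["revenue"] for r in after)
    assert abs(total_after - kept_cents / 100) < 1e-6, \
        "revenue total drifted during transformation"

before = [
    {"account": "acme", "revenue_cents": 100_000},
    {"account": "test_demo", "revenue_cents": 5_000},
]
after = transform(before)
validate(before, after)  # raises AssertionError if an invariant breaks
```

The point is not these particular checks but the pattern: state what must stay true across a step, and assert it every run.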

Q: Are there skills entry-level analysts should not spend time on?

A: Don’t build reports no one asked for, hoping someone will notice. Early in my career there was a “build it and they will come” mentality with dashboards. AI makes producing something that looks like analysis easy, so producing something that actually matters is more important than ever.

Q: What helps someone move from service‑desk analytics to a strategic role influencing decisions?

A: Understanding what decision the data is meant to inform. Ask what the stakeholder will do differently if the answer is X or Y. Be in the room where the data is discussed. Early in my career, I’d see weeks of work reduced to one bullet point. It can cause impostor syndrome, but it teaches you the gap between producing data and driving insight.

Analytics has always been about more than the data. It’s about the questions you ask before you start, the relationships you build to understand what the data isn’t telling you, and the judgment to know what actually matters when you find it. AI changes a lot of things, but it doesn’t change that.

Data integrity testing

Learn more about driving better business outcomes with high-quality, trustworthy data.

Author:

Annie Millerbernd

Senior Content Marketing Specialist

Date: Apr. 07, 2026
