
How cognitive biases influence software testing

14 September 2023

What I if told you

…You read that line wrong?

That’s right, take a closer look.

Is this simply an optical illusion? What’s happening in our minds when we fall for these tricks? We breeze past mistakes in text, like the example above, yet we understand perfectly what the sentence means.

We often take mental shortcuts, or heuristics, that can ease intellectual burden and allow us to make decisions quickly and efficiently. Heuristics often lead to cognitive biases, which can in turn lead to errors in judgment — something we want to avoid, especially when building and testing software.

Here, I’ll give a little background on the psychology of this anomaly, expose some common cognitive biases I have experienced in load testing, and offer some practical ways to avoid them.

Biases, biases everywhere

A leading study from a Google research group found that:

For 70% of the mobile landing pages we analyzed, it took more than five seconds for the visual content above the fold to display on the screen, and it took more than seven seconds to fully load all visual content above and below the fold.

What do you think is an acceptable response time for your own website? Is it less than five seconds? Less than seven seconds? Was your answer outside this range? If so, congratulations — you avoided one of the most common cognitive biases in load testing.

The anchoring effect is the tendency to rely too heavily on an initial piece of information, known as the anchor, when making decisions. In the case of the thought experiment I just described, respondents are more likely to give an answer in the five to seven second range. If no anchor is offered, one might choose an entirely different range than Google’s.

Do you also see Google as an authority when it comes to this kind of information? If you do, you are susceptible to a social cognitive bias known as the authority bias. This occurs when we overestimate the legitimacy of the opinion of an authority, and are therefore more likely to be influenced by said authority when making decisions.

Cognitive biases are rooted in the field of psychology but impact our day-to-day work. Knowing they exist and familiarizing ourselves with them can help us avoid costly mistakes in our performance-testing endeavors.

Thinking: fast and slow

Why are we subject to these biases? Simply put, the phenomenon boils down to the interplay between two modes of thinking: Fast cognition, our effortless, almost reflexive thoughts; and slow cognition, our more deliberate, purposeful thoughts.

Fast cognition is more susceptible to cognitive biases — our personal assumptions and predispositions can easily infiltrate those automatic thoughts — yet biases can still be introduced as systemic errors in our slow cognition. Each system of thought overlaps and cooperates with the other, and neither is immune to bias. This is why it’s so critical to develop strategies for identifying and avoiding cognitive blunders.

Common biases and how to avoid them

Hundreds of different biases have been recognized in the field of psychology. I can’t cover all of them in this post, but here is a short list of biases that I commonly encounter in load testing, along with tips on how to deal with them.

Anchoring effect

The best way to counter the anchoring effect in load testing is to always visualize your data.

Demonstrating the importance of visualizing data, the Datasaurus is a popular adaptation of Anscombe’s quartet: its datasets share nearly identical summary statistics, yet visualizing one of them reveals a dinosaur in the scatter plot.
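
To see this effect for yourself, here is a minimal sketch using seaborn’s bundled copy of Anscombe’s quartet (the dataset the Datasaurus riffs on): the per-dataset summary statistics are nearly identical, but the scatter plots are not. In practice you would load your own response-time measurements instead; the dataset and plotting choices here are purely illustrative.

```python
# Minimal sketch: near-identical summary statistics can hide very different shapes.
# Uses seaborn's bundled Anscombe's quartet; substitute your own load-test data in practice.
import seaborn as sns
import matplotlib.pyplot as plt

df = sns.load_dataset("anscombe")  # columns: dataset, x, y

# The numbers alone look interchangeable across all four datasets...
summary = df.groupby("dataset").agg(
    x_mean=("x", "mean"), y_mean=("y", "mean"), y_std=("y", "std")
)
corr = df.groupby("dataset").apply(lambda g: g["x"].corr(g["y"]))
print(summary)
print(corr)

# ...but the plots are not: always look at the data before drawing conclusions.
sns.lmplot(data=df, x="x", y="y", col="dataset", col_wrap=2, ci=None, height=2.5)
plt.show()
```

The same habit applies to load-test results: plot the latency distribution rather than anchoring on a single average, or on a number you read somewhere else.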

Authority bias

Authority bias is often exacerbated by the belief that obedience constitutes correct behavior. This is a good example of how the systemic nature of cognitive biases can affect our slow cognition. As software testers, we can find it especially difficult to question authority.

To counter this bias, identify the assumptions you make when forming hypotheses, and question if any are made with deference to an authority. That authority might be the specification document, the organization you work for, a recognized arbiter like Google, or perhaps even yourself. You should question the authority in all cases, rather than mindlessly deferring to a potentially biased opinion.

Causality

Be aware of the logical fallacy post hoc ergo propter hoc: “after this, therefore because of this.”

Consider the site reliability engineer who observes a spike in database write throughput daily, followed by a brief website outage in the early hours of the morning. The assumption might be that, since the outage follows the database spike, the outage is caused by the spike — in other words, a correlation appears to suggest causality. I often liken this error to chasing symptoms rather than the root cause.

A seasoned engineer can avoid this bias by considering other factors that could potentially be responsible for the results in question, thus ruling out a false connection.
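
As a toy illustration, the sketch below uses synthetic timestamps and a hypothetical nightly backup job standing in for a hidden shared cause. It shows why “the spike comes first” is not evidence on its own: more than one candidate event precedes every outage, so temporal order alone cannot identify the cause.

```python
# Minimal sketch with synthetic timestamps: "A precedes B" holds for both the
# database write spike and a (hypothetical) nightly backup job, so temporal order
# alone cannot tell us which one -- if either -- causes the outage.
from datetime import datetime, timedelta

def precedes(candidates, outages, window=timedelta(minutes=30)):
    """Fraction of outages preceded by a candidate event within `window`."""
    hits = sum(any(timedelta(0) <= o - c <= window for c in candidates) for o in outages)
    return hits / len(outages)

base = datetime(2023, 9, 1, 3, 0)
outages   = [base + timedelta(days=d, minutes=20) for d in range(7)]
db_spikes = [base + timedelta(days=d, minutes=5)  for d in range(7)]   # observed metric
backups   = [base + timedelta(days=d)             for d in range(7)]   # hypothetical confounder

print("outages preceded by DB spike:", precedes(db_spikes, outages))   # 1.0
print("outages preceded by backup:  ", precedes(backups, outages))     # 1.0
```

In a real investigation, you would go looking for that shared trigger (a scheduled job, a cache expiry, a traffic pattern) rather than stopping at the first correlation you find.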

Availability heuristics

These are mental shortcuts that tend to rely on anecdotal examples. For instance, “this has happened twice in production,” or, “the last time we changed this, it fixed the problem.”

These shortcuts can be countered by deliberately engaging your slow cognition. I like to use the following mnemonic device when diagnosing production performance issues under pressure (a small sketch of the last two steps follows the list):

  • Stop and lessen the immediacy of a solution.
  • Think about the problem in the context of what’s changed.
  • Observe the things you can measure, and what needs measuring.
  • Plan your next test, making small, observable changes.
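
One way to practice the “observe” and “plan” steps is to record exactly one deliberate change per test run alongside what was measured, so each run tests a single hypothesis rather than whatever anecdote is most memorable. The helper and log format below are my own invention, not part of any particular tool; treat it as a sketch to adapt.

```python
# Minimal sketch: log one deliberate change per run next to the measurement it produced.
# The helper name and log format are illustrative only; adapt them to your own tooling.
import json
import time

def record_run(change_description, measurement, log_path="run_log.jsonl"):
    """Append one change and its observed result, so each run tests exactly one hypothesis."""
    entry = {"ts": time.time(), "changed": change_description, "observed": measurement}
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")

# One small, observable change per run:
record_run("raised DB connection pool 50 -> 100", {"p95_latency_ms": 412})
record_run("no change (repeat run to gauge variance)", {"p95_latency_ms": 405})
```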

Confirmation bias

This is our tendency to search for and recall information that confirms our existing beliefs or hypotheses. A classic example of this in performance analysis is, “this component has never been a problem in production, so we know it is not the root cause.” The correct conclusion would be, “we expect this component not to be the root cause.” We can avoid this bias by changing the language we use to define problems.

Focusing less on being right and more on what can go wrong with your tests will help you avoid confirmation bias in your decisions and conclusions.

Inattentional blindness

This occurs when we fail to perceive unexpected results that are right in front of us. Many of us have experienced the frustration of spending hours on a performance problem, only to arrive back at the start of the investigation having missed some simple, obvious detail, like a database configured with the default maximum connections or an operating system limited to the default 1024 file descriptors.

I find a good counter to this bias is simply to “sleep on it.” Delay making a decision until the following day, or at least until after lunch. Often, after a night’s rest or a spell away from the keyboard, I return to my desk and immediately arrive at the answer to a performance problem.

Ask yourself, “Am I missing something?” and don’t be too quick to jump to conclusions. There is value in establishing your own mental checklists, since little mistakes, like misconfigured settings, often compound into larger problems. Using checklists helps alleviate the cognitive burden on our slow cognition and gives fast cognition more space to work, with less susceptibility to cognitive bias.
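
One way to make those checklists stick is to automate the boring parts. The sketch below is a hypothetical pre-test checklist covering the two examples above: it reads the process file-descriptor limit from the standard library (Unix only) and takes the database’s max_connections value as an argument, since how you fetch it depends on your database. The threshold values are assumptions you would tune for your own environment.

```python
# Minimal sketch of a pre-test checklist: automate the "obvious" checks so fast
# cognition doesn't skip them. The thresholds below are assumptions, not recommendations.
import resource  # Unix-only standard library module

def check_file_descriptors(minimum=4096):
    soft, _hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    return ("file descriptors", soft >= minimum, f"soft limit = {soft}")

def check_db_max_connections(current_setting, minimum=200):
    # `current_setting` would come from your database, e.g. the result of SHOW max_connections;
    return ("db max_connections", current_setting >= minimum, f"setting = {current_setting}")

for name, ok, detail in [check_file_descriptors(), check_db_max_connections(100)]:
    print(f"[{'OK' if ok else 'CHECK'}] {name}: {detail}")
```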

How to ease the burden

Those of us who work in software engineering and related disciplines are bound to encounter a range of cognitive biases on a regular basis. The strategies I’ve outlined can help mitigate the threat of these lapses in judgment that can lead us to draw erroneous conclusions, but it’s also important to choose software that helps alleviate cognitive burden without the hindrance of cognitive bias.

Tricentis’ continuous testing platform offers end-to-end insights for software testing with every release, and enables us to make better decisions about software quality. The most powerful tool we have when it comes to testing software is brainpower, but good backup never hurts.

***

This article was originally published on the Atlassian Blog.

