Monday, September 16, 2013

Reduce Biases: 99 Ways Workshop #83

The Software Testing Club recently put out an eBook called "99 Things You Can Do to Become a Better Tester". Some of them are really general and vague. Some of them are remarkably specific.


My goal for the next few weeks is to take the "99 Things" book and see if I can put my own personal spin on each of them, and make a personal workshop out of each of the suggestions. 


Suggestion #83: Reduce biases & unintentional blindness. - halperinko


We all come into interactions with others carrying our own view of the world. That isn't a criticism; it's a fact of life. Each one of us has an exquisitely developed map of the world, drawn over a lifetime. Every experience we have had, or will have, etches lines, footnotes and guideposts onto that map. Just as traveling colors our perceptions of where we will go and what we will do, our mental map is colored by where we have been and what we have seen in the past. At root, this is where bias comes into play. 


Bias is seen as a nasty word. We recoil automatically when we hear that someone has a "bias", because we associate it with much larger issues like bigotry, racism and sexism; many people see them as one and the same. They are not, but ordinary biases are still detrimental to an objective view of situations and to our understanding of them.


Inattentional blindness affects all of us, and we like to think we would notice it when it happens. Many of us have seen the video with the "Moonwalking Bear", but it goes beyond that. At some point cognitive overload sets in, and no matter how well tuned we are to "see what others don't see", we will miss something.


Workshop #83: Study up on bias and how it can affect you. Look for ways that you can "un-bias" whenever possible. Consider avenues and ways of examining a product that can limit inattentional blindness, or at least help make one aware of it. 



Bias is a huge topic. There are numerous biases we can fall into without even realizing it. The ones most likely to affect us as testers are what are referred to as the "cognitive biases".


Below are some quick examples of common cognitive biases that come into play when testing software.




- Attention Bias - we focus on something to the extent that we miss seeing something else (i.e. inattentional blindness).

- Confirmation Bias - we see things in a way that tends to reinforce our own views.


- Consistency Bias - if something has happened before, we think it's likely to happen again.


- Distinction Bias - we tend to see things as desirable when viewed together vs. when they are viewed independently.


- Illusion of Control - we create a mental model of how things work, and we get enough confirmations over time that we believe our model is "correct", even if it is not. This becomes an issue when we try something and do not get the results we expect; the entire model can fall apart at that point, and we have to start again.


For a much larger list of biases, check out the Wikipedia article here.


Understanding biases is important, and can be the strongest aid in making sure we don't fall prey to them in the first place. It's important that we challenge our beliefs, that we try our best to stand on "scientific ground" wherever possible, that we look at situations dispassionately, and that we consistently ask questions and do all we can to avoid assumptions.


The "illusion of control" can hit us at any time, so vigorously try to disprove an idea rather than seek to prove you are right. If you are operating on a faulty mental model, this will likely surface the problem faster than testing only things that support your model.
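A quick sketch of what "try to disprove rather than prove" can look like in practice. Everything here is hypothetical: `apply_discount` stands in for any function we hold a mental model about (in this case, "the price can never go negative"), and the probe list deliberately includes boundary and out-of-range values rather than only confirming inputs.

```python
# Hypothetical example: our mental model says the discount function
# never produces a negative price. Instead of running a few "happy"
# inputs that confirm the model, we actively search for a counterexample.

def apply_discount(price, percent):
    """Toy function under test (hypothetical)."""
    return price - price * percent / 100

def find_counterexample(probes):
    """Return the first input that disproves the model, or None."""
    for price, percent in probes:
        if apply_discount(price, percent) < 0:
            return (price, percent)
    return None

# Confirming inputs alone would all pass; the disconfirming search
# mixes in boundary (100%) and out-of-range (150%) values on purpose.
probes = [(100, 10), (100, 100), (100, 150), (0, 50)]
print(find_counterexample(probes))  # -> (100, 150)
```

If only the first two probes had been chosen, the faulty model ("this can never go negative") would have survived another day; the deliberately hostile probe exposes it immediately.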


If we are concerned about attention, we need to practice shifting focus, or defocusing, to see a larger picture. We need to remind ourselves to change focus within an app, not get too narrow, and not consider steps only within a "perfect path" framing. We can also benefit from noticing how many systems are interacting at any given time.


For confirmation bias, make a personal rule: for each avenue you choose to explore, consider an alternate and opposite view. If you are asked to see whether something works, start by trying to make it fail. If you believe something will happen when you do X, add Y into the mix and see if the outcome is the same. Deliberately look for other avenues where something might be used, and steer clear of "mandates" whenever possible (some mandates you may have no control over).
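The "add Y into the mix" idea can be sketched as follows. This is a hypothetical scenario: `strip_comment` is a stand-in for any feature we believe works ("doing X" is stripping a `#` comment from a config line), and the extra factor Y is a quoted value that legitimately contains `#`.

```python
# Hypothetical example: we believe stripping a "#" comment from a
# config line always yields the clean setting. A confirming test uses
# only inputs that match the belief; the disconfirming test mixes in
# an extra factor -- a quoted value containing "#" -- and rechecks.

def strip_comment(line):
    """Naive implementation under test (hypothetical)."""
    return line.split("#", 1)[0].strip()

confirming = strip_comment("color = red  # default")  # fits the belief
disconfirming = strip_comment('label = "tag #1"')     # adds factor Y

print(confirming)     # prints: color = red
print(disconfirming)  # prints: label = "tag   <- the belief breaks
```

The confirming input would have let us report "it works"; the opposite-view input shows the quoted `#` is mangled, which is exactly the kind of finding confirmation bias hides from us.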


Bottom Line:


Bias is everywhere, and we all fall prey to it from time to time. We can't completely wipe bias out of what we do, but we can recognize it exists, and we can ask ourselves frequently if we are potentially being influenced by it. By paying attention, and considering that we might be missing something, we can then counteract it, and stop it from progressing. It takes time and practice, but awareness is the first step, so become aware, and then act accordingly :).
