Wednesday, January 25, 2012

My "Testing Process": A Meandering Walk Through Twenty Years of Software Testing




"You take the blue pill - the story ends, you wake up in your bed and believe whatever you want to believe. You take the red pill - you stay in Wonderland and I show you how deep the rabbit-hole goes." -- Morpheus (Laurence Fishburn), The Matrix


Yes, you are getting two posts from me for this series today :).

I'm taking a bit of a diversion, in the sense that I was asked what my testing process has been over the years.


I could post a glib answer and say, in the truest context-driven manner, "well, it depended on the situation". Honestly, I wish I could say that and mean it, but 85% of the time, I'd be lying.


I wish I could say "I used a naturally developing heuristic model that examined different aspects of a product based on stakeholder values". Yeah, sort of, kind of, but that's on reflection of what I know today. I wasn't doing that cognitively, or with any sense of awareness at the time.


I'd love to say "I used an exploratory model and dug into the various areas of the product to learn about its capabilities". Meh, sorta'... but again, I wasn't aware that I was doing that. In hindsight, of course, I did, a little, but I had no vocabulary to describe it in those terms.


So what was my testing process? I hope you are ready, 'cause here it is!

In black and white, from 1991 until 2009, the Michael Larsen Testing Process was:


1. I diagrammed test steps based on what I was told the feature was.

2. I performed those test steps.

3. If it looked OK, I moved on.

4. If it didn't, I filed a bug, then followed up later to check whether it had been fixed.

5. I repeated those steps manually until I got so sick of doing it that I figured out some way to get the computer to help me run them (computer aided testing, automation, call it what you will), either by myself or, more often, with the help of developers or automation testers who had that job (there's a sketch of the flavor of it just after the big reveal below).

6. Repeat steps 1 through 5 for 18 years.


TAA-DAA!!!
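To put step 5 in concrete terms, here is a rough sketch of the kind of thing I mean by "computer aided testing": the same scripted steps I had been walking through by hand, handed off to the machine. To be clear, the endpoint, field names, and expected values below are entirely made up for illustration; this is the shape of the thing, not code from any actual project I worked on.

    # A minimal, self-contained sketch of what step 5 amounted to: the same
    # scripted steps I had been walking through by hand, handed to the machine.
    # The endpoint, field names, and expected values are all hypothetical.

    import json
    from urllib.error import HTTPError, URLError
    from urllib.request import Request, urlopen


    def check_login_service(base_url):
        """Steps 1-3 from the list, as a script: exercise the feature,
        compare what comes back to what the spec said, and collect anything
        that doesn't match."""
        problems = []
        request = Request(base_url + "/api/status",
                          headers={"Accept": "application/json"})
        try:
            with urlopen(request, timeout=10) as response:
                body = json.loads(response.read().decode("utf-8"))
        except (HTTPError, URLError) as error:
            return ["could not reach the service: %s" % error]

        # "If it looked OK, I moved on" -- here, OK means the scripted
        # expectations hold.
        if body.get("service") != "login":
            problems.append("expected service 'login', got %r" % body.get("service"))
        if body.get("status") != "up":
            problems.append("expected status 'up', got %r" % body.get("status"))

        return problems


    if __name__ == "__main__":
        # Step 4, more or less: anything that doesn't match becomes a bug report.
        for problem in check_login_service("http://staging.example.com"):
            print("FILE A BUG:", problem)

Nothing clever, and nothing a developer or a dedicated automation tester couldn't do far better, which is exactly why, more often than not, they were the ones who ended up doing it.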


Isn't that sad? Isn't that pathetic? Shouldn't I feel ashamed of myself?

Yes, Yes and YES!!!

I'd feel even worse were it not for one very large, lamentable fact... that's 90% of what testing *IS* in most organizations out there. That very testing process I just described is what most testers do, and if you don't, it's because you have "taken the Red Pill" and decided to see how deep the rabbit hole goes. You have woken up, and you recognize what Michael Bolton calls "checking" and what James Bach calls "fake testing". I called this "my testing life" for 17 of those 20 years.


Generally speaking, I have worked for command and control organizations over the years. By design or by tradition, that's just where I tended to work. When I was at Cisco, ISO 9001 certification demanded it, so we complied. We wrote voluminous test plans, we spelled out everything we did, we scripted every step, we made matrix tables and checked off columns, and we did it over and over and over again. At Connectix, we did what we could to be compliant with Apple and Microsoft, so we made test plans, we spelled out our steps, and we did them again and again. At Synaptics, we wrote up proposal documents and we wrote out everything we did for regulatory and manufacturing compliance, and we followed the plan because we had to as part of our contract obligations with hardware manufacturers. At Konami, we had to be very specific with our test steps and the way we reported bugs, because we had to make sure two very different languages (English and Japanese) could communicate effectively. At Tracker, because of legal requirements and the nature of working in the legal industry, we had to show our test plans and what we were actually doing, and stack up books' worth of test evidence to show what we knew and when we knew it.

It was only in my last year at Tracker that I started to say "now come on, there's got to be a better way than this!" In the Fall of 2009, I took my Red Pill, and I decided for myself "I want to see how deep the rabbit hole goes".


Only now, at SideReel, am I working with a company that has basically said "stop with the voluminous test plans. You don't need them. The story is the spec. The story is the test plan. Go outside of the boundaries to test, look at other options. Feel free to approach your testing from different perspectives. Go and have a play. Learn from the system and adapt your approach. Be open to new information from unusual sources. Be creative in your thinking. You have our permission, and frankly, we'd hope you would do it even if we didn't give you permission!" Other organizations hinted that that was what they wanted, but they had processes in place that implicitly instructed otherwise. SideReel's Agile approach, and their genuine effort to mean it, has been a chance to "work and think differently", to actually do the things I mentioned in the first couple of paragraphs.


Now don't get me wrong, there's still a fair amount of "checking" that happens here, too. It's not that I don't write test plans; I do, but I handle them in a different way, and I'll get to that when my testing journey reaches the SideReel chapter of the story. Sorry to say, y'all will have to wait just a few more days. I hope that's OK :).
