Yesterday I had one of those interesting experiences. When I say "interesting", that's often a euphemism for "disquieting discovery that I need to do something better". Yeah, that's exactly what I mean by it.
Case in point: we had a convoluted story with a lot of test examples to work through. We raked this sucker over the coals, and we found all sorts of stuff that was fixed and refactored; as one of our developers admitted, "it took me longer to explain what we did than it took me to write the whole thing". It's a good bet that, in those circumstances, you'll find holes. I was prepared for holes. I wasn't prepared for the question I received, though...
"So, how can I make a clear and specific choice that shows the effect of this change, and in a way that I can explain it to a customer?"
It seemed so simple, such a clear and easy thing to do... and yet I realized I was struggling with it, because our Quality Director, as he walked me through this particular scenario, kept showing me things that would effectively get in the way of what we wanted to demonstrate (the way browsers cache and store items, the differences between browsers, and so on). Each time, I felt my confidence fade a little more. I went from being very confident that we were on good ground with our testing to feeling rather uneasy. What did we miss?
It wasn't what we missed, but what we didn't articulate. The developer and I had gone back and forth on the story so much that, at the time, we knew we'd run it through its paces multiple times. What I didn't do was give a clear enough indication of what, specifically, could be told to our support engineers so that, when explaining the story's changes, they could easily lay out the pros and cons for our customers.
I mention this because there's sometimes a danger of losing the balance between being lean and fast, with "just enough documentation" to be effective, and explaining enough so that those who see stories after we have finished them have the right amount of information to likewise be effective. The key takeaway from this experience was to ask: "Can you, in a short and succinct way, explain to anyone who reads this story days, weeks or months after the fact exactly what the story is accomplishing, and how to show, on the system, exactly what a change is doing? If you can do that, then you can go lean and mean with documentation. If you can't, then adding more and more documentation will probably not make the story any more understandable."

This hearkens back to my comments a few days ago about "implicit instructions". We know what they are, the developer may know what they are, and the Product Manager may also know what they are. If, however, any key individual in that line and beyond does not know, then there's a problem, and it can be exacerbated the farther we get from the implicit understanding. It just goes to show that even veteran testers may have to do a gut check every now and again to make sure that they can simply and directly explain what something does, and be sure that it really does do what we think it does.