Wednesday, May 28, 2014

Listening to a Cowboy: Live at Climate Corp, It's BAST!!!

Hello everyone, and sorry for the delay in posting. There are a lot of reasons for that and, really, I'll explain exactly why in a lengthy post (or a small series of posts). Tonight, however, I am emerging from my self-imposed exile to come out and give support for Curtis Stuehrenberg and his talk "ACCelerating Your Test Planning".

From the BAST meetup post:

"One of the most pervasive questions we're asked by people testing within an agile environment is how to perform test planning when you've only got two weeks for a sprint - and you're usually asked to start before specifications and other work is solidified. This evening we plan on exploring one of the most effective tools your speaker has used to get a test team started working at the beginning of a sprint and perhaps even earlier. We'll be conducting a working session using the ACC method first proposed by James Whittaker and developed over actual practice in mobile, web, and "big data" application development."

For those not familiar with Curtis (and if you aren't, well, where have you been? ;) ):

Curtis is currently leading mobile application testing at The Climate Corporation, located in San Francisco, Seattle, and Kansas City. When not trying to help farmers and growers deal with weather and changing climate conditions, he devotes what little free time he can muster to using his 15 years of practical experience to promote agile software testing and contextual quality assurance at conferences like SFAgile, STPCon, ALM Forum, and CAST, as well as in publications like Tea-time with Testers and Better Software magazine.

This is an extension of Curtis' talk from the ALM Forum in April. One of the core ideas is to ask, "Can you write your test plan in ten minutes? If not, why not?"

Curtis showed some examples from his own product (including having each of us download the Climate Corp mobile app) and walked us through an example testing scenario and requirements-gathering session. Again, rather than trying to produce an exhaustive document, we had to be quick and nimble about what we could cover and how much time we had to cover it. In this case, we had the duration of the talk to define the areas of the product, the components that were relevant, and the attributes that mattered to our testing.
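
For anyone who wants to see the shape of the output, here's a minimal sketch (in Python) of what an ACC grid can look like once it's captured. Every attribute, component, and capability below is a hypothetical stand-in I made up for illustration, not one of the actual entries from Curtis' session:

```python
# A minimal sketch of an ACC (Attributes, Components, Capabilities) grid.
# All attribute, component, and capability entries here are hypothetical
# placeholders, not the actual ones from the session.

attributes = ["Fast", "Accurate", "Secure"]          # adjectives: qualities the product promises
components = ["Field Map", "Weather Feed", "Login"]  # nouns: major parts of the product

# Capabilities are the verbs: what a component does to deliver an attribute.
# Each filled cell of the grid becomes a short list of testable statements.
capabilities = {
    ("Weather Feed", "Accurate"): [
        "Shows current conditions for the selected field",
        "Refreshes forecast data on demand",
    ],
    ("Login", "Secure"): [
        "Locks the account after repeated failed attempts",
    ],
}

# Walk the grid to get a quick, prioritizable test plan outline.
for component in components:
    for attribute in attributes:
        for capability in capabilities.get((component, attribute), []):
            print(f"{component} x {attribute}: {capability}")
```

The point is that the grid itself is tiny; the thinking is in picking the right adjectives and nouns, which is exactly the part you can do in ten minutes.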

Session Based Test Management fits really well in this environment and helps focus attention for a given session. With a tightly focused mission and a small time box (30 minutes or so), each session lets the tester concentrate on the attributes and components that make sense for that specific charter. By writing down and reporting what they see, testers document their test cases as those cases are being run. Just as importantly, each session surfaces entirely new testing ideas, and those in turn inform other sessions. In some ways, this approach of exploring and reporting simultaneously produces a matrix that is denser and more complete than one generated up front, before any active testing.
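
As a rough illustration of what one of those session records might hold, here's a minimal sketch; the field names are my own and the charter is hypothetical, since session sheets vary from team to team:

```python
from dataclasses import dataclass, field

# A minimal sketch of an SBTM session record. Field names are illustrative;
# real session sheets vary from team to team.

@dataclass
class TestSession:
    charter: str                  # the focused mission for this one session
    timebox_minutes: int = 30     # short, fixed duration keeps attention tight
    notes: list = field(default_factory=list)         # observations written down while testing
    new_charters: list = field(default_factory=list)  # fresh ideas that feed later sessions

session = TestSession(charter="Explore the forecast view's offline behavior")  # hypothetical charter
session.notes.append("Cached forecast is shown with a 'last updated' timestamp")
session.new_charters.append("Explore how that timestamp renders across time zones")

print(f"Charter: {session.charter} ({session.timebox_minutes} minutes)")
print(f"Ideas for future sessions: {session.new_charters}")
```

Notice that the "new charters" list is how one session feeds the next; that feedback loop is what builds out the matrix over time.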

The dynamic this time around was more personal and more focused. Since it was not a formal conference presentation, questions came up more freely, and we were able to address them immediately rather than waiting until the talk was finished. Jonathan Bach's idea of threads was presented and described: how it can capture interesting data and help us consciously stay "on task", while still noting interesting areas to explore later (OK, I piped in on that, but hey, it deserved to be said :) ).
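
Here's a rough sketch of how I think of the thread idea in practice; the observation below is a made-up example:

```python
# A rough sketch of the "threads" idea as discussed: park interesting tangents
# on a list so the current session can stay on mission.

threads = []  # test ideas noticed mid-session but deliberately deferred

def park(idea):
    """Record a tangent without chasing it right now."""
    threads.append(idea)

park("Map tiles look blurry when zooming quickly")  # hypothetical observation
# ...continue testing the current charter uninterrupted...

# At the debrief, parked threads become candidate charters for future sessions.
for idea in threads:
    print(f"Follow up later: {idea}")
```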

It's been a few months since we were able to get everyone together, and my thanks to Curtis for taking the lead and getting us together this month. We are looking forward to next month's Meetup, and we'll share the details as soon as we know what it is (and who is presenting it ;) ).
