Thursday, April 25, 2013

Final Day of #STPCON (Live Blog)

Today is bittersweet. It's the last day of STP-CON, and the last day of what has proved to be an intense, interesting, fascinating, and thoroughly enjoyable few days. I love the opportunity to get together with people from different industries, experiences, and world views, to learn about what works and doesn't work for them, and to get advice about what is and isn't working for me. Sleep tends to become a very limited resource, but that's because the time spent after talks and workshops conferring with others is what makes these events so valuable.

If I can offer any one piece of advice to anyone who attends a conference, it's this: make sure that your emphasis is on the "conferring", and understand that most of the real learning, breakthrough moments and epiphanies will not come in Q&A periods in the closing minutes of a presentation. They will come in laughter and discussion over Red Sauced Pork Barbecoa at a wonderful Mexican restaurant, or in watching your conference mates look at you with both exasperation and understanding as they fall prey to another round of "The Pen Test".

Yep, this morning will be the closer, and more to the point, I will be the closing speaker. Well, OK, not exactly the closing speaker, but I'm in the last track session slot along with four other topics. Part of the dynamic today is going to be that the back of the room will be filled with suitcases and duffel bags, and people who may be asking me subtly with their eyes "ummm, are you gonna' wrap up a little early by chance, because I've got a flight to catch?!" My answer is "I'll do my best, but I hope that I can offer something worthwhile enough to make you want to stick around at least until the end of the session." For obvious reasons, I will not be able to live blog my own session (maybe someone else will help me do that via Twitter and I'll pull in their comments), but I will post a synopsis after I'm done.

With that, I need to go and check out, as well as get set up for the closing three speakers I will be attending. Lynn McKee and Matt Heusser, y'all best get ready :).

Oh, a shout out to Mike Himelstein, a new friend from Atlanta. He's been drawing little sketches of the attendees and speakers, and he shared his "vision of me" while I've been here...



I think I have a new avatar :).



-----

And we're back. Breakfast is over and we are now getting underway with Lynn McKee. For those who don't know Lynn, I'm happy to count her as both a terrific colleague and a great friend. Lynn invited me up to facilitate the POST 2012 peer conference in Calgary, and additionally, she helped me complete a bucket list item by taking me snowboarding at Banff. Oh, and she's also an excellent Agile Coach and team mentor, which is part of why she's able to speak here and now. I'm excited that Lynn is getting a Keynote, though I must say I'm slightly sad that her "wonder twin" isn't here to see it (Nancy Kelln, we miss you!).


Change is easy, right? Sure it is. Oh, you mean effective change, one that is internalized by your team and organization. Yeah, sorry, that's hard. And frequently unsuccessful. Why? Often, it's because there are conflicting goals and visions. Often it's the blind leading the blind. "We don't know where we're going, but we're making great progress!"

Some people have an appetite for change. People who attend conferences, talk at them, stay up late talking about testing and gadgets, these are all likely early adopters or at least have an understanding or appreciation for change. Others have a different appetite or desire for change, ranging from totally willing to completely unwilling.

For change to happen, we have to create a sense of urgency. For those of us who are testers, what are we willing to do to create that sense of urgency? Ultimately, we need to show the business value to the organization as to why testing not only matters, but is vital. We also need to show what we can bring to the change and what our value truly is.

Additionally, there needs to be a coalition of the willing, and perhaps even the foolhardy. There needs to be expertise in this group so that the change can happen. There also needs to be credibility, and someone willing to run up San Juan Hill or be one of the Light Brigade (hopefully with results closer to the former rather than the latter).

Testers can be transformative by daring to place themselves in the cross hairs. We may need to prove we are willing and able to do what it takes to gain trust, or if nothing else, scare everyone else so badly that they treat us like Oda Nobunaga's soldiers who walk through the forest with their matchlocks lit (Pete, that one's for you ;) ).

Key influencers are important.  They don't have to necessarily be the visible ones, but if you can get those in the organization to believe in your view, and if they believe that what you are striving to offer will bring real value to the organization, that could be the catalyst to make it all happen.

Making testing great is a wonderful goal, it's a terrific vision, but what does it ultimately offer to the organization? If we cannot answer that question, then we are not likely to make headway, even if they can agree that "better testing" is a wonderful goal. Better testing for what? If it's just for the sake of kudos, or for team cohesion, neat goal, but maybe not compelling enough. We're a cost center. Testers don't make money for an organization. Sorry, but unless you are selling test services as your product, no testers at a company actually make cash for the company. What we do provide, however, is a hedge. We can safeguard revenue earned, and we can prevent revenue erosion. Make no mistake, that's huge! What is the value of a thoroughly tested product? It's hard to quantify, but there's no question what the cost of a 1 or a 2 rating in an app store is. THAT is something that the business can understand!

So what are the obstacles that can get in the way of change? People can get in the way. Sometimes WE are those people. Processes can get in the way. Sometimes they are well intentioned but pointless. Sometimes other people's perceptions of us get in the way. Sometimes the technology stack we use can be an impediment (Rails is a great framework, but if your entire infrastructure and product is developed for Win32, that may be a real problem to change).

Buzzwords abound, and often the buzzwords start flying with little understanding as to what they mean. Avoid this. If you use a buzzword, make sure that you are clear on what it is, what it means, and what it's really going to provide. More to the point, what does the buzzword add to the bottom line? Also, if you are going to use mnemonics, make sure that you have spent time to show what they are, what they mean and how they are used. Heuristics are valuable, but if we cannot communicate what they help us do, no one will care or invest the time to make them work. Slogans, jargon, call it what you may, make sure that you are clear as to what they are, what they mean, and what they do.

One of the beautiful tools that Lynn mentions, and one I believe strongly in as well, is the "quick win". Erasing Technical Debt is a lot like erasing Financial Debt. It's a big challenge, and it's a long hard slog. To make things happen, and to build heart and morale, there need to be early wins and quick wins. Pick off a small problem and solve it before taking on the Colossus. Those of us who play video games understand we battle small fry first to level up so we can take on the level bosses later. Short term wins give us strength, flex our muscles, and give us confidence to take on bigger problems.

Trim the fat where you can. Don't focus on processes that hinder you, work on the activities that get you results. Learn where "good enough" really is. Snazzy looking docs are nice, but if you are spending all your time making snazzy looking docs, that is time you could be doing real and valuable testing. Learn what really matters for reporting, and provide just that. I dare you! See what would happen if you gave a trim, slim, solid summary of what you have done and what you have found. See what happens. Will you get yelled at because you didn't attach the cover page to your TPS report? Maybe the first time, but if you do it and show that your testing is happening and you are finding really great insights for the team, I'll bet you that the process will change (and yes, I am willing to take that bet, at least for most organizations).

There is risk in every project, and there is risk in change. How much and to what level varies between organizations, but nothing is fool-proof. We all need to focus on and show that we understand the risks. We have to give the information that will tell people whether we are looking good, in significant trouble, or something in between. It's possible we may stop the train. It's possible the train will keep going. If we have made an impact and allow our executives to sleep well at night, then good going. It certainly beats the alternative.

Change and transformations are not easy. Not if it's going to stick. Not if it's going to really change hearts and minds. Not if it's going to clobber the bottom line. Not if the group doesn't have a long view. Some people will not get on board. Some people might actually leave over the choices made to change. The phrase "you can change your organization, or you can change your organization" really rings true. Sometimes, the change that is needed is that YOU may need to go elsewhere. Are you brave enough to do that? Do you have the will to do that? Are you willing to "fall on your own sword"? Some people are, but many are not. There is a process in forging steel for swords called "the refiner's fire". To purify steel, you have to heat it, beat it, and then plunge it into water to cool and harden it. The steel goes through a lot, but the end result is a hard and strong metal ready to cut through anything.

Lynn, thanks so much, you rocked this :).

-----


Now we are talking about "Where Do Bugs Come From" and Matt Heusser is going to be our ringleader. He's promised to make this different than all other presentations we've seen this week. Instead of a lecture, we're going to have a discussion that everyone can get into.

Often, when we go to conferences, we come back with lots of ideas and enthusiasm, but when we present our ideas, we often get blank stares, crossed arms, and reasons why we can't do that. That happens, so what can we do so that we don't get into that position?

The first thing we need to do is stop asking permission. Just do what you plan to do. Make an experiment out of it. Decide to implement whatever key item you want to do, and figure out how you can put it into play on the first day you get back to work. Don't ask permission. Just go for it. That's at least my plan ;).

It's easy to say we try to find bugs. Every program has bugs. Even "Hello World" has bugs, if you dig deep enough. So yeah, finding bugs is important, but beyond the trivial, where do bugs really come from? In truth, they all come from us, i.e. people. They come from programmers, but that's really only a small part of the story. Hardware can actually cause problems, voltages can flip bits, actual insects can fly into relays (Grace Hopper's legendary "bug" was exactly that, a moth that got caught in a relay and shorted something out).

Bugs are not just glitches in code. They can be glitches in requirements, glitches in behavior, glitches in emotions, and glitches in markets. Seems a bit over-reaching? Maybe, but really, what is a bug? It's something that doesn't work the way we want it to, or someone else wants it to. Can a product work exactly the way it's "intended" and still have a bug? Absolutely! If the CEO decides that the paragraphs in a legal disclaimer need to be reversed, even though it is written as intended, and they decide it needs to be changed, now, then yes. What was working as written can become a P1 bug by "royal decree". Actually, that's not the best characterization. The real reason it was a bug was because there was a hidden stakeholder that wasn't considered.


There are differences in the way that desktop, web and mobile display and process events. We have issues with usability and also with intuitiveness. There are also conditions that bring to light issues that you just won't find unless they are met. How does your app work with a nearly dead battery on a mobile device? What happens when you plug in the power cord to start recharging? How about while you are walking around between cell towers? Different environments can bring to light many interesting anomalies. If we are aware of the possibilities, we can consider them, and perchance find them.

Cultural assumptions often come into play. When localization becomes part of a product, there's a real "lost in translation" issue that can arise. Not lost in translation of requirements, but genuinely an incorrect translation. One of the most interesting things I've seen was when one of my blog posts was summarized on a site in Japan. When I translated the site from Japanese to English, were I not the one to have written the original blog post, I would not have been able to make much sense of what the original article was actually saying. The reason? Not that the Japanese article was wrong, but that the literal translation genuinely didn't make sense to me as an English speaker. The grammar still followed Japanese rules of speech, which frankly just don't survive a direct word-for-word translation. Real localization efforts go deeper than that, of course, but it helps emphasize just how much of an issue this can become if we are not really doing our due diligence.

In addition to understanding where bugs come from, it's also important that we understand the risks that those bugs represent, and we then have to decide if we genuinely care about them. Not all bugs are created equal, and what bothers one group of people may not bother another group at all. It may be so esoteric that the odds of it ever being expressed are 0.0001%. Do I care with my Facebook page? Of course not. Would I care with the computer that controls the landing gear on the airplane that's taking me home tonight? Absolutely!!!



OK, so we have risks. What can we do to mitigate those risks? There are lots of things. We can use prototypes to test ideas. We can manage expectations. We can iterate and examine stories in smaller increments. We can pair testers with devs. We can start early and test requirements. This leads into the "three amigos meeting" model (one that Socialtext uses actively with our kick-offs, I might add; Google "Three Amigos Meeting" and you'll find lots of stuff to look at :) ). The main takeaway for that is "bring the stakeholders together and make sure everyone agrees on the work to be done".


So some takeaways...

- Pick latest critical bugs in production 
- Map them to techniques to mitigate risk
- Discover what you are not doing enough of.

Oh, and just go and do this. Don't ask permission. You don't have to. Just delight them that you are doing it ;).

-----

And now it falls to me. The closer, the last man standing. I'm ready now to address the Lone Wolves, the Armies of One, the Lone Rangers... and hey, if you work with a team, this may still prove to be relevant, because all of us Lone Wolf it some of the time. What craziness will come out of my mouth over the next hour and fifteen minutes? I guess we will just have to see...





[Editors note: Michael is talking right now, but the written words coming your way from here are courtesy Chris Kenst].



Michael's talking about his background, about how he made the switch to an agile environment in January of 2011 and not so long ago moved away from being a Lone Wolf and now works with 5 other testers (for some reason he seems quite excited)!



He's been through the traditional waterfall approach, as a lone tester, and now, as part of a "wolf pack", he's working his way through an agile process. It's all about learning, exploring in small chunks and iterating. This involves Test Driven Development (TDD) to help design the code. The class is asked: does anyone think TDD is testing? No one raised their hands. It's not testing. It's a design tool. In fact, testing (not TDD) starts with testing the requirements, often during the kickoff meeting. During these three amigos meetings the testers ask questions about the requirements in order to understand how the developer interpreted them.
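The "TDD is a design tool, not testing" point is easy to sketch. Here's a minimal illustration of my own (not from Michael's slides): the tests are written first, and each one records a design decision about a hypothetical `discount_price` function, rather than hunting for bugs after the fact.

```python
import unittest

# Hypothetical function whose design was driven by the tests below.
# In TDD, the tests came first; the implementation grew to satisfy them.
def discount_price(price, percent):
    """Apply a percentage discount to a price, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

class DiscountDesignTest(unittest.TestCase):
    def test_basic_discount(self):
        # This test drove the core arithmetic.
        self.assertEqual(discount_price(100.0, 25), 75.0)

    def test_zero_discount_is_identity(self):
        # This test drove the decision that 0% is valid and changes nothing.
        self.assertEqual(discount_price(19.99, 0), 19.99)

    def test_invalid_percent_rejected(self):
        # This test drove the decision to fail loudly on nonsense input.
        with self.assertRaises(ValueError):
            discount_price(10.0, 150)
```

Run with `python -m unittest yourfile.py`. The point of the sketch: the tests document intended design, which is valuable, but actual testing of the requirements ("should 100% discount be allowed? who says?") still has to happen elsewhere.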


In agile teams the whole team is responsible for quality. TDD during design helps identify the right design, acceptance tests are built, and automated testing runs throughout the entire cycle. This can be a good and a bad thing for the Lone Tester: it can dramatically help you cover the system, but it also requires a great amount of time. There's always something to do.

There's always a need for testers on an Agile team. Testing is always happening at all levels, which is a wonderful change (especially when used correctly), but there is no 'quality police' - especially not the testers. This has changed the role of the testers; it requires a variety of skills, like domain knowledge and technical competency, to interact with the development team. According to Bret Pettichord, Agile Testing is the "[h]eadlights of the project - where are you now? Where are you headed?" It can "[p]rovide information to the team - allowing the team to make informed decisions." With the information we provide, managers can make the decisions they need to about the product / project.

The testing we do on projects is a hedge on the risk of the product. Sure, testing is a cost center, but it's a hedge against losing customers because companies don't know anything about the products they are shipping. To help with this hedge, Lone Testers in an Agile development team need to be agile themselves. There's plenty of room for "testing", but we need to broaden our toolkit - we (testers) need to adapt to different expectations. A good example of this is when you move from one software shop to another.

Lone Testers should automate where they can. If you don't have the skills, that's OK. You can start small while you learn. You don't need to create huge amounts of automation, just something along the lines of 'Sunshine tests' (or smoke tests) - a small set of tests that can help you look at the broad picture of the software. This is the perfect thing for a Lone Wolf to attack first. Michael's a fan of the 'when in Rome' strategy, which means you look at whatever languages and tools your development team uses, and you use the same. If they are using JUnit, you can use JUnit for your tests. Then you can share the results or problems with them, and they'll be more likely to help you because they see the common connection.
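To make the "sunshine test" idea concrete, here's a tiny sketch of my own in Python (per the "when in Rome" strategy you'd write it in your team's stack instead, e.g. JUnit). The `DemoApp` class and its checks are stand-ins for your real application's entry points:

```python
# A minimal "sunshine test" runner: a handful of fast, broad checks that tell
# you whether the sun is shining on the build at all. DemoApp is a stand-in
# for the real application under test.

class DemoApp:
    """Hypothetical application under test."""
    def __init__(self):
        self.users = {}

    def home_page(self):
        return "<html>Welcome</html>"

    def sign_up(self, name):
        self.users[name] = {"active": True}
        return name in self.users

def run_sunshine_tests(app):
    """Run each quick check and collect (name, passed) results."""
    checks = [
        ("app starts", lambda: app is not None),
        ("home page renders", lambda: "Welcome" in app.home_page()),
        ("sign-up works", lambda: app.sign_up("alice")),
    ]
    results = []
    for name, check in checks:
        try:
            results.append((name, bool(check())))
        except Exception:
            # A crash counts as a failure; keep running the remaining checks.
            results.append((name, False))
    return results

for name, passed in run_sunshine_tests(DemoApp()):
    print(f"{'PASS' if passed else 'FAIL'}: {name}")
```

The design point is that the suite stays small and broad: it won't find subtle bugs, but a Lone Wolf can run it after every build and know in seconds whether deeper exploratory testing is even worth starting.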

The disadvantage to automation is that it's passive "checking" as opposed to active testing. When you spend time building automation, it means you aren't spending time exploring and/or testing. Tests can quickly become stale, and you have to consistently maintain them.

[Michael: Clarifying. I meant that having the machine just running the automated tests and accepting the passes at face value is passive checking. The process of developing the automation goes through lots of iterations, with its own debugging and learning, so to be clear, the creation process of automation involves a lot of direct active testing.]

Testers are more than people looking through tests, trying to break things. Testers are like anthropologists:
  • Observe the development team
  • Look for feedback from actual users
  • Work with content developers
  • Discover the underlying and unspoken culture
Being a lone tester requires that you are a good communicator. You've got to be able to build bridges in the company, talk with developers and product owners, and this includes all the different languages that each speak. It's about playing well with others (think of pairing). Pairing can be tester/developer, tester/support person, tester/customer, tester/designer. Lone testers should participate in planning and standups with the development team, and they should become domain experts about the customer needs.

Lone testers need to develop their craft. This is hard to do when you are the lone person; you'll have to reach outside your organization. Mentors will come from other sources - sure, they can be a manager or developer if they've got the time to work with you, but otherwise you can reach out to Lone Testers in other organizations and learn from them. Michael is reciting some of the values of the Context-Driven Testing school: good software testing is a challenging intellectual process, requiring judgment and skill exercised throughout the project.

Lone Wolves, you are not alone. (For some reason I hear the Michael Jackson song in my head now. Weird.) You have many allies; you just aren't aware of them. Take advantage of the meetups in your area. Beer, pizza and some similar challenges are the basis for many meetups. In-company people like developers, support people and designers can also provide feedback and support. Remember, everyone on the Agile team does testing - it's not all on you.


... and with that, I'm back. My thanks to Chris Kenst for being "virtual me" for a bit. I'm done. Deep breath... and now the room is being taken apart. This is the official end of the "formal conference", but as we all know, the conference may be over, but the conferring can still happen... and that's where I'm going now. I have a lot of new friends to talk to and learn from.

See y'all later :).
