Wednesday, December 4, 2013

Live from Climate Corp, it's BAST!!!

Another day, another Meetup, and this time, I don't have to present, so I can do what I usually do at these events, which is stream whatever flows into my brain and capture it in bits to share with the rest of you.


Again, as in all of these events, this will be raw, disjointed and potentially confusing, but the benefit is you get to hear it here and now as I hear it (well, for the most part). If you want fully coherent, wait a while ;).



So for those curious, here's where we are at tonight, and what we are covering:



"Productivity Sucks, How About Being Effective" an evening with Jim Benson

Wednesday, December 4, 2013

6:00 PM - 8:30 PM

The Climate Corporation

201 3rd Street #1100, San Francisco, CA


Jim Benson, the co-author of "Personal Kanban", and a contributing author of "Beyond Agile: Tales of Continuous Improvement", is here to talk about the myths surrounding our work and how we think of it, specifically around how we determine what is productive and what isn't. Tonianne DeMaria Barry, the co-author of "Personal Kanban" (and his partner at Modus Cooperandi) will also be sharing some of her experiences from a variety of successful "kaizen" camps that have been held around the world.

What we are hoping to do with tonight's talk (and several more in the future) is to expand the range of topics that get covered in a typical software testing Meetup. Our goal is to help develop a broad cross section of skills for testers, not just the nuts and bolts of direct and specific testing skills or programming/toolsmith topics (nothing wrong with those, of course; we have them, too).

At the moment, though, we are eating food, drinking beer, wine and soft drinks, and conversing. Thus, I feel it vital to schmooze and welcome our guests, but I will be back shortly ;).

---

Jim started our talk tonight by describing how he was able to set up a variety of opportunities selling Agile methodologies to organizations (businesses, government, etc.), and how he came to realize that many of those Agile methodologies were, well, problematic. While working through some of the issues, he and his colleagues opted to apply Lean principles and, in the process, developed a variety of methods around a "kanban" system ("kanban" being a Japanese term that means "ticket" or "the card that moves"). Anyway, that's Jim, and that's what he said he wanted to get out of the way right now.


What he really wants to talk about is the fact that, if you work for a team or company that prides itself or markets itself as a "highly productive team", it's very likely that you are working in the worst environment possible. Wait, what?! Why would that be the worst possible thing?


Part of the reason is that with that "high productivity" comes a lot of gamesmanship. It's also incredibly subjective; productive according to whom? Do they mean the team as a whole? Do they mean the development methodology? Do they mean how much they push out? Who is defining or describing the "productivity"? We love to believe that everyone is on the same page at the same time, and that everyone is working in tandem.


Anyone who is in testing knows that this is rarely the case, unless you are fortunate enough to have the opportunity to pair with the developers as they code, riding along as a testing navigator (and yes, I do that from time to time, but not always, and not nearly as often as I would like to). More often than not, our stories get dropped on us in a group at the end of the coding time, and testing then spins up and frantically tries to get the testing done.


Jim makes the point that we are seeing an increase of productivity in some aspects, but a proportional decrease in actual effectiveness: much is getting done, but little is being accomplished (or it's overloading the downstream parts of the process, i.e. those of us in testing or ops).

So how can we solve this? First, recognize that productivity silos exist, and that they are evil. No matter how much functionality is sandwiched into one role, and no matter how productive that role is, it cannot by itself increase the entire team's ability to produce, release, or deploy, because while one group is hyper-optimized, other groups are woefully under-prepared and over-burdened; they do not have a complementary option. Think of trying to fit a 12-foot-diameter water pipe and its flow through a connecting pipe that is only three feet in diameter. It doesn't matter how much you put into the 12-foot pipe; the three-foot pipe is going to be a bottleneck.
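The pipe analogy can be made concrete with a toy calculation. This is just a sketch of the idea, not anything Jim presented; the stage names and capacities are hypothetical, invented purely to illustrate the point that end-to-end throughput is capped by the narrowest stage.

```python
# Toy model: a pipeline delivers no faster than its slowest stage,
# no matter how hyper-optimized the upstream stages become.
# Stage names and numbers below are hypothetical.

def pipeline_throughput(stage_capacities):
    """Stories per week the whole pipeline can actually deliver."""
    return min(stage_capacities.values())

stages = {
    "development": 40,  # the 12-foot pipe
    "testing": 12,      # the 3-foot connecting pipe
    "deployment": 25,
}

print(pipeline_throughput(stages))  # 12: testing is the bottleneck

# Doubling development's "productivity" changes nothing downstream:
stages["development"] = 80
print(pipeline_throughput(stages))  # still 12
```

The point of the sketch: making one silo twice as "productive" leaves delivery untouched unless the bottleneck itself is addressed.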

Think DevOps... and before anyone thinks of "DevOps" as a team, it's not. It's a mindset and an approach. The goal, though, is for all of the teams to get the optimization that the programming group has. For the effectiveness of such a team to improve, all of the connection points need to be addressed, and all of the players need to be on the same page. That could mean many things. Sometimes it means that some of the programmers are going to be up in the middle of the night when something goes wrong ;). Silos are easy to talk about, but very hard to optimize and balance in reality.

"Productivity is just doing lots of stuff". Actually, Jim used a more colorful metaphor, but you get the point ;). Bad productivity is a reality. Lots of stuff is getting done, but is it really worthwhile? Is the chase for the almighty "velocity" really a worthwhile goal? Are we actually adding to the value of what we are creating? Or are we creating technical, intellectual, and expectational debt?

One of the goals behind Personal Kanban is to make it a pull system, where we grab what we can work on as soon as it's available. There are a variety of impulses that drive this. Demand pull, of course, is the market telling us "hey, we want this, please make it for us". Internal pull is when the internal voices of our companies say "we need this, and fast" without any correlation to what our customers want. Aim for the former, and let's do all we can to resist the latter.
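For those who haven't seen a pull system in action, here is a minimal sketch of the mechanic, in the spirit of Personal Kanban's work-in-progress (WIP) limits. This is my illustration, not Jim's material; the class and task names are invented for the example.

```python
from collections import deque

# Minimal sketch of a pull system: work enters only the backlog,
# and nothing moves to "doing" until there is capacity for it.

class PullBoard:
    def __init__(self, wip_limit):
        self.backlog = deque()
        self.doing = []
        self.done = []
        self.wip_limit = wip_limit

    def add(self, task):
        # Demand lands in the backlog; nobody pushes work onto a person.
        self.backlog.append(task)

    def pull(self):
        # Pull only when under the WIP limit: finish something
        # before starting something new.
        if len(self.doing) < self.wip_limit and self.backlog:
            task = self.backlog.popleft()
            self.doing.append(task)
            return task
        return None

    def finish(self, task):
        self.doing.remove(task)
        self.done.append(task)

board = PullBoard(wip_limit=2)
for t in ["story A", "story B", "story C"]:
    board.add(t)

print(board.pull())  # story A
print(board.pull())  # story B
print(board.pull())  # None -- at the WIP limit; finish something first
board.finish("story A")
print(board.pull())  # story C
```

The refused third pull is the whole point: the limit makes overload visible instead of letting "we need this fast" silently pile work onto people.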

One of the real challenges of what we produce in a software-centric world is that our "product" is extremely ephemeral. What we produce is difficult to visualize. Because of that, we have to be mindful of exactly what we are making, how that stream is created, and what has to happen from start to finish. The tricky part is that, more often than not, the initial value stream starts from the end and works its way backwards. What results, very often, is that we are missing stuff, and we don't know what we are missing, or why. If we are focused on productivity, we have no incentive to seek out the holes that exist. To be effective, we need to be aware, as much as possible, of where the deep holes might actually be, and find them. Quickly is best :).

One of the concepts that we bring with Kanban, and that comes from the world of manufacturing, is the idea of "inventory". The goal is to have the right amount of materials on hand, but not too much inventory. Think of an auto manufacturer that produces tens of thousands of cars, only to discover that the market has no use for them, and therefore it has produced tens of thousands of cars that no one wants. We'd see that as suicidal. Well, that works with code, too. When we write code that is bloated, filled with features that no one really wants, we are doing the same thing. We don't want to write a lot of code; we want to write the right code, and make sure that the right code WORKS!!! Note: this is not an Agile thing, or a Lean thing, or a Waterfall thing. It's a human thing, and we all feel its effects.

OK, so we have your attention. Productivity bad, effectiveness good. That's great, but how can we use this in our testing? How can we recalibrate ourselves to a mindset of effectiveness? The most difficult thing is that, when we find we have a backlog of "done" things, rather than pushing downstream to work harder and process more, we need to actually stop doing things, examine the items that might be "done", and see if, indeed, they really are. Think of a story delivered two weeks ago that testing just got to today, and in the process, they found a bug. What does the programmer do? Very likely, they have to go back two weeks in memory to recall what they actually did, and that context switch is expensive. Compare that to programming and testing happening in a tight feedback loop. Which approach do you think will be more effective? Why?


Chances are it's the latter, because if we can address something shortly after we worked on it, we are fresh and remember where we were, what we were doing, and how we might be able to address it. Delays make that process much harder. Therefore, anything that creates a delay between Done in one sphere and Done in another will cause, surprise, more delays! The best thing we can do is build a narrative of our work that is shared and coordinated. If we realize that we have ten stories in the backlog, the best thing we could do is stop adding to the backlog.

Jim brought up the Buddhist concept of "mindfulness", and the ability to be mindful of the shared story comes to the fore when we are not overloaded and focused on production at the expense of everything else. Avoid becoming task focused; aim to be situationally aware. Specifically look for opportunities to collaborate. That doesn't necessarily mean "pair programming" (it might better be described as "teaming"), but it does mean trying to find ways to leverage what each other is doing, not just to get stuff done, but to make it more likely that we are doing the right things at the right time, in the most effective way.


One very special aspect that needs to be addressed... there are things that are just plain "hard". Very often, the way we deal with hard problems is to move over to easier things to work on, and when we get them done, we feel accomplished. It's human nature, but it doesn't address the fact that the hard stuff is not getting dealt with. The danger is that, at the end of the sprint/iteration/cycle/etc., all that's left is the hard stuff, and now we risk not being able to deliver because we didn't address the hard stuff until we reached a point of no return. When we get to things that are complex, it doesn't necessarily mean that we have failed. It may just mean that we need more than one person to accomplish the task, or that we need a more thorough understanding of what the actual goal is (it may be vital, or it may not even be necessary, but we will not know until we dig into it, and the longer we wait to make that dig, the less likely we are to get a clear view of where on the spectrum that "hard problem" lies).

Back to testers in this environment. How can we help move towards effectiveness? Put simply, get us involved earlier, and have us test as soon as humanly possible. We don't have to test on finished code. We can test requirements, story points, and initial builds to check proof of concept. Sure, it's not finished, it's not elegant, but we don't really care. We don't want to polish; we want to check for structural integrity. Better to do that as early as humanly possible. We can focus on polish later, but if we can help discover structural integrity issues early, we can fix them before they become much more difficult (and time sensitive) later.

---

My thanks to Jim and Tonianne for coming out tonight to speak with us, and thanks for helping us cover something a bit different than normal. I enjoyed this session a great deal, and I hope we can continue to deliver unique and interesting presentations at BAST Meetups. My thanks also to Curtis and Climate Corporation for the space and all the leg work for making this happen. Most of all, thanks to everyone who came out to hear tonight's talk. You are making the case for us that there is definitely a need for this type of information and interaction. Here's wishing everyone a Happy Hanukkah, Wonderful Winter Solstice, Merry Christmas, Happy New Year, and any other holiday I might be overlooking; we wish you the best for the rest of this crazy, active month. We look forward to seeing you all again sometime in mid-January, topic to be determined ;).
