
Tuesday, October 15, 2019

Being More Agile Without Doing Agile - a #PNSQC2019 Live Blog


Can I share a possibly unpopular opinion? I am not a fan of "Agile".

Now wait, let me clarify. I love BEING agile. Heck, who doesn't? I don't have a problem with the adjective. I have a problem with the Noun.

Also a confession. I'm here mainly because Dawn Haynes is talking. I've known Dawn for years and the irony is that I have had precious few times that I have actually been able to hear Dawn speak. Thus I consider this a perfect blend of opportunity and attitude :).


I like "little a" agility. Again, the actions and abilities. Those are all good things. They are helpful and necessary.  I like being nimble and quick where I can be.

What I have found less appealing is "Big A" Agile, mainly because I find that when organizations try to implement "Big A" Agile, they become anything but "little a" agile.

As a software tester, I have often found that testing is an afterthought in Agile implementations. More times than not, what results are teams that kind of, sort of, maybe do some Agile stuff and then retrofit everything that doesn't actually feel right into a safer space.

Dawn emphasizes that the best way to achieve the goals of "Agile" is to actually "be agile". In other words, forget the process (for a moment) and focus on yourself and what you’re trying to accomplish.

A Comfort Zone is a Beautiful Place but Nothing Ever Grows There

For teams to get better, they have to be willing to go to places they don't really want to go to. There is a fear that going into the unknown will slow us down, will send us down paths we are not thrilled about going down, may not even get us to our end goal quickly. So we put a lot of emphasis on what I call "the priestly caste" and "the temple incantations". I'm not trying to be flip here, I'm saying that there are a lot of rituals that we reach for when we are not 100% sure about what we are or should be doing. As long as the rituals are met, we see comfort there, even if the rituals are adding little to no actual benefit. Are retrospectives helpful? They can be if they are acted upon. If they aren't then it is an empty ritual. Granted, it may take time and commitment to see the results of the retrospective findings and real results may not be manifest for weeks or months. Still, if we do not see that there are actual improvements coming from those retros, what is the point of doing them?

One of the interesting developments on my team related to agility and moving more quickly and effectively was to allow myself to wear whatever hat was needed at the moment. I'm not just a tester. Some days I'm a part-time ops person. Some days I'm a build manager. Some days I'm a release manager. Some days I've been a Scrum Master (and, in fact, I was a dedicated Scrum Master for three months). I was still a tester but I did what was needed for the moment and often that meant not being a "Tester" but always being a "tester"... see what I did there ;)?

Are test cases necessary? It depends on what you define as a test case. In my world, I go through a few waves of test case development. Almost never do I start with some super detailed test case. Typically I start with a 5000-foot view and then I look to get an idea of what is in that space. I may or may not even have a clear idea of how to do what I need to do, but I will figure it out. It's the process of that learning that helps me flesh out the ideas needed to test. Do I need to automate steps? Sure, but generally speaking, once I automate them, if I've done it right, that's the last time I need to really care about that level of granularity. Do I really care if I know exactly every step necessary to complete a workflow down to the property IDs needed to reference the elements? No, not really. Do I need to know that a unique ID name exists and can be used? Yes, I care a lot about that. In fact, that's about the most important finding we can make (see my talk "Is this Testable?" for more of my feelings on this :) ).
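To make that "unique ID" point a bit more concrete, here's a minimal sketch of the kind of check I mean, written in Python with Selenium WebDriver. The URL and the "login-submit" id are hypothetical placeholders, and a local geckodriver/Firefox setup is assumed rather than shown.

```python
# A minimal sketch of the "does a unique ID exist?" check, in Python with
# Selenium WebDriver. The URL and the "login-submit" id are hypothetical
# placeholders, and a local geckodriver/Firefox setup is assumed.
from collections import Counter

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
try:
    driver.get("http://localhost:8000/login")  # placeholder URL

    # Testability check: the control I care about exposes a stable, unique id.
    submit = driver.find_element(By.ID, "login-submit")  # hypothetical id
    assert submit.is_displayed()

    # Broader check: no duplicate id values anywhere on the page.
    ids = [el.get_attribute("id")
           for el in driver.find_elements(By.XPATH, "//*[@id]")]
    duplicates = [value for value, count in Counter(ids).items() if count > 1]
    assert not duplicates, f"Duplicate ids found: {duplicates}"
finally:
    driver.quit()
```

The point isn't the script itself; it's that a stable, unique ID is the thing worth caring about, because everything else can be derived from it later.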

The key takeaway: care more about the work and about being nimble than bowing to the altar of AGILE. I find much to value in that :).

Tuesday, November 14, 2017

The Frustration of "Too Much Choice"

Hello, Internet world. My name is Michael. I'm a serial collector of informational tidbits.

"Hi, Michael!"

Seriously, I recently went through my collections and realized something both frustrating and enlightening. I am a huge fan of Bookmarking and Favoriting (Liking on Twitter now, but I still think of it as Favoriting). In my world, "Favoriting" serves a specific purpose. It's not so much to say "hey, I want to show you I like what you've posted" (though I do that from time to time) but to say "this is something I don't have the time to look at right now, but I really want to look at it later". I subscribe to lots of services that send me emails with cool tips and tricks to test, code, and administer stuff. I have a digital library that has hundreds of titles on all sorts of topics. I have categorized listings of websites, forums and other services that are there to help me learn and do things better and easier.

The thing is, when I get up in the morning and I scan my Inbox, most of the time I just delete the notifications, unless there's something that really piques my interest.

Those links? Rarely visited.
That list of Favorites (Likes) on Twitter? Rarely reviewed.
That massive list of books? It's so big that most titles hide in plain sight.

I remember Cem Kaner saying at one point that having the information doesn't necessarily mean that it will be useful to you at that moment, but being able to reference it and know about it or where to find it is of value. Thus, for many of us, resources are just that, they are raw lumps that are there when and if we need them, but we have to understand what we have access to and when that access is relevant.

For me, I struggle with too much choice. If there are too many options, I simply get overwhelmed and never make a decision. It's all clutter. It's a challenge to organize it. I have a couple hundred CDs and whenever I go on a road trip, I spend a ridiculous amount of time trying to pick something to listen to. Often, I give up and listen to the podcast I downloaded on my phone. Oh, that's another thing, which podcast to listen to and when? So many choices, so many options, but do I really have time for a deep dive? Have I truly let that one podcast build up to ten unlistened episodes? Yikes! When am I going to find the time to listen to that? Since my phone has a limited amount of storage, I tend to be a little more deliberate with what goes on it and I cycle what I download, so I have fewer choices. The net result is that I actually listen to what I put on the phone.

As I've stated in this blog before, I don't write about these things because I'm particularly good at them. I write about them because I'm particularly terrible at many things but want to do better. Thus, I'm trying my best to constrain those things that overwhelm me. Yes, I belong to a service that lets me download a free ebook every day. Many (perhaps most) of those books are "someday maybe" propositions that tend to crowd out the books that are actually immediately relevant. Therefore, I'm trying something different. Each week, I'm going through a particular category of expertise and/or criteria I need to understand or become more proficient with. I'm looking at this from a Marie Kondo approach. I'm looking at the resources I've collected, taking some time to categorize them into "immediately relevant", "relevant later", and "someday maybe". My goal is to locate the items that are immediately relevant and then focus on those for a short period of time.

In other words, I'm putting a physical constraint on the information I have and use, not to block out all of the resources I have, but to meaningfully work on the ones that can be most effective here and now. It's great that I have books that will help me master a particular technology, but if I'm just learning about it or trying to get beyond the Advanced Beginner stage, do I really need to deal with topics that relate to mastery at this stage? No. Yet just by their being there in my line of sight, I lose focus and my attention wanders. I also do something similar regarding other endeavors in my office. I have a lot of interests and it's tempting to have a variety of things out and ready to use. The net result, though, is that I dabble in lots of things and don't put any appreciable time into the areas that are most important. Frequently I end up dealing with what's urgent or pressing, and that's great for the moment, but it can leave me lacking in areas that are indeed important but aren't urgent.

I'm not sure if this is going to be helpful to anyone else, but it's currently helping me. Take some time to block out items you want to work on, that you need to work on and then think of the things that will directly help you meet those goals in the very near-term future. If they don't, don't delete them but perhaps put them in a place where you know they will come in handy later, and try to set a hard time for when "later" might be. If you can't do that, put them in the "someday maybe" container. The ability to pick and choose is wonderful, but sometimes, it helps a lot to limit what can be picked so that you actually make a choice and move forward with it :).

Monday, May 8, 2017

The Keyboard Challenge: 30 Days of Accessibility Testing

The Ministry of Testing has declared that May should be "30 Days of Accessibility Testing". As in the days of yore when I used to take on these challenges and blog regularly, I'm in the mood to get back to doing that. Therefore, I am looking to write a post every day around this topic and as a way to address each line of their checklist.

This is my second entry for May 8, since I was away for the previous weekend. This brings me up to "day 7":

7. Unplug your mouse, leave the touchpad alone, and navigate using the keyboard.

To tell the truth, this is one of the harder things to do on a computer in this current arc of evolution. Many sites are designed to use the mouse, and getting in and using the keyboard exclusively is challenging in many apps. I'm typing this entry with as little mouse usage as possible, trying my best to use just the keyboard to create and publish this post, as well as publicize it by launching a Tweet when I finish. That means that I have had to basically do away with many of the traditional modes of using the Blogger Editor. Many of the toolbar options are difficult to reach without using the mouse (there are keyboard analogs for several but not all of the options).

For consistency's sake, I found that it was easiest to just tab to the [Compose|HTML] toolbar and select HTML to do the active editing. It's a good approach if you remember your HTML formatting tags, but it's not a natural fit for a lot of the workflows that I actively use.

Tabbing over to get to the toolbars, I can enter the tags I want to use, but actually selecting the "Done" button requires tabbing through the entire tag list to get to the button. I'm sure there's a way to get to it quicker, but again, that requires me to dig into some knowledge base somewhere and learn exactly what that is. Frustrating? Believe it! Empathy building? Oh yes!!!

Tabbing to the Save button? Mission Accomplished. Preview? Successful. Umm, OK, how to get back to the editing tab? Umm... [alt]+[command]+[right arrow]... phew! OK, do I publish now? If so, the [Shift] key and seven tabs will get me to the publish button. Cool, now I'm on the Posts page. Select View... umm... tab over it and it disappears. How do I view it? Umm... dang it. Mouse over and click "View"... there it is. I admit I feel defeated at the moment. Still, a couple more tabs and I can highlight the Twitter icon at the bottom of the post. Hit Enter, type in some text, and tweet about this post. Done!

Well, that was quite a workout. It's also eye-opening to see just how much of what we do is optimized for point-and-click use, and how much additional knowledge and practice is required to be a keyboard ninja in many apps. Some standard keystrokes are familiar to everyone, and they tend to be standards that are adhered to, thankfully, but some more "obvious" operations are a bit more challenging to pull off. Needless to say, getting exhaustive with the keyboard goes well beyond this little blog post, but if the goal is to get the user to understand and empathize with how challenging it is to just use the keyboard to complete many workflows, it's abundantly clear to me at the moment that that is the case.
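If you want to make this exercise a bit more repeatable, here's a rough sketch of the same idea in Python with Selenium WebDriver: tab through a page and record what actually receives focus. The URL is a placeholder and a local geckodriver/Firefox setup is assumed; it only approximates what a real keyboard-only session feels like, but it surfaces unreachable controls quickly.

```python
# A rough sketch of the keyboard-only exercise, automated just enough to be
# repeatable: tab through a page and record what actually receives focus.
# The URL is a placeholder and a local geckodriver/Firefox setup is assumed.
from selenium import webdriver
from selenium.webdriver.common.keys import Keys

driver = webdriver.Firefox()
try:
    driver.get("http://localhost:8000/")  # placeholder URL

    active = driver.switch_to.active_element
    focus_order = []
    for _ in range(20):  # walk the first 20 tab stops
        active.send_keys(Keys.TAB)
        active = driver.switch_to.active_element
        focus_order.append((active.tag_name,
                            active.get_attribute("id") or "(no id)"))

    for tag, element_id in focus_order:
        print(tag, element_id)
finally:
    driver.quit()
```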

Now if you will all excuse me I'm going to go take a break. I'm tired ;).

Friday, August 14, 2015

Our Python "Esperanto Project"

Much of the time, my work environment is not pretty. It's not the elegant situations that are spelled out in books or in "best practices" guides. Often, there are things that would look convoluted to outsiders, that seem like strange and quirky paths to get to places and to accomplish tasks that seem, well, not at all ideal. Why do we use them? Because they work. More to the point, they have worked for years, and the thought of ripping out everything and starting anew would mean a tremendous loss.

I have been discussing with my daughter ways that we could get more hands-on together and create an environment that we can both use, both agree on, and both work in while understanding what each other is doing. In other words, we both decided we would put together our own "Esperanto Project" to help each other learn interesting tools, try out various frameworks, and have an excuse to apply the ideas we are soaking up here and there and put them into an environment we can both work with.

Currently, our project resides on a Trusty Tahr build of Ubuntu Linux. The agreed-upon language for what we are going to do is Python, mainly because my daughter and I are both at roughly the same skill level (somewhere between novices and advanced beginners). As a web framework, we are using Django, because, well, Python. Selenium WebDriver is installed, with the idea that test scripts will be written in, you guessed it, Python. For my own fun, I am adding JMeter, Kali Linux, and a few other tools to practice testing scenarios and particularly to exercise APIs, utilizing Python as the scripting engine. Finally, we are using PyDev as a plug-in to the Eclipse IDE because, hey, why not ;)?
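As a sketch of where we're starting, here's roughly what a first test in that environment could look like: Python's unittest driving Selenium WebDriver against the local Django dev server. The URL, the expected title text, and the use of Firefox are all placeholders for whatever we actually end up building.

```python
# A first "Esperanto" test, as a sketch only: Python's unittest driving
# Selenium WebDriver against the local Django dev server
# (python manage.py runserver). The URL, the expected title text, and the
# use of Firefox are placeholders for whatever we actually build.
import unittest

from selenium import webdriver


class HomePageTest(unittest.TestCase):
    def setUp(self):
        self.driver = webdriver.Firefox()

    def tearDown(self):
        self.driver.quit()

    def test_home_page_loads(self):
        self.driver.get("http://127.0.0.1:8000/")
        self.assertIn("Esperanto", self.driver.title)  # hypothetical title


if __name__ == "__main__":
    unittest.main()
```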

One of the reasons I want to do this is that I want to be able to not just play with tools, but also have a way to keep the things I learn and find in an environment that can follow me from place to place. Each company uses its own set of tools and languages, and it's likely that I will not be using Python at different jobs. That's OK, since the goal is to not necessarily do a direct port of what I do from one company to the next, but instead, get to a point where I am able to develop and test in an environment with a broad range of tools and become more familiar with all the possibilities, while also teaching my daughter how these tools are used. In turn, I'm hoping she will be able to teach me a thing or two later on down the road.

I joked with Kristoffer Nord yesterday via Twitter that his "Python for Testers" course would perhaps be an ideal jump start to this goal of ours. I'm looking at how I can make this into something interesting going forward, and I'd like to make regular updates to it and say where we are in the process. More to the point, I'd like to use it as a chance to ask for help here and there from the broader community, specifically the Pythonistas out there, because our goal is to use Python as the unifier of all the tools we pick, wherever we can.

It may work well, it may work terribly, but we won't know until we try :).

Friday, June 5, 2015

The Value of Mise en Place

I have to give credit for this idea to a number of sources, as they have all come together in the past few days and weeks to stand as a reminder of something that I think we all do without realizing it, and actually utilizing the power of this idea can be profound.

First off, what in the world is "mise en place"? It's a term that comes from the culinary world. Mise en place is French for "putting in place", or to set up for work. Professional chefs use this approach to organize the ingredients they will use during a regular workday or shift. I have a friend who has trained many years and has turned into an amazing chef, and I've witnessed him doing this. He's a whirlwind of motion, but that motion is very close-quartered. You might think that he is chaotic or frantic, but if you really pay attention, his movements are actually quite sparse, and everything he needs is right where he needs it, when he needs it. I asked him if this was something that came naturally to him, and he said "not on your life! It's taken me years to get this down, but because I do it every day, and because I do my best to stay in it every day, it helps me tremendously."

The second example of mise en place I witness on a regular basis is with my daughter and her art skills. She has spent the better part of the past four years dedicating several hours each day to drawing, often late into the evening. She has a sprawling setup that, again, looks chaotic and messy on the surface. If you were to sit down with her, though, and see what she actually does, she gathers the tools she needs, and from the time she puts herself into "go" mode, up to the point where she either completes her project or chooses to take a break, it seems as though she barely moves. She's gotten her system down so well that I honestly could not, from her body language, tell you what she is doing. I've told her I'd really love to record her at 10x speed just to see if I can comprehend how she puts together her work. For her, it's automatic, but it's automatic because she has spent close to half a decade polishing her skills.

Lately, I've been practicing the art of Native American beading, specifically items that use gourd stitch (a method of wrapping cylindrical items with beads and a net of thread passing through them). This is one of those processes that, try as hard as I might, I can't cram or speed up. Not without putting in time and practice. Experienced bead workers are much faster than I am, but that's OK. The process teaches me patience. It's "medicine" in the Native American tradition, that of a rhythmic task done over and over, in some cases tens of thousands of times for a large enough item. Through this process, I too am discovering how to set up my environment to allow me a minimum of movement, an efficiency of motion, and the option to let my mind wander and think. In the process, I wring out fresh efficiencies, make new discoveries, and get that much better and faster each day I practice.

As a software tester, I know the value of practice, but sometimes I lose sight of the tools that I should have at my beck and call. While testing should be free and unencumbered, there is no question that there are a few tools that can be immensely valuable. As such, I've realized that I also have a small collection of mise en place items that I use regularly. What are they?

- My Test Heuristics Cheat Sheet Coffee Cup (just a glance and an idea can be formed)
- A mindmap of James Bach's Heuristic Test Strategy Model I made a few years ago
- A handful of rapid access browser tools (Firebug, FireEyes, WAVE, Color Contrast Analyzer)
- A nicely appointed command line environment (screen, tmux, vim extensions, etc.)
- The Pomodairo app (used to keep me in the zone for a set period of time, but I can control just how much)
- My graduated notes system (Stickies, Notes, Socialtext, Blog) that lets me see which of the items I learn will really stand the test of time.

I haven't included coding or testing tools, but if you catch me on a given day, those will include some kind of Selenium environment (either my company's or my own sandboxes, to get used to using other bindings), JMeter, Metasploit, Kali Linux, and a few other items I'll play around with and, as time goes on, aim to add to my full-time mise en place.

A suggestion that I've found very helpful is attributed to Avdi Grimm (who may have borrowed it from someone else, but he's the one I heard say it). There comes a time when you realize that there is far too much out there to learn it all proficiently and effectively, let alone be good at everything. By necessity, we have to pick and choose, and our actions set all that in motion. We get good at what we put our time into, and sifting through the goals that are nice, the goals that are important, and the goals that are essential is necessary work. Determining the tools that will help us get there is also necessary. It's better to be good at a handful of things we use often than to spend large amounts of time learning esoteric things we will use very rarely. Of course, growth comes from stretching into areas we don't know, but finding the core areas that are essential, and working hard to get good in those areas, whatever they may be, makes the journey much more pleasant, if not truly any easier.

Friday, March 6, 2015

Taming Your E-Mail Dragon

Over on Uncharted Waters, I wrote a post about out-of-control E-mail titled "Is Your Killer App Killing You?" That may be a bit of hyperbole, but there is no question that E-mail can be exhausting, demoralizing, and just really hard to manage.

One capability that I think is really needed, and would make E-mail much more effective, is some way to extend messages so they automatically start new processes. Some of this can be done at a fairly simple level. Most of the time, though, what ends up happening is that I get an email, or a string of emails, I copy the relevant details, and then I paste them somewhere else (a calendar, a wiki, some document, Slack, a blog post, etc.). What is missing, and what I think would be extremely helpful, would be a way to register key applications with your email provider, whoever that may be, and then have key commands or right-click options that would let you take that message, choose what you want to do with it, and then move to that next action.

Some examples... if you get a message and someone writes that they'd like to get together at 3:00 p.m., having the ability to right there schedule an appointment and lock the details of the message in place seems like it would be simple (note the choice of words, I said it seems it would be simple, I'm not saying it would be easy ;) ). If a message includes a dollar amount, it would be awesome to be able to right click or key command so that I could record the transaction in my financial software or create an invoice (either would be legitimate choices, I'd think).
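As a toy illustration of that "message to next action" idea, here's a small Python sketch that scans an email body for a time like "3:00 p.m." and emits a bare-bones iCalendar event a calendar app could import. The sample text, the one-hour duration, and the "tomorrow" assumption are all invented for the example; a real integration would live inside the mail client, not a standalone script.

```python
# A toy sketch of the "message to next action" idea: scan an email body for
# a time like "3:00 p.m." and emit a bare-bones iCalendar event a calendar
# app could import. The sample text, the one-hour duration, and the
# "tomorrow" assumption are all invented for the example.
import re
from datetime import datetime, timedelta

body = "Hey, can we get together at 3:00 p.m. tomorrow to go over the plan?"

match = re.search(r"(\d{1,2}):(\d{2})\s*(a\.?m\.?|p\.?m\.?)", body, re.I)
if match:
    hour, minute = int(match.group(1)), int(match.group(2))
    if match.group(3).lower().startswith("p") and hour != 12:
        hour += 12
    start = (datetime.now() + timedelta(days=1)).replace(
        hour=hour, minute=minute, second=0, microsecond=0)
    end = start + timedelta(hours=1)
    fmt = "%Y%m%dT%H%M%S"
    print("BEGIN:VEVENT")
    print(f"DTSTART:{start.strftime(fmt)}")
    print(f"DTEND:{end.strftime(fmt)}")
    print("SUMMARY:Meeting from email")
    print("END:VEVENT")
```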

Another option that I didn't mention in the original piece, but that I have found to be somewhat helpful, is to utilize tools that will allow you to aggregate messages that you can review later. For me, there are three levels of email detail that I find myself dealing with.

1. E-mail I genuinely could not care any less about, but that doesn't rise to the level of outright SPAM.

I am unsentimental. Lots of stuff from sites I use regularly comes to my inbox and I genuinely do not want to see it. My general habit is to delete it without even opening it. If you find yourself doing this over and over again, just unsubscribe and be done with it. If the site in question doesn't give you a clear option for that, then make rules that will delete those messages so you don't have to (I'll sketch one way to script that below, after this list). So far, I've yet to find myself saying "aww, man, I really wish I'd seen that offer that I missed, even though I deleted the previous two hundred that landed in my inbox." Cut them loose and free your mind. It's easy :).

2. Emails with a personal connection that matter enough for me to review and consider them, but that I may well not actually do anything with. Still, much of the time, I probably will.

These are the messages I let drop into my inbox, usually to be subject to various filter rules and to get sorted into the buckets I want to deal with, but I want to see them and not let them sit around.

3. That stuff that falls between #1 and #2.

For these messages, I am currently using an app called Unroll.me. It's a pretty basic tool in that it creates a folder in my IMAP account (called Unroll.Me), and any email that I have decided to "roll up" and look at later goes into this app, and this folder. There are some other features the app offers, such as unsubscribing (if the service's API is set up to do that), including a sender in the roll up, or leaving messages in your Inbox. Each day, I get a message that tells me what has landed in my roll up, and I can review each of them at that point in time.

I will note that this is not a perfect solution. The Unsubscribe works quite well, and the push to Inbox also has no problems. It's the Roll up step that requires a slight change in thinking. If you have hundreds of messages each day landing in the roll up, IMO, you're doing it wrong. The problem with having the roll up collect too many messages is that it becomes easy to put off, or deal with another day, which causes the backlog to grow ever larger, and in this case, out of sight definitely means out of mind. To get the best benefit, I'd suggest a daily read and a weekly manage, where you can decide which items should be unsubscribed, which should remain in the roll up, and which should just go straight to your inbox.
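Coming back to the "make rules that delete those messages" idea from level 1 above, here's a hedged sketch using Python's standard imaplib. The server name, credentials, and sender list are placeholders; in practice, server-side filters are usually the better home for this, but the shape of the idea is the same.

```python
# A hedged sketch of the "make rules that delete those messages" idea,
# using Python's standard imaplib. The server, credentials, and sender
# list are placeholders; server-side filters are usually the better home
# for this, but the shape of the idea is the same.
import imaplib

UNWANTED_SENDERS = ["deals@example.com", "newsletter@example.com"]

with imaplib.IMAP4_SSL("imap.example.com") as mail:  # placeholder server
    mail.login("me@example.com", "app-password")     # placeholder credentials
    mail.select("INBOX")
    for sender in UNWANTED_SENDERS:
        status, data = mail.search(None, "FROM", f'"{sender}"')
        if status == "OK" and data[0]:
            for num in data[0].split():
                mail.store(num, "+FLAGS", "\\Deleted")
    mail.expunge()
```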

In any event, I know that E-mail can suck the joy out of a person, and frankly, that's just no way to live. If you find yourself buried in E-mail, check out the Uncharted Waters article, give Unroll.me a try, or better yet, sound off below with what you use to manage the beast that is out of control email. As I said in the original Uncharted Waters post, I am genuinely interested in ways to tame this monster, so let me know what you do.

Friday, February 20, 2015

If You Have $20, You Can Have a Standing Desk

I've had a back and forth involvement with standing desks over the years. I've made cheap options, and had expensive options purchased for me. I've made a standing desk out of a treadmill, but it has been best for passive actions. It's great for reading or watching videos, not so great for actually typing, plus it was limiting as to what I could put on it (no multi monitor setup). Additionally, I've not wanted to make a system that would be too permanent, since I like the option of being flexible and moving things around.

I'm back to a standing desk solution for both home and work because I'm once again dealing with recurring back pain. No question, the best motivation for getting back to standing while working is back pain. It's also a nice way to kick start one's focus and to get involved in burning a few more calories each year. I'll give a shout to Ben Greenfield, otherwise known as "Get Fit Guy" at quickanddirtytips.com for a piece of information that made me smile a bit. He recommends a "sit only to eat" philosophy. In other words, with the exception of having to drive or fly, my goal should be to sit only when I eat. All other times, I should aim to stand, kneel, lie down or do anything but sit slumped in a chair. The piece of data that intrigued me was that, by doing this, I'd be able to burn 100 additional calories each day. In a week's time, that's the equivalent of running a 10K.

For those who have access to an IKEA, or can order online, at this moment, the Ikea LACK square side table, in black or white, can be purchased for $7.99 each. I chose to do exactly that, and thus, for less than $20, including tax, I have set up this very functional standing desk at work:




The upturned wastebasket is an essential piece of this arrangement. It allows me to shift my weight from leg to leg, give different muscles in my back some work, and rest other areas from time to time. I've also set up a similar arrangement at home, but added an EKBY ALEX shelf (I'll update with a picture when I get home :) ). This gives me a little extra height and some additional storage for small items. The true beauty of this system is that it can be broken down quickly and moved anywhere with very little effort, and it is much less expensive than comparable systems I have seen. If you'd like to make something a little more customized, with a pair of shelf brackets and a matching 48" shelf, you can make a keyboard tray, though for me personally, the table height works perfectly.

What I find most beneficial about a standing desk, outside of the relief from back pain, is the fact that it is incredibly focusing. When I sit down, it's easy to get into passive moments and lose track of time reading stuff or just passively looking at things. When I stand, there is no such thing as "passive time", and it really helps me get into the zone and flow of what I need to do. For those looking to do something similar, seriously, this is a great and very inexpensive way to set up a standing desk.

Wednesday, November 19, 2014

From TV to Social to Accessible: "Meandering" Through 23 Years of Software Testing

An interesting milestone just happened today, and I wasn't really even aware of it. I came in to work this morning and received a nice message from a co-worker in another division wishing me a Happy 2nd Anniversary with Socialtext.

Wow, two years?!

It seems like so much time in some ways, and it seems like so little as well. I remember first getting here, being nicely thrown into the deep end of our setup and getting my development environment running, and acclimating to life in a testing team after so many years of going it alone. Within those two years, we have seen changes, we've seen ups and downs, we've grown and contracted, we've realigned and integrated, and I've had my chance to learn a few new tricks.

Looking back, I realized that I hadn't updated my Meandering posts for the last couple of years. This is meant to be a chronicle of my career path and the many places it has taken me. It also seems appropriate to mention a few other things that are taking place in my reality that might explain the limited postings lately here on my blog. For those curious about both, please read on.

As we last spoke, I was working with a young and energetic Agile team at Sidereel.com. The work I was doing introduced me to performing automation in a way that was unique to a front-facing role, while still allowing me to delve into writing automation for myself and my efforts. The code was treated like development code, checked into Git, distributed with each release, and expected to run and pass before we went forward with deployments. It was an enjoyable experience and gave me a chance to dive into Ruby and Rails more than I had previously. I dedicated much of my free time to absorbing Zed Shaw's book "Learn Ruby the Hard Way" and putting those skills into practice. Any book on Cucumber, BDD, ATDD, Ruby or automation I could get my hands on was fair game, and I devoured as many of them as came my way.

Part of me was hoping that this would light a spark in me to become more enthusiastic about writing code and taking on more coding responsibilities. Well, the latter definitely proved to be true, and I will say that I found myself learning quite a few neat tricks around object-oriented programming. I also picked up some understanding of modern web design and full stack development. However, that boost in joy and confidence and a surging love for programming, that didn't quite happen. In some ways, I still look at writing code the same way I do about cleaning out the garage. It's a necessary task at times, but not something I want to do every day. Actually, to be honest, I like cleaning the garage a little more than that.

One indelible change in my life happened while I was at Sidereel, and it's one whose after effects I will feel for the rest of my life. On August 29, 2011, while I was riding my skateboard from work to the train station, I hit a crack in the road, and I was sent flying. The landing was bad. I broke my right tibia near my ankle, and snapped my right fibula up near the knee. The resulting surgery, and the plate put in my leg to allow the tibia to knit back together, took me off my feet for a month, and made it necessary for me to work from home full time for six weeks, until I could get back to walking far enough to make the trek from the train station to the office. The resulting experience showed me that I could be effective both in person and even when I was on my own at home for an extended period. At the same time, it made me start to wonder how much a part of the team I actually was, if I could do so much of my work outside of the core scrum team.

As many testers have probably seen, integrating with an Agile team is often an unusual dance. I would have to say that Sidereel did a good job of balancing the programming requirements and fulfilling the goals and objectives that make for an Agile team, but more times than not, I felt like my being an embedded tester was a secondary thing. The Programming team was Agile, but I was a Waterfall tester. To be fair, I do not think that was at all the intention, but that's just how it worked out. Often I found myself struggling to understand what issues really mattered, what areas belonged to whom, and how I could communicate effectively. If I asked too many questions, I was a disruption. If I asked too few questions, I was aloof. More to the point, I found that I didn't have anyone else I could really talk to if I had questions about my methodology and approach, and how I could work more effectively with the team. Over time, this became a wedge. I didn't want to see it then, but I can now appreciate the fact that my desire to insert myself inside of the team early in the process was what I wanted to do, but it wasn't what they wanted. My view of an Agile tester and the team's view were different, and I didn't appreciate that fact until later.

As a lone tester on such a massive product with so many moving parts and interactions, I had my share of misses and "whoops" moments. There were times when obvious "why didn't you catch this?" situations happened regularly enough that I just didn't want to talk about them. In my mind, just keeping up, doing my job, and being careful were the essential elements to being successful. However, that slowed me down, and I was perceived to be the bottleneck. For many of those situations, I think that perception was absolutely correct.

One of the most frustrating situations to find yourself in is to have one-on-ones with your director and to hear "you are doing a good job, but...". As the months progressed, I felt like the "but" in that line was becoming more and more pronounced. You missed this issue. You are taking up too much of the developers' time. You need to be more independent. You need to focus on deeper and more critical bugs. Every one of these was true, don't get me wrong, and every time I sought to do exactly that; yet every time we came back around, it felt like the same conversation. My friend Matt Heusser calls this situation the "I want a rock" problem. The idea is that someone asks for a rock. When you bring them a rock, they say "that's not the rock I want, I want a different rock". Over time, as we keep bringing different rocks, it becomes clear that the real issue is not the rock being presented, it's who is doing the presenting. It was becoming clear to me that being a Lone Tester in this environment was perhaps too much for me. Perhaps if we had all been clearer on the expectations from the outset, or how we all wanted to work together, the outcome could have been different. Regardless, I found myself in a situation where there was mutual frustration. The programmers weren't happy, and neither was I.

I sent out a message to a few of my friends, saying I was feeling frustrated, that perhaps my role as a Lone Tester was burning me out, or that I was possibly not the best fit for Sidereel at this stage of the game. In short, I wanted to explore other options. I was open to managing a team, or mentoring other testers, or doing something with an organization where my only stipulation was "I don't want to be a Lone Tester this time around". Shortly after I sent that message, my friend Ken Pier contacted me and asked if I'd be interested in joining his team at Socialtext. Ken had met me two years before through Matt. He'd been to several conferences and meetups where I had participated and been a correspondent. He'd attended a few of my talks, including my "Balancing ATDD, GUI and Exploratory Testing" presentation. He had also read my blog for the past couple of years. In short, Ken knew what I could bring to the table, and what I could not. He explained that he was building a team of seasoned testers; he wanted people on his team who really understood software testing, and he proved to me that he understood it too, as well as what it can and cannot do. Through further conversations with him, and some casual conversations with some of his co-workers, I made the decision to hang up the Lone Tester mantle and focus on being a team tester once again.

Over the past two years, I have worked with a team of awesome people. They introduced me to a Kanban approach to software development and had me get intimately involved again in automation, albeit in a way that was totally different from anything I did at Sidereel. I was also asked if I'd be willing to take on a special initiative. We had performed an accessibility audit and had discovered many areas where we needed to make improvements, and the testing needs for that would require someone willing to spend a fair amount of time with screen readers and dictation software, among other things. I decided that, yes, I would be interested in doing that. I had no idea at the time that what I'd be asked to do would have such a fundamental effect on what I do, but in the ensuing two years, accessibility has become one of my core competencies, and the language of and advocacy for accessibility features would come to permeate so much of my world view. Prior to this, I had next to no understanding of what Accessibility really meant. Now, I'm the go-to person.

Another thing that was also very helpful is the fact that Ken (my director) is a member of the Association for Software Testing. He understands my involvement as a board member and as an instructor, and has actively encouraged that the initiatives we have championed through AST be used here at Socialtext. I've also been able to use Socialtext as an application for various testing challenges, such as Weekend Testing and the Software Testing World Cup. That feedback has helped us make a better product, and that connection to a broader community of testers has been a long-term commitment of Socialtext as well (did I mention that Matt worked here before I did? If I didn't, let me fix that. Matt worked at Socialtext before I did, so he plowed a lot of ground that allowed me and the rest of my team to reap the harvest he'd sown years before).

This year, I was selected to be the President of AST, with the full backing of my team at Socialtext. Being able to be the advocate for testing that I see myself being, as well as doing work at a company that is fun, engaging and interesting, with a team of people with as much experience as I have, if not more, and the opportunity to mentor younger testers when we get the opportunity... it's been a lot of the reason these past two years feel like they have gone by so fast.

So what pearls of wisdom can I share since my last Meander?

-- it's important to understand how your team works. More important is understanding how your team wants you to work with them. It may seem trite or silly to say "make sure you understand what your programmers want from you as a tester", but it's not trite at all. They are your customers; you provide a service to them. It is in your best interest to understand what makes them comfortable and helps make for a mutually effective relationship.

-- you cannot test quality in, you can only tell people where issues may be. This keeps getting hammered into me everywhere I work, but I think it bears repeating. At the end of the day, I do not make the choices as to what ships and what doesn't. I can be overruled, and I need to be OK with that. If my information helped them reach a decision, then that is good enough.

-- being a lone gun is hard, and it can be lonely. Moving to a team does not automatically fix that. A team has to engage in a different way, but it also requires a meshing of personalities and work styles. I came in with an idea of "I know testing, I'll be able to swing everyone around to my way of thinking in no time." Truth is, I learned a whole lot more from my team than I thought I would, and realized that I didn't have all the answers. Not by a long shot! With a team of seasoned veterans (of which I am one of the younger ones, I might add ;) ), we've melded our skills and abilities quite well.

-- be willing to take on a "problem child" issue when it is presented to you. When the opportunity came up to work on accessibility issues, no one else was enthusiastic about working on it. I could say "ignorance is bliss" because I said "sure, I'll do that" with the attitude of "how hard could it be?". Well, when you find yourself surrounded by machines that are rapidly talking at you, or you find yourself plugging in a headset mike to "talk to" your computer on a daily basis, it can get irritating. Still, working through the irritations can help you develop unique proficiencies, and then being the person people can rely upon to consult on issues around that area makes up for a lot of annoyances, and now, I actually like tinkering with the accessibility tools.

-- automation is an ongoing need, and more and more businesses are utilizing it to get the repetitive stuff off of their plates. Do not fight this; in fact, do your best to become part of the flow. Whether that means you actually program, or consult with those who do, put yourself into the process. What you learn may have a large impact on the amount of time you spend doing repetitive, tedious busywork. If you can automate that busywork, then you save your eyes and your energy for much more interesting territory to explore :).

My thanks again to Socialtext for what has thus far been a very rewarding and engaging two years. I look forward to what tomorrow brings!

Thursday, August 7, 2014

Stepping Back, Taking a Breath, Letting Go, and Saying "NO"

Many will, no doubt, notice that my contributions to this blog have been spotty the past few months. There's a very specific reason.

A few months back, I did an experiment. I decided to sit down and really see how long it took me to do certain things. I've been reading a lot lately about the myth of multi-tasking (as in, we humans cannot really do it, no matter what we may think to the contrary). I'd been noticing that a lot of my email conversations started to have a familiar theme to them: "yeah, I know I said I'd do that, and I'm sorry I'm behind, but I'll get to that right away".

Honestly, I meant that each and every time I wrote it, but I realized that I had done something I am far too prone to do. I too frequently say "yes" to things that sound like fun, sound like an adventure, or otherwise would interest and engage me. In the boundless optimism of the moment, I say "sure" to those opportunities, knowing in the back of my mind there's going to be a time cost, but it's really fuzzy, and I couldn't quantify it in a meaningful enough way to guard myself.

I decided I needed to do something specific. I purchased a 365-day calendar (the kind with tear-off pages for each day), and I took all of the dates from January through May (May being the month when I did this). I wrote down, on each sheet, something I said or promised someone I would do. Some of them were trivial, some were more involved, some were big-ticket items like researching an entire series of blog posts or working through a full course of study for a programming language. As I started jotting them down, I realized that each time I wrote one down, another one popped into my head, and I dutifully wrote that one down too, and another, and another, until I had mostly used up the sheets of paper.

WOW!!!

I came to the conclusion that I would have to do some drastic time management to actually get through all of these, and part of that was to find out where I actually spent my time and how much time it took to actually complete these tasks. I also told myself that, until I got through a bunch of these, I was going to curtail my blog writing until they were done. I've often used my blog as a "healthy procrastination", but I decided that, unless I was discussing something time sensitive or I was at an event, the blog would have to take a back seat. That's the long and short of why I have written so little these past three months.

In addition, I came to a realization that matched a lot of what I had been reading about multi-tasking and effectively transitioning from one task to another. For every two tasks I tried to accomplish at the same time, I came to see that the turnaround time to get them done was about four hours above and beyond what it would have taken to do those tasks individually. That was in the absolute best-case scenario, with me firing on all cylinders and in "hot mode" brain-wise. As I've said in the past, to borrow from James Bach, my brain is not like a well-oiled machine. Instead, it's very much like an unruly tiger. I can have all the desire in the world, and all the incentives to want to get something done, but unless "the tiger" was in the mood, it just wasn't going to be a product I, or anyone else, would be happy with.

The stimuli that had the best effect were an absolute drop-dead date and another person in need of what I was doing to make it happen. Even then, I found myself delivering so close to the drop-dead date that it was making both me and the people I was collaborating with anxious.

Frankly, that's just no way to live!

Next week is CAST. I am excited about the talk I am delivering. It's about mentoring using a method called Coyote Teaching, and the rich (but often expensive) way in which it allows for not just transfer of skills, but also truly effective understanding. In the process of writing and working on this talk with my co-presenter, Harrison Lovell, I decided to use it on myself, a little bit of "Physician, heal thyself". I came to realize that my expectational debt was growing out of control again. In the effort to try to please everyone, I was pleasing no one, least of all myself. Additionally, I have been looking at what the next year or so is shaping up to look like, where my time and energy are going to be needed, and I came to the stark realization that I really had to cut back my time and attention for a variety of things that, while they sounded great on the surface, were just going to take up too much time for me to be effective.

I've already conversed with several people and started the process of tying up and winding down some things. I want to be good to my word, but I have to be clear as to what I can really do and what time I actually have to do those things. Time and attention are finite. We really cannot make or delay time. No one has yet made the magic device from "The Girl, The Gold Watch and Everything", and time travel is not yet possible. That means that all I can do is use the precious 24 hours I get granted each day to meet the objectives that really matter. That means I really and honestly have to exercise the muscles that control the answer "NO" much more often than I am comfortable with doing. I have to remind myself that I would rather do fewer things really well than do a lot of things in a mediocre or poor fashion.

I am appreciative of those who have willingly and understandingly helped by stepping in and taking over areas that I needed to step back from. Others will follow, to be certain. For the most part, though, people are actually OK with it when you say "NO". It's far better than saying "YES" and having that yes disappear into a black hole of time, needing consistent prodding and poking to bring it back to the surface.

I still have some things to deliver, and once they are delivered, I'm going to tie off the loose ends and move on where I can, hand off what I must, and focus on the areas that are the most important (of which I realize, that list can change daily). Here's looking to a little less cluttered, but hopefully more focused and effective few months ahead, and what I hope is also a more regular blog posting schedule ;).

Friday, February 14, 2014

Book Review: The Modern Web

I am zeroing in on clearing out my backlog of books that came with me on my flight to Florida. I have a few more to get through, some decidedly "retro" by now, and a few that some might find amusing. No Starch Press publishes the "Manga Guide to..." series, and I have three titles that I'm working through related to Databases, Statistics, and Physics. Consider these the "domain knowledge in a nutshell" books; I'll be posting reviews of them in a couple of weeks. With that out of the way ;)...

The web has become a rather fragmented beast these past twenty-some-odd years. Once upon a time, it was simple. Well, relatively simple. Three-tiered architecture was the norm, HTML handled the blocking, some frames could make for structure, and a handful of CGI scripts would give you some interactivity. Add a little JavaScript for eye candy and you were good.

Now? There’s a different flavor of web framework for any given day of the week, and then some. JavaScript has grown to the point where we don’t even really talk about it, unless it’s to refer to the particular library we are using (jQuery? Backbone? Ember? Angular? All of the above?). CSS and HTML have blended, and the simple structure of old has given way to a myriad of tagging, style references, script references, and other techniques to manage the mish-mash of parts that make up what you see on your screen. Oh yeah, lest we forget, “what you see on your screen” has also taken on a whole new meaning. It used to mean computer screen. Now it’s computer, tablet, embedded screen, mobile phone, and a variety of other devices with sizes and shapes we were only dreaming about two decades ago.

Imagine yourself a person wanting to create a site today. I don’t mean going to one of those all-in-one site hosting shops and turning the crank on their template library (though there’s nothing wrong with that), I mean the “start from bare metal, roll your own, make a site from scratch” kind of thing. With the dizzying array of options out there, what’s an aspiring web developer to do?

Peter Gasston (author of "The Book of CSS3”) has effectively asked the same questions, and his answer is “The Modern Web”. Peter starts with the premise that the days of making a site for just the desktop are long gone. Any site that doesn’t consider mobile as an alternate platform (and, truth be told, for many people their only platform) is going to miss out on a lot of people. Therefore, the multi-platform (device-agnostic) ideal is set up front, and the explanations of the options available take that mobile-inclusive model into account. Each chapter looks at a broad array of possible options and available tools, and provides a survey of what they can do. Each chapter ends with a Further Reading section that will take you to a variety of sites and reference points to help you wrap your head around all of these details.

So what does “The Modern Web” have to say for itself?

Chapter 1 describes the Web Platform, sets the stage, and talks a bit about the realities that have led us to what I described in the opening paragraphs. It’s a primer for the ideas that will be covered in the rest of the book. Gasston encourages the idea of the "web platform” containing all of the building blocks to be covered, including HTML5, CSS3, and JavaScript. He also encourages the reader to keep up to date on the developments of browsers, what they are doing, what they are not doing, and what they have stopped doing. Gasston also says “test, test, and then test again”, which is a message I can wholeheartedly appreciate.

Chapter 2 is about Structure and Semantics, or to put a finer point on it, the semantic options now available to structure documents using HTML5. One of them has become a steady companion of late, and that’s the Web Accessibility Initiative's Accessible Rich Internet Applications, or WAI-ARIA (usually shortened to ARIA by yours truly). If you have ever wanted to understand Accessibility and the broader Section 508 standard, and what you can do to enable this, ARIA tags are a must. The ability to segment the structure of documents based on content and platform means that we spend less time trying to shoehorn our sites into specific platforms, and instead make a ubiquitous platform that can be accessed depending on the device, and create the content to reside in that framework.
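Since ARIA keeps coming up in my own testing work, here's a hedged, tester-oriented sketch of the kind of quick check it enables, in Python with Selenium WebDriver: see whether a page exposes the basic landmark roles assistive technology leans on. The URL is a placeholder, a local geckodriver/Firefox setup is assumed, and the check only looks for explicit role attributes (HTML5 elements like main and nav imply roles on their own), so treat it as a smoke test rather than an accessibility audit.

```python
# A tester-oriented sketch tied to the ARIA discussion: check that a page
# exposes the basic landmark roles assistive technology leans on. The URL
# is a placeholder, a local geckodriver/Firefox setup is assumed, and the
# check only finds explicit role attributes (HTML5 elements like <main>
# imply roles on their own), so treat it as a smoke test, not an audit.
from selenium import webdriver
from selenium.webdriver.common.by import By

EXPECTED_LANDMARKS = ["banner", "navigation", "main", "contentinfo"]

driver = webdriver.Firefox()
try:
    driver.get("http://localhost:8000/")  # placeholder URL
    for role in EXPECTED_LANDMARKS:
        found = driver.find_elements(By.CSS_SELECTOR, f'[role="{role}"]')
        print(f'role="{role}":', "present" if found else "MISSING")
finally:
    driver.quit()
```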


Chapter 3 talks about Device Responsive CSS, and at the heart of that is the ability to perform “media queries”. What that means is, “tell me what device I am on, and I’ll tell you the best way to display the data.” This is a mostly theoretical chapter, showing what could happen with a variety of devices and leveraging options like mobile-first design.

Chapter 4 discusses New Approaches to CSS Layouts, including how to set up multi column layouts, taking a look at the Flexbox tool, and the way it structures content, and leveraging the Grid layout so familiar to professional print publishing (defining what’s a space, where the space is, and how to allocate content to a particular space). 

Chapter 5 brings us to the current (as of the book's writing) state of JavaScript, noting that today’s JavaScript has exploded with available libraries (Gasston uses the term “Cambrian” to describe the proliferation and fragmentation of JavaScript libraries and capabilities). Libraries can be immensely useful, but be warned, they often come at a price, typically in the performance of your site or app. However, there is a benefit to having a lot of capabilities and features that can be referenced under one roof.

Chapter 6 covers device APIs that are now available to web developers thanks to HTML5 and friends: options such as Geolocation, utilizing Web Storage, using utilities like drag and drop, accessing the device's camera and manipulating the images captured, connecting to external sites and apps, and so on. Again, this is a broad survey, not a detailed breakdown. Explore the further reading if any of these items is interesting to you.

Chapter 7 looks at Images and Graphics, specifically Scalable Vector Graphics (SVG) and the canvas option in HTML5. While JPEGs, PNGs, and GIFs are certainly still used, these newer techniques allow for the ability to draw vector and bitmap graphics dynamically. Each has its uses, and the chapter includes some sample code snippets to demonstrate them in action.

Chapter 8 is dedicated to forms, or more to the point, to the ways that forms can take advantage of the new HTML5 options to help drive rich web applications. A variety of new input options exist to leverage phone and tablet interfaces, where the input type (search box, URL, phone number, etc.) determines in advance what input options are needed and what to display to the user. The ability to auto-display choices to a user based on a data list is shown, as are a variety of input options, such as sliders for numerical values and spin-wheels for choosing dates; these and other aspects familiar to mobile users can now be called up by assigning their attributes to forms and applications. One of the nicer HTML5 options related to forms is that we can now create client-side form validation; whereas before we needed to rely on secondary JavaScript, now it’s just part of the form field declarations (cool!).

Chapter 9 looks at how HTML5 handles multimedia directly using the audio and video tags, and the options to allow the user to display a variety of players, controls and options, as well as to utilize a variety of audio and video formats. Options like subtitles can be added, as well as captions displayed at key points (think of those little pop-ups in YouTube; yep, those). There are several formats, and of course, not all are compatible with all browsers, so the ability to pick and choose, or use a system’s default, adds to the robustness of the options (and also adds to the complexity of providing video and audio data natively via the browser).

Chapter 10 looks at the difference between a general web site and a mobile site, and the processes used to package a true “web app” that can be accessed and downloaded from a web marketplace like the Google store. In addition, options like PhoneGap, which allows for a greater level of integration with a particular device, and AppCache, which lets a user store data on their device so they can use the app offline, get some coverage and examples.
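The AppCache piece is about as small as web features get; roughly like this (my sketch, with hypothetical file names; worth noting that AppCache has since been deprecated in favor of Service Workers, but it was the tool of this book’s era):

```html
<!-- index.html points at a cache manifest; the files listed there stay usable offline -->
<!DOCTYPE html>
<html manifest="app.appcache">
<!--
  app.appcache is a separate plain-text file that looks roughly like:

  CACHE MANIFEST
  # v1
  index.html
  styles.css
  app.js
-->
```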

Chapter 11 can be seen as an Epilogue to the book as a whole, in that it is a look to the future and some areas that are still baking, but may well become available in the not too distant future. Web Components allow blocks of markup and behavior to be reused and enhanced while staying insulated from the page’s standard CSS and JavaScript. CSS is also undergoing some changes, with Regions and Exclusions allowing more customizable layout options. A lot of this is still in the works, but some of it is available now. Check the Further Reading sections to see what is available and how far along it is.
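Web Components did eventually land, though the syntax settled differently from the drafts the book describes; a minimal custom element in today’s form looks something like this (my own sketch, not the book’s code):

```html
<script>
  // A custom element with its own shadow DOM, so its styles stay scoped to it
  class HelloCard extends HTMLElement {
    connectedCallback() {
      const shadow = this.attachShadow({ mode: "open" });
      shadow.innerHTML =
        "<style>p { color: steelblue; }</style>" +
        "<p>Hello from inside a component.</p>";
    }
  }
  customElements.define("hello-card", HelloCard);
</script>
<hello-card></hello-card>
```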

The book ends with two appendices. Appendix A covers browser support for each of the sections in the book, while Appendix B is a gathering of chapter-by-chapter Further Reading links and sources.

Bottom Line:


The so-called Modern Web is a mish-mash of technologies, standards, practices, and options that overlap and cover a lot of areas. There is a lot of detail crammed into this one book, and there’s a fair amount of tinkering to be done to see what works and how. Each section has a variety of examples and ways to see just what the page/site/app is doing. For the web developer who already has a handle on these technologies, this will be a good reference-style book to examine, with pointers to further details in the Further Reading (really, there’s a lot of “Further Reading” that can be done!).

The beginning web programmer may feel a bit lost in some of this, but with time and practice with each option, it becomes more comfortable. It’s not meant to be a how-to book, but more of a survey course, with some specific examples spelled out here and there. I do think this book has a special niche that can benefit from it directly, and I’m lucky to be part of that group. Software testers, if you’d like a book that covers a wide array of “futuristic” web tech, its positives and negatives, and the potential pitfalls that would be of great value to a software tester, this is a wonderful addition to your library. It’s certainly been a nice addition to mine :).

Sunday, January 26, 2014

Retro Book Review: Rethinking Expertise

When I was in Sweden, back in November 2013, I sat down for a talk with James Bach, and we got into a discussion about how to categorize the ideas and thoughts that inform our testing and the key features we’ve brought to our respective games. He said he noticed that I’d read a great deal, and that he was curious as to what I was reading and why. As I was explaining the books I was reading, and why I found them valuable, he kept coming around to the question “that’s great, but how have you applied this to your testing?”, and I realized that, frankly, I was struggling to answer that. For the life of me, I couldn’t explain why I was having trouble. I knew the books were helpful, and I knew that I was taking ideas left and right and applying them, so why was I struggling with this so much?

James pointed out that one of my challenges was that I was missing some key ideas about how to position my own expertise and what I actually knew, versus what I thought I knew. He also counseled me that, because I had worked in certain domains, and had done so for some time, I might be equating experiential expertise with contributory expertise, and that the two, though we might want to believe they are equivalent, really aren’t. With that, James handed me “Rethinking Expertise” by Harry Collins and Robert Evans and said, “I think you might find this book helpful.”

For those with a short attention span, I’ll save you a bunch of time… yes, indeed, it was! If you want to take a deep dive into the ideas, ideals, and avenues of expertise, what makes an expert an “expert”, and how fluid and fraught with controversy that term actually is, this book is a must-read.

For those with a greater tolerance for my wordy reviews, let’s start with the centerpiece of the book, which is the Periodic Table of Expertises. This is going to be a short description of the table, not a full breakdown of every element; the explanation of the table and the elements in it takes up over 50% of the book.

We start at the highest level with Ubiquitous Expertise. This is the stuff each of us does automatically. When we walk, listen, speak a language, ride a bicycle, swim, or perform a physical task like reading, writing, or using a computer,  if we take the time to try to explain what we are doing, exactly what we are doing, we may well find that words fail us. Why? Because these are skills we take for granted, i.e. they are ubiquitous expertises. They are part of our “bare metal” programming, of sorts. 

As we step further away from the ubiquitous aspects of expertise, we get into the areas where we have some control over how we describe them, because they are actively learned and nurtured in a way that we understand active learning and nurturing. Facts, figures, trivia, arcana, minutiae, story plots, and other gained “chops” in a given area all fall into this sphere. These experiences can be split up into two areas (Ubiquitous Tacit Knowledge and Specialist Tacit Knowledge). It’s in the tacit knowledge areas that most of us “think we know what we know”, but may still have trouble verbalizing the depth or significance of what we know. This exists on a continuum: 

  • simple fact acquisition (think of remembering a fact from a Trivial Pursuit game you participated in, and knowing that fact simply because you played that game)
  • popular knowledge (you heard about it on the news a bunch of times, so you feel like you know the topic well)
  • primary source material (you bought and read a book on a topic)
  • interactive expertise (in my world, I am a software tester, so I have a fairly good grasp of the parlance of software programming, and can talk a mean game with other programmers, but I hardly consider myself a “programmer” in a professional sense, though I could “play one on TV”)
  • contributory knowledge (I certainly can talk with authority about the programming I have personally done, as well as several challenges and pitfalls I have personally experienced. My experience with software testing also falls squarely in this area).

At the higher levels we get to even more specialized expertise, some of which is directly in line with contributory expertise, and some of which is interactive expertise at best. These are called the meta-expertises and meta-criteria, and here we can include the pundit; the art, movie, and restaurant critic; and, in some cases, the general public.

While all of these classifications are interesting, they show how expertise is fluid, and, also, that it can be faked. Very convincingly so, in some cases. What makes an expert an “expert” is often in the eye of the beholder. Collins and Evans present three case studies, using color-blind individuals, pitch-blind individuals (those who do not have “perfect pitch” capabilities, which admittedly is a very large percentage of the population), and Gravitational Wave scientists. In this section, they set up experiments similar to a “Turing Test” (referred to in the book as the “Imitation Game”) to see if people who do not have knowledge of a particular area (meaning those who are not color-blind, do not have perfect pitch, or don’t really know about Gravitational Wave science) can be fooled by people who can “talk a mean game” about a particular area but don’t really have that experience. Conversely, the same experiment was done with those who did have experience with these areas (those who are color-blind, have perfect pitch, or have experience with Gravitational Wave science). The results showed that the genuine experts could spot fakes most of the time (though they could sometimes be tricked), and were “conned” far less often than those who didn’t have a background in these areas.


These case studies set up the remainder of the book, in which we look at a variety of demarcation points where we might want to use greater discrimination when it comes to just how much we trust certain “experts”. Collins and Evans explore a variety of intersections, such as science vs. art, science vs. politics, hard sciences vs. social sciences, and science vs. pseudo-science. In several cases, we as everyday people find that the point where we accept “expertise” moves along a continuum from the original producers of knowledge (the hard sciences) to those who consume knowledge (the arts, but also politics, the social sciences, and the pseudo-sciences). Why do we give the pundit, the news anchor, the talk show host, or the pop culture critic credence? Why do we trust their “expertise”? What is it based on? In short, are we being conned?

The book closes with an Appendix on the Three Waves of Science, or perhaps better put, the three waves of scientific inquiry as they relate to the twentieth and twenty-first centuries. The pendulum has moved away from the idea that scientists are “rarefied creatures” on the level of high priests of the technical sphere (the first wave), to a time of great distrust of science and relativistic views of its value and relevance (the second wave), and now to a third wave that is less reverential than the first, but more skeptical of the claims of the second.

I’d encourage any reader of Rethinking Expertise to read the Introduction and this Appendix first, and then read the rest of the book in order. By doing so, the scaffolding of the ideas being presented makes more sense from the start, and it may prevent the need for a re-read like the one I needed (though frankly, that reread may prove to be very insightful). This is not a casual Saturday afternoon read (it may be for some; it certainly wasn’t for me). This is a dense book, and the sheer quantity and textual volume of the footnotes is significant. Rereads will certainly give further clarification and a better feeling for the ideas.


Bottom Line:

Rethinking Expertise meets many objectives. First, it gives a taxonomy to areas of expertise, and helps solidify an understanding of where on the continuum our understanding comes from, and how (and why) we understand what we do. It also helps us identify how our interactions and experiences, along with direct participation in events and activities, all contribute to the level of expertise that we have (or don’t have). What’s more, it helps us get a handle on the level of expertise others may have, or may not have. It asks us to think critically about those we trust, and what their intentions may be.

Wednesday, December 18, 2013

On Surroundings, Clutter and Getting Out of My Own Way

One of the things that I have found over the past several years, and it seems I am destined to ever relearn this, is that I can always find a way to get in my own way. I am a fan of classification, organization, attempts to streamline and de-clutter, yet I always seem to have too much stuff surrounding me. Note, this is not a complaint, but an observation. I love the home I live in. It's not opulent, not hugely spacious, but it serves the needs of a family of five quite nicely. The one drawback, at least for me, is that there has to be someplace where chaos can exist so that order can take place elsewhere. Ironically, those places seem to be my garage and my office (i.e. my two "creative domains").

There is a joke among "organizistas" that allows for the "ten percent rule". No space is ever completely organized. There is always some place where there is chaos and disorganization. Every immaculate kitchen has a junk drawer. There is one closet that is not perfectly organized, and one room tends to be a jumble of things. Often, it is because the activity or purpose of those areas is perpetually labeled "miscellaneous". It's where all of the odds and ends tend to end up. My office gets this designation.

Understand, my office is the most "unlovable" room in our house. For various reasons, the previous owners of our home, when they chose to build an upstairs addition over the garage, decided to carve out a section of the garage for the staircase. They made a large main room, a half bath, and a "bedroom". Well, some room had to go over the staircase, and to be creative about it, the "bedroom" was the spot chosen to sit over the stairwell.

Were the room a perfect rectangle, it would be nine feet nine inches wide by thirteen feet six inches long, with five and a half feet by two feet taken up by closet space. But this isn't that room. Instead, there is "the bump", four feet six inches by four feet four inches, that goes over the stairs. This unfortunately placed bump makes the room a bit, shall we say, less than ideally shaped. To the previous owners' credit, they did make some interesting use of the space and made an additional closet out of it, albeit a closet with a floor that rises above the floor of the main room.

A few owners had this house before we did, and every one of them used this room for the same purpose. It was "the office" or "the spare room", because it did not fit any pre-conceived plan to be used as a regular bedroom. Feng Shui was not taken into account when this room was made ;). Thus, I tended to do the same, and for fifteen years, this room has undergone several transformations, purges, redesigns, clutter abatements, organizations, and reorganizations, and somehow, I still manage to get work done in here.

The room has to serve several purposes, mainly because there's no place else for me to practically do these things. Not only is my office my work and testing space, my writing haven, my exercise room, my craft room, and my recording studio; in a pinch, or when a late-night or early-morning call or work assignment requires it, it doubles as a meeting room, a lounge, a reading area, and occasionally, a guest bedroom (and likewise, on occasion, "my" bedroom, when it gets late and I just need a couple hours of rest before I get up and go at it again).

The biggest challenge with a multi-purpose room is the need to switch tasks, and to have a system where I can do so effectively. "A place for everything and everything in its place" is great in theory, but very often, I find myself having to contend with two or three projects needing the same space. Thus, what often happens is a bundling up of stuff, stuffing it wherever I can make room, and then getting back to it when I can. It's a room that invites fiddling, tweaking, and moving stuff around to find that "best spot" to put everything. In short, it's a place where, very often, I find that I get in my own way; my best-laid plans for one project/process cause me to be horribly inefficient for another.

In an ideal world, I would just say "OK, well, I will set up another room to do that other thing", but that's not really an option. To keep a happy home, this room is my domain for whatever project I need to be working on, and as such, it has to work, odds and ends and all.

I liken this to my testing career. Too often, people on the outside look at testing as though it's a neat little atomic and simple process. You look at something, you expect a behavior, you confirm the behavior, it passes. If it doesn't confirm the behavior, it fails. That limits testing to a very simple yes-or-no system that is efficient, elegant, and easy to organize. I hate to be the bearer of bad news, but testing is none of those things. It's wildly cross-discipline, it takes from many places, and it needs many unique resources, some of which are really hard to categorize.


Testers don't just have a small set of tools at their disposal. In fact, they literally have the entire world of knowledge to work with to help define their tests, their strategy and their approach. When you picture a heavily cluttered and wildly akimbo lab of a mad scientist, that is the essence of a tester, and the way a tester often works.

Does this mean that we cannot be organized, that we cannot be efficient? Of course not, but it does mean that we run the risk of getting in our own way. We are often enticed by some new tool, app or device that will make things simpler, more elegant, less crowded, more organized and methodical. Unfortunately, I tend to look at testers (not all, mind you) as those people that fall under the "ten percent chaos rule". We have needs for more than the "nicely organized" environment that fits one purpose very well. We shift, we switch, we weave and bob through ideas and applications, and sometimes, that requires a tolerance for "stuff and clutter" that goes beyond what many think is comfortable.


As of now, I have some semblance of control over my domain, but alas, this is a good day. Some days are more chaotic than others. In my world, the best thing I have discovered is that "nothing stays the same, and nothing ever changes unless there's some pain involved". That doesn't mean that I don't keep trying to organize and get everything where I want it to be. It means that I do my best to rid myself of distractions until I actually need them, and then know where those distractions have gone. Once I know where they are, I try my mightiest to forget they are there, at least for the time being. Ultimately, I have to be aware of the fact that, most of the time, the person that gets in the way of my progressing on something, ultimately, is me. Thus, it is best to do whatever it takes to get me out of my own way whenever possible :).




Wednesday, December 4, 2013

Live from Climate Corp, it's BAST!!!

Another day, another Meetup, and this time, I don't have to present, so I can do what I usually do at these events, which is stream whatever flows into my brain and capture it in bits to share with the rest of you.


Again, as with all of these events, this will be raw, disjointed, and potentially confusing, but the benefit is you get to hear it here and now as I hear it (well, for the most part). If you want fully coherent, wait a while ;).



So for those curious, here's where we are at tonight, and what we are covering:



"Productivity Sucks, How About Being Effective" an evening with Jim Benson

Wednesday, December 4, 2013

6:00 PM - 8:30 PM

The Climate Corporation

201 3rd Street #1100, San Francisco, CA


Jim Benson, the co-author of "Personal Kanban", and a contributing author of "Beyond Agile: Tales of Continuous Improvement", is here to talk about the myths surrounding our work and how we think of it, specifically around how we determine what is productive and what isn't. Tonianne DeMaria Barry, the co-author of "Personal Kanban" (and his partner at Modus Cooperandi), will also be sharing some of her experiences from a variety of successful "kaizen" camps that have been held around the world.

What we are hoping to do with tonight's talk (and several more in the future) is to expand the range of topics that get covered in a typical software testing Meetup. Our goal is to help develop a broad cross-section of skills for testers, not just those in the nuts and bolts of direct and specific testing skills, or programming/toolsmith topics (nothing wrong with those, of course; we have them, too).

At the moment, though, we are eating food, drinking beer, wine and soft drinks, and conversing. Thus, I feel it vital to schmooze and welcome our guests, but I will be back shortly ;).

---

Jim started our talk tonight with how he was able to set up a variety of opportunities selling Agile methodologies to organizations (businesses, government, etc.), and realizing that many of those Agile methodologies were, well, problematic. While working through some of the issues, they opted to try to apply Lean principles and, in the process, developed a variety of methods around a "kanban" system ("kanban" being a Japanese term that means "ticket" or "the card that moves"). Anyway, that's Jim, and that's what he said he wanted to get out of the way right now.


What he really wants to talk about is the fact that, if you work for a team or company that prides itself on or markets itself as a "highly productive team", it's very likely that you are working in the worst environment possible. Wait, what?! Why would that be the worst possible thing?


Part of the reason is that with that "high productivity" comes a lot of gamesmanship. It's also incredibly subjective; productive according to whom? Do they mean the team as a whole? Do they mean the development methodology? Do they mean how much they push out? Who is defining or describing the "productivity"? We love to believe that everyone is on the same page at the same time and everyone is working in tandem.


Anyone who is in testing knows that this is rarely the case, unless you are fortunate to have the opportunity to pair with the developers as they code and you are riding along as a testing navigator (and yes, I do that from time to time, but not always, and not nearly as often as I would like to).  More times than not, we get our stories dropped in a group at the end of the coding time, and testing then spins up and frantically tries to get the testing done.


Jim makes the point that we are seeing an increase of productivity in some aspects, but we are seeing a proportional decrease in actual effectiveness, because much is getting done, but little is being accomplished (or it's overloading the downstream parts of the process, i.e. those of us in testing or ops).

So how can we solve this? First, recognize that productivity silos exist, and that they are evil. No matter how much functionality is sandwiched into one role, and regardless of how productive that role is, it is not going to increase the entire team's ability to produce, release, or deploy, because while one group is hyper-optimized, other groups are woefully under-prepared and over-burdened; they do not have a complementary option. Think of trying to fit the flow from a 12-foot-diameter water pipe through a connecting pipe that is only three feet in diameter. It doesn't matter how much you put into the 12-foot pipe, the three-foot pipe is going to be a bottleneck.

Think DevOps... and before anyone thinks of "DevOps" as a team, it's not. It's a mindset and an approach. The goal, though, is that all of the teams need to be able to get the optimization that the programming group has. For the effectiveness of such a team to get better, all of the connection points need to be addressed, and all of the players need to be on the same page. That could mean many things. Sometimes it means that some of the programmers are going to be up in the middle of the night when something goes wrong ;). Silos are easy to talk about, but very hard to optimize and balance in reality.

"Productivity is just doing lots of stuff". Actually, James used a more colorful metaphor, but you get the point ;). Bad productivity is a reality. Lots of stuff is getting done, but it is really worthwhile? Is the chase for the almighty "velocity" really a worthwhile goal? Are we actually adding to the value of what we are creating? Or are we creating technical, intellectual, and expectational debt?

One of the goals behind Personal Kanban is to make it a pull system, where we grab what we can work on as soon as it's available. There are a variety of impulses that drive this. Demand pull, of course, is the market telling us "hey, we want this, please make it for us". Internal pull is when the internal voices of our companies are saying "we need this, and fast" without any correlation to what our customers want. Aim for the former, but let's do all we can to resist the latter.

One of the real challenges of what we produce in a software-centric world is that our "product" is extremely ephemeral. What we produce is difficult to visualize. Because of that, we have to be mindful of exactly what we are making, how that stream is created, and what has to happen from start to finish. The tricky part is that, more times than not, the initial value stream starts from the end and works its way backwards. What results, very often, is that we are missing stuff, and we don't know what we are missing, or why. If we are focused on productivity, we have no incentive to seek out the holes that exist. To be effective, we need to be aware, as much as possible, of where the deep holes might actually be, and find them. Quickly is best :).

One of the concepts that comes with Kanban, borrowed from the world of manufacturing, is the idea of "inventory". The goal is to have the right amount of materials on hand, but not to have too much inventory. Think of an auto manufacturer that produces tens of thousands of cars, only to discover that the market has no use for them, and therefore they have produced tens of thousands of cars that no one wants. We'd see that as suicidal. Well, that works with code, too. When we write code that is bloated, filled with features that no one really wants, we are doing the same thing. We don't want to write a lot of code, we want to write the right code, and make sure that the right code WORKS!!! Note: this is not an Agile thing, or a Lean thing, or a Waterfall thing. It's a human thing, and we all feel its effects.

OK, so we have your attention. Productivity bad, effective good. That's great, but how can we use this in our testing? How can we recalibrate ourselves to a mindset of effectiveness? The most difficult thing is that, when we find we have a backlog of "done" things, rather than pushing downstream to work harder and process more, we need to actually stop doing things and examine the issues that might be "done" and see if, indeed, they really are. Think of a story delivered two weeks ago that testing just got to today, and in the process, a bug was found. What does a programmer do? Well, very likely, they have to go back two weeks in memory to think about what they actually did, and that context switch is expensive. Compare that to programming happening and testing happening in a tight feedback loop. Which approach do you think will be more effective? Why?


Chances are, if we can address something shortly after we worked on it, we will be fresh and remember where we were, what we were doing, and how we might be able to address it. Delays make that process much harder. Therefore, anything that creates a delay between Done in one sphere and Done in another will cause, surprise, more delays! The best thing we can do is build a narrative of our work that is shared and coordinated. If we realize that we have ten stories in the backlog, the best thing we could do is stop adding to the backlog.

Jim brought up the Buddhist concept of "mindfulness", and how the ability to be mindful of the shared story comes to the fore when we are not overloaded and focused on production at the expense of everything else. Avoid becoming task-focused; aim to be situationally aware. Specifically, look for opportunities to collaborate. That doesn't necessarily mean "pair programming" (that might be better described as "teaming"), but it does mean trying to find ways that we can leverage what each other is doing, not just to get stuff done, but to better ensure that we are doing the right things at the right time, in the most effective way.


One very special aspect that needs to be addressed... there are things that are just plain "hard". Very often, the way we deal with hard problems is that we move over to easier things to work on, and when we get them done, we feel accomplished. It's human nature, but it doesn't address the fact that the hard stuff is not getting dealt with. The danger is that, at the end of the sprint/iteration/cycle/etc., all that's left is the hard stuff, and now we risk not being able to deliver because we didn't address the hard stuff until we reached a point of no return. When we get to things that are complex, it doesn't necessarily mean that we have failed. It may just mean that we need more than one person to accomplish the task, or that we need a more thorough understanding of what the actual goal is (it may be vital, or it may not even be necessary, but we will not know until we dig into it, and the longer we wait to make that dig, the less likely we are to get a clear view of where on the spectrum that "hard problem" lies).

Back to testers in this environment. How can we help move towards effectiveness? Put simply, get us involved earlier, and have us test as soon as humanly possible. We don't have to test on finished code. We can test requirements, story points, and initial builds to check proof of concept. Sure, it's not finished, it's not elegant, but we don't really care. We don't want to polish, we want to check for structural integrity. Better to do that as early as humanly possible. We can focus on polish later, but if we can help discover structural integrity issues early, we can fix them before they become much more difficult (and time-sensitive) later.

---

My thanks to Jim and Tonianne for coming out tonight to speak with us, and thanks for helping us cover something a bit different than normal. I enjoyed this session a great deal, and I hope we can continue to deliver unique and interesting presentations at BAST Meetups. My thanks also to Curtis and The Climate Corporation for the space and all the legwork to make this happen. Most of all, thanks to everyone who came out to hear tonight's talk. You are making the case for us that there is definitely a need for this type of information and interaction. Here's wishing everyone a Happy Hanukkah, a Wonderful Winter Solstice, a Merry Christmas, a Happy New Year, and any other holiday I might be overlooking; we wish you the best for the rest of this crazy, active month. We look forward to seeing you all again sometime in mid-January, topic to be determined ;).