Friday, January 31, 2014

The State of Software Testing: A Follow-up and My Commentary

For those who may remember, back in December 2013, I encouraged as many testers as possible to take part in the "State of Testing" survey, sponsored by Tea Time With Testers and collated and curated by Joel Montvelisky. Well, that survey has been completed, and for those interested in seeing all of the results, you can download it from here.

This is, of course, a self-selecting group, so the views expressed may or may not be indicative of the broader software testing world, but it does represent the views of those who responded, including me (ETA: or at least it intends to; see James' comment below). Now that the survey is public, I'm going to share my answers (with some qualifying statements where relevant) and examine how my answers map to the overall report.

First, some caveats. Any survey that attempts to boil things down into data points will "lose something in the rinse cycle". There were a lot of questions where "well, sometimes" and "hmmm, not so often, but yes, I do that from time to time" colored the answers. My clear takeaway from all of this is that I realized I am not "just a software tester". I do a lot of additional things as well. I program (for some definition of programming). I write. I lead. I maintain and build infrastructure. I talk with and advise customers. I build software. I deploy releases. I hack. I do a little marketing here and there. I sell, sometimes. I play detective, journalist, and anthropologist. Much of this will not show up in this survey, but my guess is, a lot of you who answered probably do much the same, and some of that may not be reflected in this survey, either.

First of all, where do I rate in the hierarchy? For years, I was a lone gun, so had you asked me in 2012, I would have said Tester, Test Manager and Test Architect all in one. Today, I am part of a team of testers (most of us with two- and three-decade-long track records). I'm definitely a senior, but I'm at peer level with just about everyone on my team. From time to time we bring in interns and junior team members whom I get to mentor, but much of the time, it's just us. While we have our different approaches and attitudes, I'm confident in saying we balance each other well. Sometimes I'm the lead, sometimes I'm led. For us, it works.

Our test team, at this moment, has five people. One is our Test Director, three are Senior level Software Testers (me included), and one is a contractor whose sole responsibility is writing test automation. Our Test Director's title is mostly ceremonial; the four of us all work together to divvy up stories and utilize our expertise, as well as share that expertise with others to broaden our abilities. We do have "personal preference" silos. Our director likes doing a lot of the automation and rapid response stuff. One of our testers has a special knack for mobile testing. Another tester has a great feel for the security side of things. I tend to be the first in line for the architectural and back end stories. During crunch time, it's not uncommon to see the story queue aligning with our personal preference silos; hey, we know what we are good at ;). However, we do take the time to cross train, and all of us are capable of venturing into other avenues, and becoming more so each day.

Our Engineering team, including us testers, is at this moment fourteen people. That works out to roughly one tester for every two programmers, the best ratio I've had to date in any organization I've worked at. For several years, I was the only tester in organizations with ten to fifteen programmers. The current balance helps us considerably: at any given time, each software tester typically has two stories in play, possibly three if we count self-directed spikes for learning or infrastructure improvements.

Like most of the respondents, we have an Agile-like team (we bend the "rules" a little here and there, but as far as the core principles of the Agile Manifesto, I think we do pretty well). We have both a co-located and a distributed presence, so being able to communicate quickly is an imperative. We do a lot with shared screens, softphones on our computers, and a set of IRC channels that sees a lot of traffic. We use our own product as our primary platform for doing as much of our business as possible. If it can't be done in Socialtext, we either find a way to do it there, or seriously consider not doing it at all. Our IRC server is the key exception. That's so we have a means to communicate and stay productive if the worst case scenario happens, and we lose our work environment (hey, it pays to be paranoid ;) ).

Each of us on the team wears different hats when we approach our software testing jobs. We are involved very early with requirements gathering (we practice the Three Amigos approach to story design and development). We all take turns, along with the rest of the engineering team, at core company functions. Each tester takes a week where they are build master and deployment specialist for our operations environment. Each of us manages a pool of development servers. Each of us is versed in the inner workings of Jenkins, our Continuous Integration server, and each of us, to varying degrees, writes or enhances test automation, utilizes testing tools from a variety of initiatives, and does our best to be a "jack of all trades" while still holding on to our own "personal preference" silos.

We use a broad variety of testing techniques in our team. Exploratory Testing is championed and supported. Our developers use TDD in their design approach. They occasionally perform pair programming, though not exclusively. I actively encourage pair testing, and frequently coordinate with the programmers to work alongside them. Usually this is during the active testing phase, but at times during the programming stage, where I act as navigator and ask a lot of "what if" questions. In short, we are encouraged to use our faculties to the best of our abilities, and to provide essential artifacts, but not become slaves to the process, which I greatly appreciate.

We do two stand-up meetings each day. The first is the Engineering team stand-up, and then we have a dedicated Q.A. team stand-up, led by our Test Director, or one of us if the Test Director is not available. In these meetings, we gauge our own progress as a QA organization, and we look for ways we can improve and up our collective game. Oftentimes that means cross-training, and also working as a team on individual time-sensitive and critical stories.

Our test documentation lives inside of the stories we create. Each of the acceptance criteria is spelled out in the story, our programmers check off when they have finished coding to the acceptance criteria, we check off when we have finished testing each piece of acceptance criteria, and we use a Notes field for each item of acceptance criteria to describe findings or add additional details. The goal is to have the ability to show what we completed, communicate our findings, and be able to (in as few steps as possible) provide the insight and context necessary for the programmers to fix issues.
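To make that checkoff flow concrete, here's a minimal sketch of what the per-criterion bookkeeping might look like in code. This is purely illustrative; the names and structure are my own invention, not Socialtext's actual tooling.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """One item of acceptance criteria, checked off separately by dev and QA."""
    text: str
    dev_done: bool = False
    qa_done: bool = False
    notes: str = ""  # findings and extra context for the programmers

@dataclass
class Story:
    title: str
    criteria: list = field(default_factory=list)

    def is_done(self):
        # A story is finished only when every criterion is both coded and tested
        return all(c.dev_done and c.qa_done for c in self.criteria)

# Hypothetical story with two acceptance criteria
story = Story("Export page as PDF", [
    Criterion("Export button appears on every page"),
    Criterion("Generated PDF preserves page links"),
])
story.criteria[0].dev_done = True
story.criteria[0].qa_done = True
story.criteria[1].dev_done = True
story.criteria[1].notes = "Links render but point to staging URLs; needs a fix."
print(story.is_done())  # the second criterion has not been tested yet
```

The point of the Notes field in this sketch is the same as in our stories: the finding travels with the criterion it belongs to, so a programmer doesn't have to hunt for context.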

We have a vigorous and active automation suite. It's in many ways a home-brewed process. These automation tests run the gamut from unit tests, to full workflow functional tests, and everything in between. Our product is actually used to write our automation, store our automation, and is called by our testing framework to run our automation. We get very meta in our automated tests, and it's been that way for many years. We don't expect every tester to be a programmer, but it certainly helps. At this point in time, all of our testers do some level of automation test creation and maintenance. As to the level of our automation, we have a mantra that a story is not finished unless it has both unit tests to cover the defined functionality and QA based automated tests to exercise the workflow. I will not say we have 100% automated coverage, but we have a high percentage of all of our workflows and features automated. 85-90% would be a reasonable guess.

Again, we have one contractor whose sole responsibility is to create automated tests, and the rest of us augment those efforts. Most of our automation is aimed towards a large regression test suite, and our automated tests are treated like source code, just as much as the actual program code. If Jenkins fails on an automated test QA has written, then the build fails, and the programmers need to fix the reason for the failure. If the test is seen as flaky, it's our responsibility as testers (and creators of the automated tests) to fix that flaky test. We also have a broad suite of tests that help us with exploratory testing. Their purpose is to bring us into interesting areas of the code after performing a variety of state changes, and then let us, as active testers, hop off and see what we can find. We often refer to these tests as our "QA Taxi Service".
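For a sense of what a "taxi" might look like, here's a minimal sketch. Everything in it is hypothetical (the `FakeWorkspace` stand-in and the setup steps are invented for illustration, not pulled from Socialtext's actual code); the point is simply that the automation does the repetitive driving through state changes, then stops and hands the resulting state to a human for exploration.

```python
class FakeWorkspace:
    """Stand-in for the application under test (illustrative only)."""
    def __init__(self):
        self.pages = {}
        self.log = []

    def create_page(self, name, body=""):
        self.pages[name] = body
        self.log.append(f"created {name}")

    def edit_page(self, name, body):
        self.pages[name] = body
        self.log.append(f"edited {name}")

def taxi_to_interesting_state(app):
    """Drive the app through a scripted series of state changes, then stop.

    No assertions here on purpose: the taxi's job is setup, not verification.
    A human tester "hops off" at the end and explores from this state.
    """
    app.create_page("Release Notes")
    app.edit_page("Release Notes", "v1.0 draft")
    app.create_page("Test Plan", "exploratory charter goes here")
    return app

app = taxi_to_interesting_state(FakeWorkspace())
print(sorted(app.pages))  # the state the tester starts exploring from
```

The design choice worth noticing is that the taxi logs every step it takes, so when the human tester does find something odd, the path that produced the state is already written down.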

Our process of managing stories, bugs, feature requests, etc. is again unique to Socialtext. All of our reporting is performed within our product. We have created a custom Kanban board to run inside of Socialtext, and all artifacts for stories reside inside the Kanban. Actually, they reside inside of Socialtext. We have engineered it so that our stories can reference individual pages, charts, SocialCalc sheets (our own spreadsheet program that resides in Socialtext), pictures, videos, etc. We make the information available to all as quickly and efficiently as possible. We take what we learn and create how-to guides for everyone to use and share. These guides get updated regularly. Everyone on the team has a responsibility to "green" the how-to guides, and make sure that everyone knows what has changed, what has been learned, and how to use that information to program better and test better.

So that's how my team looks compared to the survey. How about me?

I'm on the high end for experience as far as the survey participants go, but I'm middle of the road as far as my immediate team is concerned. Our Test Director has been a tester for three decades plus. Our core in-office testers are all roughly the same age, and have roughly the same years of experience (two decades being the average among us, give or take a few years individually). Our test automation contractor has roughly a decade of experience. Though we occasionally get interns and other junior staff, for the most part we're a pretty seasoned team, and that's rather cool.

As to continuous learning, I use a variety of approaches to learn. Testing blogs, newsgroups, Twitter, various "social" groups that are formed with other people (Miagi-do, Weekend Testing, Meet-ups, etc.) all play into my approach, as well as active blogging of what I learn. My attitude is to learn by whatever means is available.

Looking into the future, I see that there are a lot of areas that I personally want to focus on. I want to get more involved in testing for security. Specifically, I want to get a better practical, nuts and bolts understanding of what that entails. I see a need to boost my performance chops. That means going beyond running a tool that simulates load. My guess is that I'll be going back and doing a lot of re-listening to PerfBytes in the coming weeks and months ;). Automation is perpetually "there", and while every year I say I'm going to get more involved, I finally have a team, and a scope of projects, where that's more than just wishful thinking. More than anything else, I want to see what I can do to find the bottlenecks in what I personally do, and figure out how to minimize them, if not completely eliminate them. I also want to explore ways that I can eliminate waste from the processes I already do, even if they are processes and methods that work pretty well.

As to job security, tomorrow is always subject to change (that whole "past performance is not an indicator of future results"), but I finally feel I'm at a place, and a level of involvement in the testing community, where my potential to find future jobs (should such a thing be necessary) is looking better now than it has at any time in my career history.

So there you go, the TESTHEAD "State of the Software Tester" for January 2014. Have a look at the report (posted here again for convenience ;) ) and ask yourself "where do I stand?". More important, ask yourself how you can strengthen your own "state", and after you figure that out, work on it. Then reach out and help someone else (or a lot of someone elses) get even better and go even farther than they dreamed possible.

Sunday, January 26, 2014

Retro Book Review: Rethinking Expertise

When I was in Sweden, back in November 2013, I sat down for a talk with James Bach and we got into a discussion about how to categorize the ideas and thoughts that inform our testing, and the key features we’ve brought to our respective games. He said he noticed that I’d read a great deal, and that he was curious as to what I was reading and why. As I was explaining the books I was reading, and why I found them valuable, he kept coming around to the question “that’s great, but how have you applied this to your testing?”, and I realized that, frankly, I was struggling to answer that. For the life of me, I couldn’t explain why I was having trouble. I knew the books were helpful, and I knew that I was taking ideas left and right and applying them, so why was I struggling with this so much?

James pointed out that one of my challenges was that I was missing some key ideas as to how to position my own expertise and what I actually knew, versus what I thought that I knew. He also counseled me that, in some ways, because of working with certain domains, and doing so for some time, I may be equating experiential expertise with contributory expertise, and that the two, though we might want to believe are equivalent, really aren’t. With that, James handed me “Rethinking Expertise” by Harry Collins and Robert Evans and said “I think you might find this book helpful”. 

For those with a short attention span, I’ll save you a bunch of time… yes, indeed, it was! If you want an opportunity to take a deep dive into the ideas, ideals and avenues of expertise, what makes an expert an “expert”, and how fluid and fraught with controversy that term actually is, this book is a must read.

For those with a greater tolerance for my wordy reviews, let’s start with the centerpiece of the book, which is the Periodic Table of Expertises. This is going to be a short description of the table, not a full breakdown of every element. The explanation of the table and the elements in it takes up over 50% of the total book.

We start at the highest level with Ubiquitous Expertise. This is the stuff each of us does automatically. When we walk, listen, speak a language, ride a bicycle, swim, or perform a physical task like reading, writing, or using a computer, if we take the time to try to explain exactly what we are doing, we may well find that words fail us. Why? Because these are skills we take for granted, i.e. they are ubiquitous expertises. They are part of our “bare metal” programming, of sorts.

As we step further away from the ubiquitous aspects of expertise, we get into the areas where we have some control over how we describe them, because they are actively learned and nurtured in a way that we understand active learning and nurturing. Facts, figures, trivia, arcana, minutiae, story plots, and other gained “chops” in a given area all fall into this sphere. These experiences can be split up into two areas (Ubiquitous Tacit Knowledge and Specialist Tacit Knowledge). It’s in the tacit knowledge areas that most of us “think we know what we know”, but may still have trouble verbalizing the depth or significance of what we know. This exists on a continuum: 

  • simple fact acquisition (think of remembering a fact from a Trivial Pursuit game you participated in, and knowing that fact simply because you played that game)
  • popular knowledge (you heard about it on the news a bunch of times, so you feel like you know the topic well)
  • primary source material (you bought and read a book on a topic)
  • interactive expertise (in my world, I am a software tester, so I have a fairly good grasp of the parlance of software programming, and can talk a mean game with other programmers, but I hardly consider myself a “programmer” in a professional sense, though I could “play one on TV”)
  • contributory knowledge (I certainly can talk with authority about the programming I have personally done, as well as several challenges and pitfalls I have personally experienced. My experiences with software testing also falls squarely in this area). 

At the higher levels we get to even more specialized expertises, some of which are directly in line with contributory expertise, and some of which are interactive expertise at best. These are called the meta-expertises and meta-criteria, and here we can include the pundit; the art, movie and restaurant critic; and in some cases, the general public.

While all of these classifications are interesting, they show how expertise is fluid, and, also, that it can be faked. Very convincingly so, in some cases. What makes an expert an “expert” is often in the eye of the beholder. Collins and Evans take three case studies, those using color-blind individuals, pitch-blind individuals (those who do not have “perfect-pitch” capabilities, which admittedly is a very large percentage of the population), and Gravitational Wave scientists. In this section, they set up experiments similar to a “Turing Test” (referred to in the book as the "Imitation Game") to see if people who do not have knowledge of a particular area (meaning those who are not color-blind, pitch-perfect, or don’t really know about Gravitational Wave science) can be fooled by people who can “talk a mean game” about a particular area, but don’t really have that experience. Conversely, the same experiment was done with those who did have experience with these areas (those who are color-blind, pitch-perfect and have experience with Gravitational Wave science). The results showed that those who were genuine experts could spot fakes most of the time (though sometimes could be tricked), and were “conned” a lot less often than those who didn’t have background with these areas.


These case studies set up the remainder of the book, in which we look at a variety of demarcation points as to where we might want to use some greater discrimination when it comes to just how much we trust certain “experts”. Collins and Evans explore a variety of intersections, such as science vs. art, science vs. politics, hard sciences vs. social sciences, and science vs. pseudo-science. In several cases, we as everyday people find the point where we accept “expertise” moves along a continuum from the original producers of knowledge (the hard sciences) to those who consume knowledge (the arts, but also politics, the social sciences and the pseudo-sciences). Why do we give the pundit, the news anchor, the talk show host, or the pop culture critic credence? Why do we trust their “expertise”? What is it based on? In short, are we being conned?

The book closes with an Appendix on the Three Waves of Science, or perhaps better placed, the three waves of scientific inquiry as relates to the Twentieth and Twenty-first centuries. We have moved the pendulum away from the idea that scientists are "rarified creatures" on the level of high priests of the technical sphere (the first wave), to a time of great distrust in science and relativistic views as to their value and relevance (the second wave), and now to a third wave that is less reverential of the first wave, but more skeptical of the claims of the second wave.

I’d encourage any reader of Rethinking Expertise to read the Introduction and this Appendix first, and then read the rest of the book in order. By doing so, the scaffolding of the ideas being presented makes more initial sense, and it may spare you the re-read I needed (though frankly, that reread may prove to be very insightful). This is not a casual Saturday afternoon read (it may be for some, but it certainly wasn’t for me). This is a dense book, and the sheer quantity and textual volume of the footnotes is significant. Rereads will certainly give further clarification and a better feeling for the ideas.


Bottom Line:

Rethinking Expertise meets many objectives. First, it gives a taxonomy to areas of expertise, and helps solidify an understanding as to where on the continuum our understanding comes from, the level to how (and why) we understand what we do. It also helps us identify how our interactions and experiences, along with direct participation in events and activities, all contribute to the level of expertise that we have (or don’t have). What’s more, it helps us to get a handle on the level of expertise others may have, or may not have. It asks us to think critically about those we trust, and what their intentions may be.

Thursday, January 23, 2014

Retro Book Review: Tribes

There’s nothing quite like a cross country flight to allow one to tune everything else out, calmly sit down, turn off the cares and distractions of the world, and plow through a much needed batch of learning and focus. Sure, the Internet is wonderful. It’s always on (well, almost always). It's 100% available (again, almost always). It’s filled with every tantalizing distraction imaginable, and (almost) always there to keep you busy (for some definition of busy). I asked myself “what could I do if I deliberately avoided all of that? What would I focus on? What could I focus on?” The answer is reading. A lot of reading. My blog readers, for better or worse, are going to get a rather large deliverable from that this week, in that I have book reviews to present. Some of these books are older, some are from unusual places and interests, and one of them was literally handed to me by a friend with the comment “I think you need this”… to which I would say “that friend was absolutely right”, but more on that in another post ;).

First out of the gate is Seth Godin’s “Tribes”. The subtitle of the book handily describes the purpose and point of this title: "We Need You To Lead Us". For those who are fans of Godin, you already know what to expect, so I’ll make a quick, terse summary that seems to come up with every Godin book. It’s quick, it’s rambling, it’s all over the map, it’s passionate, and it is bedevilingly void of specifics (and yes, I realize that I just totally made up a word there, but work with me here).

Now, with the most obvious of details out of the way, let’s get into what makes this fast read really valuable. Seth is drawing a line in the sand and asking us, all of us, to take up the mantle of leadership. Don’t wait to have it bestowed, steal the throne for yourself. How? Create your own tribe, and have others follow you. That’s it. In a nutshell, that’s the very simple message of the book. Of course, simple should never be equated automatically with easy, and in this case, being a genuine leader and creating a meaningful and devoted tribe, while simple, is definitely not easy. It takes time, commitment, passion, determination, expertise, skill, desire, devotion and most important of all, faith. 

Tribes shares many vignettes of individuals who bucked the trend and changed the world. Some of the trend buckers are famous (Jobs, 37 Signals, Nike) and many of them are people most of us have never heard of. Their stories are still poignant, and the essence of who they are and what they do rings true to the primary message of the book. Take an idea, work with it, believe in it, and seek others to help champion it. So simple, and so direct. Yet amazingly, so few people ever actually follow through with it, because there is risk, there is a chance of criticism, and frankly, it takes sustained passion and belief that you will prevail where others have not. It’s a risky doctrine… but it’s also an extremely fun doctrine!

The key takeaway (and one that Seth puts into other books, and I think is the central theme of his overall message) is that the “tried and true” of olden days is getting pummeled in a world where large established factories and processes are quickly being disrupted and made obsolete. Ten years ago, to publish a book, you needed a publisher willing to take a risk on your book. Today, Leanpub and Lulu (and other organizations) make it possible for anyone to write and publish. In that kind of a world, what sustains and grows is not the tried and true, just for the virtue of its being tried and true, but the remarkable, the unique, the interesting, and the desirable. Is everyone going to come up with the “next great idea”? Probably not, but all of us have good ideas, and many of those ideas do not require a major investment or system to enact. Leadership isn’t necessarily creating a startup and becoming rich (though that is certainly fun and awesome when it happens). Leadership can be based in your own group, and being willing to try a new technique or approach, or even trimming away wasteful processes that are slowing a team down. It could be a blogger who writes about an industry that they are passionate about, and how they hope to make changes in that industry.

What stands in the way? Mostly fear. Fear of being branded a heretic. Fear of upsetting the status quo. Fear of being criticized, or laughed at, or mocked. A heretic could be fired, or ostracized, or “burned at the stake” (hopefully metaphorically, though yes, there was a time when that was a real fate for “heretics”). Those fears are all real. They could happen. Seth makes a compelling point that they likely will not, and that the simple fact is that we are surrounded by heretics, because the cost of being one is going way down in our modern world. With that in mind, why not be a heretic? Be a leader, exhibit some faith, and take that leap. Not only will you likely not suffer slings and arrows, you may just inspire many others to follow your lead, and you may encourage them to do likewise.

Bottom Line: 


This isn’t a long book. It’s not a thorough book. It doesn’t have a lot of specifics. It doesn’t have a lot of examples. It makes the maddening point that there is no map to follow, but if you have some passion, some drive, an idea or two, and a belief that they will come to fruition, the odds of you finding people willing to help you push that idea forward, by forming a tribe around that idea, are pretty high. It’s also likely to be small at first. It may always remain small, and that is totally OK. In fact, most of us belong to hundreds, if not thousands, of tribes at any given time. In many of those tribes, we are followers. In many of them, we are leaders. The most likely thing is that those of us who are leaders of tribes don’t realize it. Seth gives us some motivation to help figure that out, and then do something with it. For that, I think Tribes succeeds.

Tuesday, January 21, 2014

Questioning My Expertise

It's been a while since I've posted anything. Again, that's both on purpose and not. I felt the need to disconnect for a bit and take care of some other things in and around my life that have, frankly, suffered a bit from neglect (my home, my yard, some much needed family time, and a project that is going to be the main part of this entry today).

Back in November, I posted about a "tank crash" that I suffered, one in which a fish tank that, in some way, shape or form, I'd been running mostly non-stop for close to two decades came to an ignominious end. From a full community to zero survivors, and from zero survivors to a tenuous hold on a new community and eco-system. In this process, I have had to come face to face with the fact that everything I thought I knew, and all of the methods and techniques that just "worked" for me, basically just stopped working.

What does a complete eco-system crash entail? What does one do when an entire world they were maintaining comes crumbling down? Yeah, this may be a bit over-dramatic, but trust me, I've prided myself on keeping fish alive and thriving for over a decade. To have them all die in a short period of time, even after taking major precautions and performing heavy and expensive interventions, has been very frustrating. The situation reached a point where a "baptism by fire" was necessary. Well, OK, not really, but a cleansing of Clorox, high heat and desiccation was very much involved. For those who have ever wondered what a complete purge of a system involves, it basically goes something like this:

- deconstruct everything in the tank. That includes removing all filter media, all rock work, any real or plastic plants, any decorative hiding spaces, and all of the tank's substrate (in my case, ranging from fine sand to pea-sized gravel).

- drain all of the water: all 65 gallons worth. Needless to say, the front and back yard plants and trees received a lot of attention that day.

- take all of the filter media out of the various filters (sponges, ceramic tubes, plastic inlet/outlet tubes, carbon bags, phosphate absorbers, etc.) and put them in the dishwasher (having first run it empty, with no soap, to prepare it; the dishwasher acted as a pseudo-autoclave for sterilization purposes).

- take all of the decorative rock work, filter media baskets, decorative materials, etc. and perform the same process in the dishwasher.

- gather all sand and gravel into a bucket and boil in various pots until everything had been heated for at least 15 minutes in boiling water.

- soak the boiled rock in a 6.25% Chlorine bleach solution (that's a cup of bleach to a gallon of hot water) for 72 hours, then rinse with clean water and let soak in water treated with tap water conditioner (to remove any remaining chlorine).

- wipe down the tank, inside and out, with the same 6.25% solution and let dry. Rinse and fill with fresh water and tap water conditioner.

- put everything back into the tank and test over several days to make sure any chlorine or other trace metals were all gone.

- add a biological filter agent to the tank and introduce a "cycling population". This may sound cruel, but it needs to be done, and a group of hardy fish is the best way to do it. For my purposes, a school of six Giant Danios were my literal "canaries in the coal mine". A month later, they have been doing fine, as have the four Boesemani Rainbows I've introduced since the tank cycled.
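As a quick sanity check on the dilution figure above: a US cup is 8 fluid ounces and a US gallon is 128, so a cup of bleach per gallon works out to 8/128, which matches the stated 6.25% (treating the gallon as the total volume, which is close enough at this scale). A few lines of code make the arithmetic explicit:

```python
# Arithmetic check on the cup-per-gallon bleach dilution.
CUP_FL_OZ = 8      # one US cup, in fluid ounces
GALLON_FL_OZ = 128 # one US gallon, in fluid ounces

def dilution(bleach_cups, water_gallons):
    """Fraction of bleach in the mix: cups of bleach per gallons of water."""
    return (bleach_cups * CUP_FL_OZ) / (water_gallons * GALLON_FL_OZ)

print(f"{dilution(1, 1):.2%}")  # 6.25%
```

The same function also shows how to scale the soak for a bigger bucket: two cups in two gallons is still 6.25%.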

Sounds like everything is working great, huh? Well, not quite. This morning, two fish I recently purchased and was taking care of with the hope of introducing into the main tank after a quarantine showed many of the same hallmark symptoms of the disease I had just tried so hard to eradicate. WHY?!! What is going on here? Why am I seeing such large scale infestations when I wasn't seeing them before? Why was it happening again? I'd been using a quarantine tank. I'd been changing the water. I'd been feeding in very small amounts and monitoring them. All of the things that I had figured would be to their benefit, and yet, this morning, I found two dead red severums in my quarantine tank. AGGGHHHHH!!!!

Fortunately, after years of being a tester, I decided to stop cursing my bad luck and start thinking systematically about what could be the problem. First, my quarantine tank. It's six gallons. Not large, but then it's not really meant to be. It's only going to hold one or two fish at any given time, and then just for brief periods. The fish that inhabit the tank are all juveniles, and the tank is outfitted with filtration, aeration, light, heat and all the things necessary to keep them healthy, or so I thought. The rainbows had been through the quarantine process, and had suffered no ill effects. Why did these fish have such a different fate?

Part of it has to do with a different morphology. Rainbowfish (family Melanotaeniidae) are highly active, but very efficient fish. They don't eat a lot, and they produce small amounts of waste compared to the red severums. Though I stayed within the classic rule of thumb of "an inch of fish per gallon", the four rainbows produced far less waste than the two similarly sized red severums. I had figured regular water changes, to the tune of a gallon every other day, would be sufficient to maintain good water quality. That made sense for the rainbows, and may even have been overkill. For the severums, frankly, it may have been too little.

There was one other little piece of the puzzle that was introduced, and frankly, I hadn't even realized it. Since the small half bathroom has a sink and storage for everything I use in the aquarium hobby, it just made the most sense to set up the quarantine tank right there in the bathroom. Of course, there are also other activities that take place in a bathroom, and to help keep said room "pleasant", we'd plugged in a vaporizing wall "air freshener", the kind that heats up essential oils to make a pleasant smell for the small room. Having used it for so long, I hadn't even really given it much of a thought, except to notice that, a couple of days ago, there was a stronger smell. The reason? Christina had changed the small bulb and put in a new scent, one that was more noticeable. For people, not such a big deal. For the respiration of fish? Think of how it feels to breathe in super concentrated pine oil when you clean something. Now imagine you can't escape it. Yep, dare I say it, that might have had something to do with it. Were the fish in a more open room, or had there been stronger venting of the room, it wouldn't have been an issue, but in such an enclosed space, I'm guessing that exposure could prove to be lethal.

So here I am, with an empty quarantine tank, two casualties, and a bit of frustration. However, I can take some solace in one aspect of this... the system worked as it was designed. While my fish looked to have succumbed to a common ailment (one that, realistically speaking, all fish carry), the environment I had unwittingly set up, with the best of intentions, caused the condition to manifest and become fatal to them. The positive side, if there can truly be one, is that I didn't release these fish into the main tank, where the infection they carried would have spread to the other fish and started the whole process over again.

It's easy to start second-guessing yourself when something that has worked well for so long breaks down. When that happens, it seems like everything you do from that point on no longer yields success. It's enough to make one want to throw in the towel completely, but I also realize that systems, especially ecological ones, are complex. It takes time to reach a new equilibrium, and to get new communities to thrive again. There will invariably be missteps, second guessing, and a loss of overall confidence in one's efforts. It's especially frustrating when lives are on the line, even if they may just be "fish that only cost a few bucks". I feel it acutely every time an animal dies in my care, because it forces me to think and ask "what could I have done differently? How could I have been better prepared for this?" Additionally, having a space that I can use to isolate and care for individuals in a manner that will help them thrive, as well as firewall them from others should something go wrong when we bring them home, is now more important than ever. That first quarantine is vital. Making that environment as healthy as humanly possible is critical, and little things that we often don't think about can have huge repercussions. Here's hoping I've learned enough these past few weeks that future additions will be able to live and thrive, if not disease free, then with as little chance of those issues coming to the surface as possible.

Thursday, January 2, 2014

Talent Minus Hard Work Equals ???

During the week between Christmas and New Year's Day that I spent at home, I mostly disengaged from other things. I spent the time spending time with my family, doing little hops to places around the Bay Area with them, visiting with friends, and doing some much neglected and necessary work around my house. I also had a chance to see something up close that I hadn't paid as much attention to previously.

My elder daughter is an artist, i.e. she draws, paints, and is now learning to manipulate pixels, etc. She's been doing art in some way, shape or form since she was about 4 years old. At first I thought her drawings were cute, then spirited, then showing promise, then "oh my gosh, where in the world did this come from?". Now, granted, I am biased. I'm her Dad. Her work will probably always be amazing to me simply because she's my daughter, but she's asked me to be as objective as I can be, to tell her where she can improve and make her craft better. I told her that I'd be hard pressed to help her with the mechanics of her work, but I can offer a personal opinion as to how she draws and what I see.

While I was home this week, I stayed up a little later than normal, and spent time doing other things, to the point that I noticed a rhythm in my house I hadn't really been familiar with. As the lights would get turned off, one lone light in the upper part of the house stayed on, and on, and on. A few times, early in the morning, I would walk upstairs, and there I would see my daughter, working on something, sometimes at 4:00 a.m. or later. She'd been at it all night. I thought this was an occasional thing. Nope, more nights than not, this is how she works. What does she do? She posts on Instagram, and asks her followers who would like to be drawn. She collects the responses, and then picks at random from among them. From there, she goes into her "work mode", makes a sketch of the person based on a photograph or a description, and then posts it. Her Instagram account is filled with these pictures, and word of mouth is spreading her reputation far and wide. I laughed when I told her that her Instagram followers outnumber my Twitter and Facebook followers combined, as well they should.

My daughter reminded me of something very important with this. She has talent, no question, but her talent has been honed by years of putting in late nights and spending hours and hours each day drawing, painting, or sculpting. She understands the value of continuous practice, and she uses external motivators (other people and their requests) to try out new ideas and mediums. She's public about what she does, and she is gracious with the feedback she receives. What's more, she applies that feedback, puts new insights and ideas into practice, and enjoys both the process and the notoriety that come with it.

This dovetails nicely into my post from yesterday. Resolve only gets us so far. To really make headway on something, we need to commit to it, we need to do the work necessary to improve, and we need to court opportunities to get more practice and do better. Tying our reputations to the work we do, and making that work public, is something she and I both find valuable, though we realize that's not going to work for everyone. Regardless of the methods, talent is great, but it will only take you so far. Hard work, diligent practice, and engagement with others to the point of developing a reputation will trump natural talent if natural talent is all that's on display. To carry talent to its fullest, we've all got to practice, and practice regularly.

Wednesday, January 1, 2014

On Resolutions, Commitments and How Nature Abhors A Vacuum

First off, Happy New Year, everyone. 2014 is here, and with it, a lot of talk from people about the resolutions they are going to make. It may surprise you all to know that I don't make resolutions for the New Year, or for any year. My logic is that, if an idea, goal or commitment is worthwhile, you will start it when you start it, and will keep to it if it actually matters to you. If it doesn't, you won't keep to it. Really, that's OK. Most of the resolutions we make are poorly thought out, don't address what we really want or need to do, or don't come with realistic parameters for actually meeting them.

I'm much more persuaded, and have a much better success record, when I make commitments. The more specific the commitment, the better I will likely be about actually achieving it. As an example: saying I want to write more for my blog is a weak commitment. It's unbounded, unfocused, and it doesn't really meet any objective. Write more of what? For whom? For what purpose? Chances are, if I can't answer those questions, then I am far less likely to achieve anything that will have lasting results. Now, were I to say "I want to write more about technical testing, the aspects that make up how I can do that, and ways to interact with technical topics. I want to focus on network testing, open source tools, programming languages and automated processes. I want to incorporate those ideas into my everyday testing repertoire, and share the ideas that work, and those that don't work"... now I've got something I can hang some real work and energy on, something I can commit to, and perhaps others might be excited about that as well.

What's more, as I have seen over the past year, when I stopped producing podcasts as a regular thing, it's not like I suddenly had hours of "free time". Instead, other things climbed in to take center stage and demand attention. If I gave them attention, they became regular parts of my routine. If I didn't give them attention, I rarely got around to them later on, because the items that I had given attention to flourished and (surprise) demanded even more attention. It's with this in mind that I am going to start, here and now, with Noah Sussman's challenge to write a book on becoming a more technical tester. Well, Noah's actually writing the book. He's already made the Table of Contents. My challenge to myself is to apply the table of contents to my own reality, and to set up a case study that's relevant to my world. I intimated that I was going to do this last year when I finished up the "99 Ways" workshop series. The time to make good on it is now. Please feel free to follow along on the journey to see where it leads me (and perhaps you, too ;) ).