Monday, April 30, 2012

WOTA (Write Once, Test Anywhere)

In some ways, this has become the Holy Grail of my automated testing existence.

In the environment that I am currently testing, I have four distinct places where I can run our code: on a development system, on a demo machine, on a staging environment, and in production. I used to maintain four different test setups, and those four setups would drift and require constant tweaking. In an attempt to restore some sanity and reduce wasted effort, I decided that a better, more Don't Repeat Yourself (DRY) approach was to create one set of scripts, and then tune my environments and my test data so that those scripts could be run anywhere. This is the core of my approach of Write Once, Test Anywhere (WOTA).

Now, when I say "my approach", I certainly don't mean that I'm the first one to think of doing this, not by a long shot. I'm also having to make interesting tweaks to various config files so that I can effectively do this. Unlike the development team's tests, which really only have to focus on one environment to verify that their tests work (note, that's not a dig, it's a reality), mine have to work effectively on four different environments. This also helps in the sense that it allows me to see if Acceptance Tests, and the methods used to write those tests (and their surrounding features), actually carry through as we apply them to increasingly complex systems. As we get closer to production, we move away from running tests on a dedicated workstation against a dedicated machine, and instead start calling on multiple machines structured in a cluster, with caching, external and distributed databases, load balancing, etc. The tests themselves don't change, but often we see different behavior with the same tests depending on the system we are testing.
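To make this concrete, here's a minimal sketch (not my actual framework; the environment names, URLs, and the TEST_ENV variable are hypothetical placeholders) of the kind of config-driven selection that lets one suite run against any of the four environments without changing the scripts themselves:

# wota_config.py -- a minimal sketch of environment-driven test configuration.
# Everything here (environment names, URLs, the TEST_ENV variable) is a
# hypothetical placeholder, not the real setup described in this post.
import os

ENVIRONMENTS = {
    "dev":     {"base_url": "http://dev.example.local",    "verify_ssl": False},
    "demo":    {"base_url": "http://demo.example.com",     "verify_ssl": True},
    "staging": {"base_url": "https://staging.example.com", "verify_ssl": True},
    "prod":    {"base_url": "https://www.example.com",     "verify_ssl": True},
}

def load_environment():
    """Pick the target environment from TEST_ENV, defaulting to dev."""
    name = os.environ.get("TEST_ENV", "dev")
    if name not in ENVIRONMENTS:
        raise SystemExit(
            f"Unknown TEST_ENV '{name}'. Choose one of: {', '.join(ENVIRONMENTS)}"
        )
    return name, ENVIRONMENTS[name]

if __name__ == "__main__":
    name, settings = load_environment()
    print(f"Running the suite against '{name}' at {settings['base_url']}")

With something like this in place, the same scripts get run with TEST_ENV=staging or TEST_ENV=prod; only the lookup table knows the difference between environments, which is the whole point of WOTA.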

Unlike the unit tests development uses, my scripts minimize specific JavaScript calls made underneath the presentation layer. I use them if I must, but I try to avoid them so that I can focus on a more behavioral approach to testing, and on checking whether what I see matches what a customer would see as we walk up the chain of environments. Keeping the scenario pool relatively simple, reusing accounts where possible, and applying test data and persona criteria consistently on each system alerts me when things don't appear correctly or when something needs investigation on a particular system. The hope is that the tweaking necessary on production is minimal to non-existent; if I've done my job right, the tests really should just run on production with little in the way of hiccups.

If this piece is a blinding flash of the obvious, well, it took me a bit of time to figure out a good way to do this. If you are still maintaining a different test base for each environment, seriously, consider looking at ways you can implement WOTA in your tests.

Thursday, April 26, 2012

Weekend Testing on May 5th: Something Different

Most of the time, when I make announcements about Weekend Testing, I only do it the week of the event, with little lead-up time and not a lot of publicity. This is for a number of reasons, but the main one is to keep the group to a manageable size. Often, if we get more than 25 attendees, it's hard to communicate, and threads and participants get lost in the shuffle.

This time, I am making an exception, because we are doing something with a need for a little bit of prep and perhaps a bigger crowd to do it justice.

On Saturday, May 5th, 2012 at 10:00 a.m. Pacific, we will be testing some new functionality being rolled out on Wikipedia (yep, *that* Wikipedia). Cross-browser testing will be a big part of it, so in this case, the more the merrier really does apply. Also, we are in the process of creating a number of smaller charters so that several testers can branch out and try different things based on their interests. Those are being posted to the wiki (and will be shared as soon as Chris McMahon says it's OK to share them :) ).

For those not familiar with how Weekend Testing works, here's the refresher:

1. Add “weekendtestersamericas” to your Skype contacts if you haven’t already.


2. Fifteen minutes prior to the start of the session, please message “weekendtestersamericas” and ask to be added to the chat session. Once we see you, we will add you to the session.

For more details, contact WTAmericas@gmail.com. Hope to see you there!

Tuesday, April 24, 2012

Seeing Things The Way We Are: A "Recycled Scoutmaster's Minute"

I wrote this a few years back, around the time my son became actively involved in the Order of the Arrow, a service organization that I am also part of, with ties to Native American traditions as well as to Scouting ideals. Since I've been recalling some of my previous Scoutmaster Minutes, I thought this one would be good to include as well. I think some of the comments are also applicable to testers. See if you agree :).

Seeing Things the Way We Are (October, 2008) 

 This weekend, my son and I will be heading up to the Santa Cruz Mountains to participate in the Ordeal Weekend for our Order of the Arrow Lodge. It’s an event we have done together twice, and now this will be our third time together. Last year, he went through for himself to receive Ordeal membership. Last spring, he served as an Elangomat (meaning Friend or Guide) to others going through the Ordeal for the first time. While this was happening, I was in another part of the camp going through the process to receive the Vigil Honor (and no, I’ll not tell what that process is, if you want to know, join and get there yourself ;) ).

This weekend, Nick will be acting as an Elangomat again, and he has the chance to seal his membership in Order of the Arrow as a Brotherhood member. What's more, he will be an Elangomat for fellow members of his Troop, so this time, he will actually be leading his own friends through the process.

 So what does the title of this post have to do with the preceding paragraph? It always interests me that we have opportunities where we can get away from it all, think, ponder, pray, meditate, and learn a little bit more about ourselves and where we fit into the world. These actions allow us to open our eyes just a little bit more, and they let us see a little more clearly what we as people need to do.

It’s been a year since my son was elected to be an Ordeal candidate. In that time, he has learned a bit more about what it means to serve and be part of a bigger group, and to contribute to the success of that group. I’m proud of him and what he has been able to do in a short time. By contrast, my own involvement over the last year has changed somewhat with the receiving of the Vigil Honor. With it comes a greater expectation, and through that expectation, I’ve determined that I need to be more aware and alert to the things that I need to do and the example I need to set.

As an active member of my church, I am often instructed and counseled to read my scriptures daily. Oftentimes, I have had to ask myself “why am I being asked to rehash things I’ve already read a bunch of times before?” I’ve come to see that it is because there really is no one world, no one absolute reality, but billions of them, and each reality is informed by the viewpoint and the vision of the individual living that reality. Unlike a stone crag that juts out into the ocean, impervious to all that buffets it, human beings are really very small boats with very lightweight anchors. We get blown about all over the place. Likewise, we are also very swift and maneuverable; we can change course easily and move quickly and with great agility in even the most treacherous of areas. Thus the words that we are taught, and the counsel to always seek those words, are inspired precisely because we are such lightweight, agile, and easily blown-about vessels. We need to be reminded where our home port is, and we need to be reminded of what the windows on the bridge are supposed to see.

What's more, there is a lot of our world that is hidden from us because we are not ready, willing or able to see it for what it is. Those moments of clarity take time to develop, and they often come about because we've discovered that something we thought worked well for us really doesn't. If we do not question and ponder the things we read, but just accept them at face value, we also stagnate, and we may not even realize that the world has changed under our feet because we were too slow to observe it.

 This weekend, a new batch of boys will be coming up to find out a little bit more about who they are and how they see the world. Here’s hoping it will be a good experience for them, and for us as well.

Monday, April 23, 2012

The Pen Game: Software Testing Encapsulated


While I was at STAR East in Orlando last week, I had a chance to spend a fair amount of time with fellow testers outside of track sessions and after conference hours. In fact, I don't think I got to bed before 2:00 a.m. Eastern once the entire time I was there. With an admission like that, you'd be forgiven for thinking that this was a hard-partying bunch, but that isn't the case. In fact, much of our time was spent talking philosophy, discussing the variety of practices out there that could be considered "good", and improving the ways that we talk about testing. It was during one of these late-night conversations that I was introduced to "The Pen Test".

The Pen Test is a simple game, where a "presenter" or product owner holds up a pen and repeats some questions. There are only two possible answers: Yes or No. The "tester's" goal is to figure out what makes the answer Yes, and what makes the answer No.

If you think I'm going to give you the answer, you don't know me very well (LOL!). If you want to play the game, find someone who knows it and ask them to walk you through it. The game itself isn't the point of the post.

What I found interesting about the game was that it helped me explain software testing, and the actions we subconsciously perform, in a much easier way. We start out with a program (being presented the pen). We are shown behavior (the words and actions used to display the pen), and based on that behavior, we need to determine what "state" the program is in. Sometimes we can guess and do very well, but there's a danger when we guess and get it right many times. We create a mental model that may not be accurate, and when something goes wrong, we are at a loss as to why.

I had this experience when the game was presented to me. Not by skill, but by luck, I was able to guess the correct answer nine straight times. It was on the tenth try that I got it wrong, and I then had to regroup and figure out again what the criteria for Yes/No could be.

For this game to be successful, the testers need to weed out as many non-essential aspects of the game as they can. There are many aspects that we can see, and these aspects may or may not have any bearing on the trigger that makes the answer Yes or No. The effective tester uses every tool at their disposal to weed out as many of these options as possible. We create a hypothesis, and then we test that hypothesis. If it holds up, we continue pressing, but if it doesn't, we should discard the model we have created (or at least the assumptions that underlie it) and try something different. Often, through this process, we are able to notice subtle differences, or eliminate things entirely from consideration.
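To make that elimination loop concrete, here's a toy sketch in code. The candidate "rules" below are invented for illustration; they are not the actual secret of the game. Each observation either supports a hypothesis or knocks it out of the running:

# pen_game.py -- a toy model of hypothesis elimination, not the real game's answer.
# Each candidate rule maps an observation of the "presentation" to a predicted Yes/No.
candidate_rules = {
    "cap_on":      lambda obs: obs["cap_on"],           # hypothetical rule
    "right_hand":  lambda obs: obs["hand"] == "right",  # hypothetical rule
    "said_please": lambda obs: obs["said_please"],      # hypothetical rule
}

# Observations we made while playing, paired with the presenter's actual answer.
observations = [
    ({"cap_on": True,  "hand": "right", "said_please": True},  True),
    ({"cap_on": True,  "hand": "left",  "said_please": False}, False),
    ({"cap_on": False, "hand": "left",  "said_please": True},  True),
]

surviving = dict(candidate_rules)
for obs, answer in observations:
    # Discard any hypothesis whose prediction disagrees with what we actually saw.
    surviving = {name: rule for name, rule in surviving.items() if rule(obs) == answer}

print("Hypotheses still standing:", list(surviving) or "none -- time to build a new model")

The interesting part isn't the code; it's that each new observation is an experiment, and a surviving hypothesis is only "not yet disproven", which is exactly the trap I fell into after nine lucky guesses.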

During the game, the product owner is able to answer any of our questions, provided the question isn't "write down what you say" or "tell me the answer". This is much like black-box testing: we have to determine the answers based on the behavior of the application. Through this process, we try out a number of different heuristics. They may work, they may not, but each time we hit a dead end, we have another variable we can remove, and that, over time, gets us closer to a solution.

I had several conversations with people over the course of the week and tried this out with a number of different people. The ability to have a conversation about what worked and what didn't helped greatly in explaining the way that we make assumptions, try out models and test how effective they are, discard theories that don't work, and keep honing our process until we nail down the item(s) that actually make for a Yes/No situation. The challenge, of course, is that once a game gets well known, it loses its effectiveness, because we focus in on the answer. This game, at least for right now, requires a lot of questions and inquiry to answer, so I think it's worth using as an exercise for the time being. If it gets too well known, I'll look for others. That's one thing I know I never have to worry about; there are a lot of games that can be applied to software testing, so many that any given tester is unlikely to have seen or worked through all of them.

Friday, April 20, 2012

STAREAST Day 2 Recap

I have to admit, it's a challenge to separate the fact that I am on East Coast time from the reality of waking up at 7:00 AM each morning, which is actually 4:00 AM where I'm from. I do that from time to time, but not with such late evenings. Still, it's been a great deal of fun to hang out with such a great group of people and to have such wonderful, serendipitous moments.

The day started out with breakfast and a keynote address from Dorothy Graham of Grove Consulting about "What Managers Think They Know about Test Automation–But Don't". Dorothy pointed out that management often expects the following when they talk about test automation. They believe:
  • automation can test everything
  • automation will find all the bugs
  • with automation we will get test coverage of 100%
  • more automation means more confidence
  • tests can run overnight and on weekends
  • automation will prevent human error
The fact is, there's a lot more that goes on in these areas than what management hopes will happen. Yes, automation can save time, but it will only save time after the automation is developed, and it won't save time if developing the automation takes close to the amount of time it would take to test things manually. Trust me on this: it takes time to automate tests well. The architecture of well-running automated tests is key, and making that architecture takes time. Poor architecture often leads to poor automated testing, and frequently to abandoned automated testing. The relationship with the development team is also critical. If we work with them, we can develop solid tests that will be useful for the long term. If we develop our tests in isolation, we may miss a lot of things, but more importantly, we may find we are working at cross purposes and not realize it.


James Bach gave an excellent talk on "The Dirty Secret of Formal Testing". What made this talk interesting was that he was describing testing in highly regulated markets, dealing with a product that was literally life or death (a medical device to help aid patients with heart problems). These environments are extremely formal, but the dirty secret of these environments is that informal testing goes on all the time. It has to; otherwise there's no way to learn what the product actually does. As described by James, formal testing is any testing that must be done a specific way or must check specific facts. Requirements, as described by James (and I like this description), are ideas at the intersection between what we want, what we can have, and how we MIGHT decide that we got it. In complex and life-or-death systems, we have to test to find out if that has happened, and we have to test first to find out whether we actually accomplished those goals. Thus, contrary to popular belief and conventional wisdom, good formal testing MUST first begin with informal testing.

In addition to the variety of sessions, I have to point out that many times the best discussions and best ideas happen between the sessions, or when we decide to engage in conversations with a single person or a handful of people, effectively creating our own session on the spot. I confess, I had a hidden motive in coming to STAREAST beyond giving my talk and learning some cool new ideas to bring home. My goal was to see what I could learn from many different people about how to teach software testing, and make it fun, for 16-24 year olds. More to the point, I needed to recruit helpers willing to do the work, and I was not disappointed. Instead of me having to look for people to discuss this idea with, word preceded me, and I had people coming up to me to ask if they could take part. Based on the interactions I had with many people this week, I got what I came for and plenty more. I had some terrific discussions with people from many industries, as well as educators, who have helped me narrow down and distill some of the ideas I have, and it's given me many new directions to consider. They may not all be possible, but I'm looking forward to trying them out.

STAREAST Virtual was a fun and interesting way to communicate with the participants who were not physically at the conference, and it was fun to have Matt Barcomb join me in a conversation about Testing Roles and Approaches to Paired Development and Testing. In addition, we discussed many aspects of collaboration and ways to break down the unintentional walls that we build, so that we can set up our environments and approaches to become more effective.

The main conference came to a close with a number of awards, including awards for those who found interesting bugs in the Test Lab, and something that made me smile greatly. It was announced to everyone just before the closing keynote that my paper and presentation were chosen as Best Paper for the conference :). For someone who gave his first official full hour-long track talk at any conference just yesterday, that was a wonderful thing to see happen (I rank it highly among the "firsts" of my career :) ). I appreciate the award and those who felt I deserved it. Thank you very much!

The final keynote of the conference was given by Theresa Lanowitz of voke, inc., and it discussed "Testing Trends: Cloud, Virtualization, and Mobility". It gave a forecast of what we might see in the next two to three years, and I must say, things look exciting for testers, regardless of what the "test is dead" people are saying :). Numerous examples of performance problems, unintended consequences, security breaches and other integration issues were explored, and they all pointed to the same thing: testing needs to be performed, and it needs to be performed at a level that lets people make solid, intelligent decisions about the ramifications and consequences. The simple fact is that there is no end of authentic problems to be tackled, and the scope and the landscape are expanding. There will come a time when we will need more testers, not fewer. Get ready, folks :).

My thanks to so many people for the time they took to get to know me, speak with me, hang out in the hotel lounge or go out to dinner with me and talk about ideas and concepts that we all deal with and the solutions we hope to achieve. My thanks to Scott Barber, Claire Moss, David Gilbert, Matt Barcomb, Lee Copeland, James and Jon Bach, Lanette Creamer, Rachelle Sawal, Janet Gregory, Allison Wade, Randy Rice, Mirkaya Capellan, James Lyndsay, Michael Bolton, Zeger Van Hese, Griffin Jones, all the attendees of my talk about Weekend Testing, and everyone else who helped make this a memorable week.

'Til we meet again :).

Thursday, April 19, 2012

Do Or Do Not - A "Recycled Scoutmaster's Minute"

This will be my last of these for a while. With this, I have exhausted the ones that I wrote and posted for the blog. While I will definitely revisit this idea again later, they will not be "recycled Scoutmaster's Minutes", but up-to-date ones, I hope. Again, this one came out of my pondering some things I was reading by Larry Winget (I went on a Winget tear in 2009 and read five of his books in short order).


I think it's important to realize that, when we talk to scouts (or testers), these are kids who can generally take some "tough love", and actually thrive on it. That was an important lesson to learn. Set high expectations, and you should not be surprised when people meet those expectations. Set low expectations, and the same thing applies :).

DO OR DO NOT!


While I was reading Larry Winget's book "It's Called Work for a Reason", I found a statement that I felt was very applicable and something common enough that everyone would be familiar with it.

"WHEN SOMEONE SAYS THEY WILL TRY, BET YOUR MONEY IT WON'T HAPPEN"

Whenever someone tells you that they will try something, they have given themselves an out. If they don't do what they said they would do, then they can always come back and say "oh well, at least I tried".

Personally, I would encourage every boy to purge the word "try" from their vocabulary. Try does not have the sense of commitment that the word "do" has. If I were to say to a Scout "I will try to hold a board of review for you after you have finished all of your requirements" they would be rightly upset with me if I did not follow through, but still, I can always fall back on those tired words "well, I tried" (notice how similar the words “Tired” and “Tried” are? I don’t think that’s an accident :) ).

There is a scene in the Star Wars film "The Empire Strikes Back" that has always resonated with me. It's where the character of Yoda is speaking with Luke Skywalker and asks Luke to do a particularly hard task. When Luke answers that he will try to do it, Yoda's answer is direct... "DO, or DO NOT... there is no TRY!"

The reason this word is so popular is that we as people want to shield ourselves from failure. If we say that we will try to do something, and it doesn't happen, we didn't really fail, right? We just tried it and decided it wasn't for us. It's comforting to say it that way, but often it rings hollow. It is better to say "I intended to do this, but I found that I ran out of time and could not do it" or "I did the thing I was asked to do, and discovered that I was not prepared or not physically able to accomplish the goal". There's a lot more meat to those words, isn't there? When we do something, there is no guarantee of success. We may succeed, we may fail, but no matter what, we DO SOMETHING ABOUT THE SITUATION. If we succeed, then we enjoy the accomplishment. If we fail, we learn why we failed, adjust our approach towards the goal, and do it again. It's my belief that those who keep doing, even when they fail, will eventually succeed... or they will determine that the task at hand is one they no longer wish to do, and they will set it aside. Either way, they have DONE something about it!

So boys, when you are asked by life to accomplish goals and to do the work that you need to do, commit to DO IT, or to NOT DO IT. Either is fine, but no more trying. History honors the doers of the world. Be a doer :).

STAR East Day 1 Recap


Yesterday was the first official day of the STAREAST Conference (not counting training and tutorial days). I would have had this posted earlier, but I spent an entirely too late night with a bunch of madcap testers... eh, what can you do ;)?

We started out the day with breakfast, getting registered, and me getting my name tag with a beautiful little attachment. It felt really good to see the tag that says "Speaker" on the bottom :).

First off, right out of the gate, we heard from Keith Klain of Barclays about how he has been developing the Global Test Center for Barclays Capital, and the areas they decided to work on to grow their testers. I found this very encouraging, in that a large corporation was actually looking at getting rid of metrics, bug counts and numbers that would seem to be logical, but turn out not to be effective. The talk focused on "Leading Cultural Change in a Community of Testers", and it made the point that, to make such a change, there had to be a large-scale effort happening at the top level of the organization. One of the quotes that led off this discussion was from Dwight D. Eisenhower: "Leadership is the art of getting someone else to do something you want done - because he wants to do it."

They focused on developing honesty, integrity and accountability in their testers. In return, they opted to jettison needless Test Maturity Models, Metrics Programs, and arbitrary "Career Development" approaches that are based on numbers, but not on people. Some of the principles they espouse are:

• People start to ignore testing when it is no longer relevant (in short, make sure the work you are doing is relevant) 
• Being responsible sometimes means rocking the boat (you have to drive the change, and sometimes that change will upset people)
• No one has the market cornered on good ideas (you'd be surprised who can come up with an amazing insight or twenty ;) )
• Never stop asking why – question everything (making assumptions or accepting things too readily can lead to blind spots)
• Invest 80% of your energy in your top 20%... (daring, but it makes sense; the people who perform are going to be the ones open to learning and growing)
• Leadership = Simplification (actually, leadership often means clearing obstacles and then getting out of the way)
• Don’t take it personally (you may be right, you may be wrong, you may be overruled, just remember it's not about you, it's about the product, services and customers)
• Think first – then do! 

I liked the fact that Keith included the Start - Stop - Continue technique and what they planned to do in each category (it warmed my Wood Badge heart to see that ;) ):

Stop 
Thinking that the value of the test team is in anyone else's hands and pretending “maturity” driven test metrics will make improvements

Start 
Telling the team exactly what's expected of them, supported by systematic training of testing skills, test reporting and business alignment

Continue
Driving out fear of failure by creating an environment that enables innovation and rewards collaboration through strategic objectives and constant feedback

Overall, this was a nice rallying cry to the troops. It was nice to see a senior member of the team focusing on testers and testing, and pointing out that it's not just that we test, but that we develop meaningful skills and add meaningful value to the team.

Michael Bolton followed up with the second keynote of the day, titled "Evaluating Testing: The Qualitative Way". This was an interesting talk comparing the physical sciences to the social sciences, and the fact that quantitative examination with metrics and numbers makes sense when we are dealing with things like machining or physics, or other physical properties. Software isn't really a physical science; it's actually more of a social science, because we deal with the interactions and the feelings of people to inform our tests. Michael quoted a talk by Dr. Cem Kaner in which he stated that oftentimes we don't get complete and fully fleshed-out answers, but that we often get "partial answers that might be useful". Thus, we need to use different tools to evaluate software testing, or at least we need to focus on different ways of performing our testing that allow for this social-sciences approach.

The key idea behind qualitative analysis is that qualitative approaches are based on observation, making distinctions of categorization and classification as well as description and narration. By contrast, quantitative approaches tend to assume that categorizations are accurate, and largely ignore associations with the object of observation. The key takeaway from this talk was that we need to add the human element to our testing, and to do that well, we need to step away from the numbers and the left-brain approaches, and instead focus on what real people actually want and do (note: they are not always what we think they are). Great stuff from Michael as always (I've heard him do keynotes twice, and so far, I think he's two for two :) ).

Janet Gregory did a track session on Agile Testing practices, and since I had a chance to interact with Janet at the POST peer conference in Calgary, Alberta back in March, I was curious to hear a longer version of her ideas and approaches related to Agile Testing. One of the key points Janet wanted to make was that, unlike in traditional development organizations, where roles are separate and unique, in Agile teams roles are inter-related and overlap. It's not uncommon for the Programmer, Domain Expert and Tester roles to overlap, and in many cases, to reside in the same person. It's not necessary that every individual be able to cover all of those areas, as long as the organization as a whole can cover them with that level of overlap.

What this does tell us is that we are reaching a point of convergence for many of the people on these teams, so that a Tester is not just a tester. Key takeaways from this session were on how to implement Acceptance Test Driven Development and Behavior Driven Development: put your tests first and then code, so you know what's being covered. In addition, testers are NOT responsible for quality (the whole team is), programmers do not code alone (everyone helps them understand what to code), and the team needs the "right" roles and people (meaning we as testers may need to up our game to take part in these areas).

The next session was a talk about Weekend Testing, and it was delivered by, well, me :). I have written extensively about this topic, so I won't recap the entire talk, but I will point out that the goal of this session was to encourage people to learn more about Weekend Testing and, in addition, learn how to bring the approach we use back to their own test teams. By all accounts, I think it went really well. One funny comment: apparently, because of my energy level, people thought I was a consultant selling my services. They missed the several times I mentioned that Weekend Testing is a free endeavor. They kept waiting for me to say how much I charged, and still didn't quite believe me when I finished the talk and didn't say anything about charging for my services. Maybe giving away the SideReel swag was to blame, I don't know.


I closed out the day by conducting an interview with Lanette Creamer for the STAREAST Virtual conference. I don't know if that interview was just "live" and sent out over the ether, or if it was recorded and will be available later. Once I know, I'll let you know.

The official day closed out with the ever popular "Lightning Strikes the Keynote" which is where 8-9 participants present their best ideas or most compelling content, one after another, and are only allowed 5 minutes each. It certainly kept the energy level high to do that, and made for an enjoyable series of talks. 

While the sessions are definitely valuable, I have to say that I have found the best interactions come from just talking with the other participants outside of sessions. Hey, imagine that, going to a conference to Confer, what a concept :). I've been greatly enjoying my discussions with Matt Barcomb of Lean Dog Consulting. We've been able to discuss a lot of the challenges that lone testers face, and what we can do to better interact with our teams. Oh yes, there will be blog posts on this forthcoming :).


Testers and tester games go together like, well, pick your favorite comparison, but it's inevitable. I had the chance to work with James Bach, Griffin Jones and Michael Bolton on a few different challenges. Some were frustrating, each made me question how I approached things, and I was quite happy when I figured them out (and walked a couple of other testers through the exercises and saw their different approaches). In the end, I went to bed "way too late" but fully energized from the interactions. Now it's time to see what Day 2 has in store. See you soon!

Wednesday, April 18, 2012

Never Tolerate Mediocrity - A "Recycled Scoutmaster's Minute"

Larry Winget was a common read for me in 2009. Because of that, Larry filtered into a lot of my Scoutmaster's Minutes. I think it's important for scouts (and frankly, everyone) to realize that the only way to "be great" at something is to strive to actually be great. That's not conceit, it's truth. The person who stops us from being great at something, more often than anyone else, is "us". We are often our own worst enemies, and if we talk ourselves out of being awesome, well, we'll get what we settle for. It was good to see this one again, and I think it has just as much relevance today.


Never Tolerate Mediocrity

There are many phrases that we will hear throughout our lives, and many of them will be strong reminders to us as to what we can be. While I was reading Larry Winget's "It's Called Work for a Reason", I found that there was a central tenet that he would mention over and over again, and I want to share that tenet with my scouts today:

"NEVER TOLERATE MEDIOCRITY"

What is mediocrity? It is the quality of being ordinary as a consequence of being average and not outstanding. Now, does that sound like a cruel thing to say?

"Come on, Scoutmaster Mike, not everyone can be a super-star!"

Actually, that's both true and untrue at the same time. Not everyone can be the best in the world at any given thing, that's true. That's the whole reason for competitions to determine who is the best. In those arenas, there are a lot more people who don't stand on the winner's podium, but that's not what I mean. Instead, I am referring to people who choose to accept a lot in life that is less than what is theirs to earn and have. I think it's fair to say that none of us want to reach the end of our life and have it said "there was nothing this person did that would in any way distinguish them from all of the other people out there". I certainly don't want to live my life that way.

So how do we live lives that are not mediocre?

First, we dream big. Then, we set our sights on achieving those dreams. Remember, dreams are just dreams unless we actually put into action the work that is required to make those dreams happen. From there, make a commitment to yourself that you will only accept the best that you can do. Will your best be "THE BEST"?! Perhaps it will be. Often times, it won't be, but commit to doing it anyway!

Second, do not be afraid to fail. The only way we really know that we are learning and progressing is when we find that things that we try do not work, and then work those lessons into our plan and keep pressing forward.

Finally, make a commitment to never stop striving to be an example of THE BEST that YOU can be. Is it reasonable that a student should get a 4.0 every time? For some, that answer is yes. For others, it may not be realistic. Still, do all you can to strive for that number if it's important to you. If you give it your all and you come up with a 3.5, don't feel like you are a failure. Instead, celebrate the fact that you did YOUR best. If after doing so, you think you can do even better, go for it and try. It's quite possible you just might.

At the end of the day, our dreams are only measured by what we actually do to achieve them. If you commit to living your dreams, and actually working to fulfill them, do not be surprised if you lead a remarkable life, one that has no resemblance to mediocrity.

Tuesday, April 17, 2012

Book Review: Practical Malware Analysis: The Hands-On Guide to Dissecting Malicious Software

This is a topic that has greatly interested me, but from the perspective of a tester. On one side, I think the ability to reverse engineer malware is fascinating, but more to the point, what I really want to be able to do is see how the tools described can actually be used to augment security testing.

Malware has become one of those topics we often wring our hands about: we know it's a threat and we want to comprehend it better, but do we dare open ourselves up to the potential of doing something wrong and unleashing unintended havoc on our machines or networks? Fortunately, Michael Sikorski & Andrew Honig's book "Practical Malware Analysis" helps to de-mystify this type of work, and makes it understandable from a variety of perspectives. If you are a programmer, this will be very handy. Even if you aren't, there are a lot of good ideas and techniques in this book that you can use.

Practical Malware Analysis is structured with regular chapters describing the concepts, and each chapter ends with a series of labs. The answers to these labs take up nearly a third of the book. They consist of short answers to the specific questions as well as longer-form answers that go into great detail describing the steps and the methods used to test the files and provide analysis of what was found.

Part 1 starts out by explaining what malware is and how developers and testers can get into the files and poke around using some basic and freely available tools. This first part of the book focuses on performing static analysis of files: looking inside them to understand what might be hiding there, along with ways to read the headers, strings and data hidden in the files. To get beyond static analysis, readers are guided through setting up a virtual machine environment using VMware. Dynamic analysis is what happens when a piece of malware is actually run, once the determination has been made that static analysis just won't cut it any longer. From here, we need to actually start poking at the malware and see what it will do in real time (yep, in that safe VM, using what's referred to as a "malware sandbox" consisting of a variety of free tools).
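To give a flavor of what that first static pass looks like, here's a small sketch (my own, not one of the book's labs; the file name is a placeholder) that pulls printable ASCII strings out of a suspect file, roughly what the classic 'strings' utility does and the kind of quick look the early chapters describe before any disassembly happens:

# extract_strings.py -- a minimal static-analysis sketch: dump printable ASCII
# strings from a binary, much like the classic 'strings' utility.
# 'suspect_sample.bin' is a placeholder; point this at a sample inside your
# walled-off, sandboxed VM, never at files on your day-to-day machine.
import re
import sys

MIN_LENGTH = 5  # ignore runs shorter than this; they are mostly noise

def printable_strings(path, min_length=MIN_LENGTH):
    with open(path, "rb") as handle:
        data = handle.read()
    # Find runs of printable ASCII characters at least min_length long.
    pattern = rb"[\x20-\x7e]{%d,}" % min_length
    return [match.decode("ascii") for match in re.findall(pattern, data)]

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "suspect_sample.bin"
    for text in printable_strings(target):
        print(text)

Even this crude pass can surface URLs, registry keys, and DLL names that tell you where to dig next; the book's tooling goes far deeper, but the mindset is the same.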

Part 2 of the book goes into greater detail regarding more advanced static analysis techniques, including disassembly (specifically disassembly on x86 architecture processors), reverse engineering using tools such as IDA Pro (it gets its own chapter and shows up in the later labs), examining how to get a high-level view of the code using code constructs, and how to use the tools to recognize those code constructs in assembly language. Since Windows is the target of choice for most malware, Chapter 7 goes into specifics about how to recognize and work with the Windows API and use the registry editor to analyze registry code. Processes, threads, and network functionality are also primary considerations.

Part 3 brings us into the realm of more advanced dynamic analysis. Starting with debugging at both the source and assembly levels, we see how to step through programs a piece at a time so we can see what is happening or change the way the code executes. Dedicated chapters on OllyDbg (a user-level debugger) and WinDbg (a kernel-level debugger) show how to do these steps with those specific tools.

Part 4 focuses on the ways that malware behaves and what sets a program apart as being malware. It examines a variety of backdoors into the system, the ways a program can steal credentials, and how it covers its tracks so as to not be easily detected. Chapter 12 focuses on how malware starts itself and the variety of methods it uses, including process replacement and DLL manipulation, so that it looks like any other innocuous running process. Chapter 13 goes into a variety of methods of encoding and decryption, while Chapter 14 covers ways in which the network is used to take advantage of exploits.

Part 5 covers anti-reverse-engineering, and goes into the lengths malware developers go to to prevent their programs from being found or cracked. Obscuring flow control, confusing disassemblers, preventing debugging, using virtual machine detection to change behavior or not run at all, and compressing or encrypting files so that they stay packed away until a given time or condition are all part of the toolkit the malware developer uses to throw the malware examiner/tester off their tracks. We have to realize that while we are trying to outsmart the malware developers, they are trying to outsmart us as well.

Part 6 is a grab bag of special topics, including shellcode analysis, special considerations for C++ code and how it differs from C, and some of the special challenges that 64-bit processors and code bases face and will face in the future.

The book rounds out with three appendices: one describing important Windows functions, a grab bag of tools that can be used for malware analysis, and the biggest single section of the book, the answers and analysis for all of the labs provided.

This is a big book, and it covers a lot of ground. It is, of course, geared towards software developers, but there's a tremendous wealth of information for the software tester, too. While we may not spend as much time decompiling or reverse engineering code, being able to see the techniques that malware developers use, and the techniques we can use to ferret them out and stop them, is enlightening. This book focuses on Windows and Windows exploits, and it's meant to be an introductory primer. It's not the be-all and end-all of malware analysis, but for many of us who are not specifically trying to be security experts, it's a heck of an introduction.

More than being a good introduction, this is a "get your hands dirty, and do so without putting yourself or your neighbors in peril" book. This is a dicey topic, and it's one where, in the wrong hands or handled carelessly, you can do a lot of damage to yourself and to others, so take the time to set up, wall off, and sandbox as much as you can. Proceed with caution, but if you do proceed, you couldn't ask for a better tour guide.

What Can A Monkey Braid Teach Us? A "Recycled Scoutmaster's Minute"


This is another of the "recycled Scoutmaster Minutes" from my never-quite-launched Scouting blog. Again, I want to capture them and perhaps give them a second life, since the things that motivate kids quite often motivate adults too, though we might not be as willing to admit it ;).

What Can A Monkey Braid Teach Us?

First, a "Monkey Braid" is known by a number of different names, such as monkey chain, single trumpet braid, single bugle braid, chain braid, chain stitch, crochet stitch, etc. The name you use depends on who you talk to and where you apply it. In the Scouting world, it's a way to take a piece of rope and shorten it without actually cutting the rope, yet still use it without a lot of rope just lying around.

The first time you try to visualize or attempt to make a Monkey Braid, you may get frustrated. It's a hard "knot" to explain, other than to say that you make an overhand loop, form a noose and tuck it through the loop, and then make another loop and tuck it through the previous loop. Keep repeating until you get to the desired length, and then tie off with a final overhand knot. If you need a little more length, just undo the end and pull to get the length you need, then loop it off again (see, I told you it could be confusing).

When we make our first Monkey Braid, we may have just one use for it. My first use was to shorten the headphone cable that I use with my portable MP3 player when I exercise; I was tired of it getting hung up on things, but didn't want to have a big tangle or ball of wire tied up. The braid took care of that. What I discovered after I made my first Monkey Braid was that it could be used in many other areas. I work in an environment that deals with a lot of cables (I'm a Software Tester in my work life). The Monkey Braid is a great way of dealing with strewn cables and getting them to be gathered up without tangling. So I started making Monkey Braids with a number of things (extension cords, network cables, cell phone charger, DS charger, etc.). Because I started to see other areas where this skill was useful, I used it more often. Now making a Monkey Braid is second nature to me, and I use it to do a lot of different things.

In Scouting, we are asked many times to learn certain skills. Some of them seem to be very specific, and we wonder "Why am I learning how to do this? Where is this ever going to be important in my life?" The answer is, in lots of ways, and in ways you may never expect. While Scouting focuses on advancement and earning badges, those badges really are just symbols. The real award is the skills that you learn, practice and master, because those skills will then be something you can call on again and again. Would my life have been radically altered if I hadn't learned how to make a Monkey Braid? Probably not, but now that I know how to do it, I can see many places where I can use it, and it makes things easier to handle and deal with (especially ropes, cables and extension cords).

The point is, the Scouting skills we work on have a deeper meaning than just learning them for the moment, or learning something so that you can pass a test for a particular badge. Look at what you learn and what you practice, and then see in your life where you can apply those skills. The number of opportunities to use your Scouting skills may surprise you.

Monday, April 16, 2012

The Truth Will Set You Free? A "Recycled Scoutmaster's Minute"

I tried an experiment some years ago in making a blog that was going to be dedicated to my Scouting endeavors. Like so many other things, when I decided to put my focus on doing TESTHEAD, I decided to mothball that particular endeavor. I haven't looked at some of these things in years, but this week, as I was doing some "spring cleaning" on my blogs, I decided to look at that one again. I figured I'd rather not let these bits languish, and hey, who knows, they may well have a second life in them. So over the course of the next few days, I'm going to present some of my old "Scoutmaster Minutes". These are things I would talk to my troop about after we finished our meetings, and they come from a number of different sources. Also, since the SummerQAmp initiative is now taking up a fair amount of my time, and the ages of the interns kinda fit into the later Scouting and Venturing years, I figured they might be worthwhile to revisit for that purpose.

The Truth Will Set You Free?

Have you ever heard the phrase "The Truth Will Set You Free"? It's a good phrase, and it's an accurate one, but I'm going to add a little bit to that today. This quote has been attributed both to Werner Erhard and to Gloria Steinem, but in either case, the meaning is the same (modified from its original and paraphrased for the group :) ):

"The Truth Will Set You Free, But First It Will [Tick] You Off!"

Why is it important that I share this modified quote with everyone?! Because, again, it is accurate! We say that we want the truth, and we want to be told the truth about things, but there are times when we don't like to see or hear the truth, because it means we have to do something we don't like, or we have to face up to a fact that we don't appreciate, or in many cases (and I'm an authority on this one at times) we have to own up to the phrase "I am an idiot" or perhaps more appropriately "I have been behaving like an idiot!"

To be fair, everyone has moments where they have to face an uncomfortable fact, or own up to a truth that they don't like, because it forces them to see themselves in a light that's not always flattering. We as individuals have the power to get where we want to go and succeed at what we want to do. We get help along the way through friends, family, church leaders, and our Heavenly Father through prayer, but ultimately, none of those influences will get us where we want to be unless WE get us where we want to be. Whenever any of us thinks "Oh, I wish I could do this" or "Oh, I'm not able to do that" or "Oh, if only things were different"...

STOP!!!

... and ask yourself "Is that true?" If so, why is it true? 99 times out of 100 the answer is the person in the mirror. The truth is that we are often READY and ABLE to do great things, but at the end of the day, it's the WILLING that's not there. When we take the willing and put it into ACTION, then honestly, anything is possible!

So it's OK to face the truth about things, and it's OK to get angry about what you see, because it means we are ready to confront the truth and deal with it squarely. And then we are truly free, because we align ourselves and our actions with the truth. I encourage everyone, youth and adults alike... face your truth, get mad about it, decide to do something about it, DO SOMETHING ABOUT IT, and become free :).


Attribution: This minute was inspired by Larry Winget's book "Shut Up, Stop Whining and Get a Life"

STAR East: Here I Come!!!

My apologies for being so quiet the past week, but since I am going to be spending the next four days either in the air or at STAR East in Orlando, Florida, I had to focus on some other things. I did get to discover something really cool, though. Early last week, I received a check in the mail. When I saw that it was from Software Quality Engineering, I was curious to see what it was for. The note on the check said "Best Paper", which I interpret to mean that the talk and paper I will be presenting received a Best Paper award. I don't know if it is a Best Paper or the Best Paper, but either way, I'm excited to be picked for such a thing; it's another first for me :).

For those who will be at STAR East this week, I will be giving my presentation about "Delivering Quality One Weekend At a Time". If this title sounds familiar, there's a reason: it was the paper I originally wrote for the Pacific Northwest Software Quality Conference in 2011, but I had to bow out of attending because of my broken leg and inability to walk well at the time. A co-presenter at PNSQC felt bad that I couldn't present my paper, contacted the folks at SQE, and recommended they consider my paper as a talk. From there, I got the chance to have it submitted again and accepted, and now it's been voted a Best Paper. This tells me that I really have to deliver on this talk, and now we are down to just a little over 48 hours until I get to deliver it :).

I'll be posting some details from the conference, but I won't be able to do the extended "live blog" format I'm used to (they are not providing WiFi for the conference, and I'm one of those cheapos who hasn't jumped on having an iPhone or Android yet ;) ), so my status reports will likely be posted nightly. As for now, it's one more mad-dash day to get as much done on the home front as I can before I leave. Hello, I must be going (LOL!).

Monday, April 9, 2012

Army of One: The Moment You Know You're Wrong

I had an interesting experience last week. As I was going through what was effectively a monster of a change related to our product, the product owner and I almost got into a shouting match. It wasn't really a shouting match, but there was a ratcheting up of mutual frustration. We felt like we were talking past each other. As I was trying to show what I was doing, the answer back was "no, that's not what I mean".


As I kept trying to clarify the information I was receiving, I felt my frustration level rise, and I started asking "is there something I'm missing here? Can you tell me where I'm mistaken?" I said this with a bit more of a bitter edge than I intended, but that's how it came out. I was frustrated, the product owner wasn't making sense, and I was doing what was asked... until I heard one little statement. "Go to your profile page, now do you see the two icons next to the user name? That..." and that's where it all trails off, because in that moment, it clicked. All of the confusion melted away. I understood what was being said. What's more, I understood that my design and my approach were wrong. I was so caught up in the thick of the changes and the interface that used them that I had completely forgotten about the legacy manner in which we did the same procedures.

Lone testers and the Army of One have to shuffle a lot of things. We have to juggle a lot of balls at the same time. To that end, communication is vital. When we fail to communicate, or when we get too caught up in the immediacy of our testing, we run the risk of becoming myopic and losing sight of the bigger picture. The good news is that this was relatively easy to resolve technically. Interpersonally, it required me to step out of the office, get some food, and then politely apologize to the product owner for getting out of hand and not being able to see what they wanted me to see. We both realized that we were talking past each other, so it was a situation we were able to resolve quickly, have a laugh about, and get beyond.

Still, it reminded me that, when I'm on my own and doing the testing on my own, there isn't another tester to confer with and make sure I'm not being dense. I have to watch out for those moments. If I'm alert and ready, they can be handled quickly and carefully. When we don't prepare for them, well, a chat over a plate of humble pie might be in order.

Tuesday, April 3, 2012

You Just Might Be... A Letter to My Inner Teenage Tester


Wow, the response to the SummerQAmp announcement has been awesome! Be warned, if you have said "count me in", I saw that, and I'm going to do so starting... right now :).

We have to start at the very beginning for some of these interns. Many of them may have no idea why they would want to be testers, or what the benefits would or could be. Thus, I'm starting with a letter to a teenaged tester who could be a lot like me, or not, but yeah, most likely, he's me:


Hey, Dude!!!

I know that most of your time right now is spent realizing that you are semi-photogenic, have a decent singing voice, and really like taking that Fender Jazzmaster down to bare metal and building it back up. I also know that you like tinkering around on that beige putty Macintosh that your parents set up in your brother's room, because they can actually get into it (unlike your room), but seriously, I want to talk to you about something. At some point, you are going to put all your energies into being a musician. It's going to be a fun ride, it's going to impoverish you, and at some point, you are going to have to get down to brass tacks and dig your way out of a financial hole. When that day comes, you are going to find yourself surrounded by a strange breed of people, a wonderful breed. They are inquisitive, adventurous, creative, and they do some pretty cool things with other people's software. These people are called "testers", and believe it or not, you have a lot of the attributes they'd be looking for.


First of all, let's take a look at that guitar that you have taken down to its bits countless times. Why do you do it? I'm guessing it's because you want to see what makes it tick, and what changes and improvements you can make. Maybe it's just a cleaning, or removing scratchiness from the potentiometers, but the fact that you thought to break it down and look already points to you having the temperament of a tester.


Check out your reading material. You like to nerd out on unusual topics. You get a kick out of history. You enjoy philosophy. You actually own a book called "Words from the Myths" because you had to dig in and understand why words like "Hubris", "Nemesis", "Chaos", "Cosmos" and "Sisyphean" mean what they do. You aren't really content with people telling you something is true; you need to find out why it's true, or at least how it came to be accepted as true. There's an actual branch of study called "Epistemology", which is the study of knowledge: what we know, and why we know what we know. For some odd reason, you are into that. This is another good pointer that you might be a tester in the making.


People are often defined by who they hang out with. You are hanging out with a bunch of musicians right now, and while you may think that that's all you are, you're wrong. Many of you are drawn together because you share similar outlooks and ways of looking at the world. Among you are a CAD designer, an inveterate tinkerer and a rock-solid network administrator, and that's just counting the guys in your first band. You don't realize that these are your future bandmates' careers, but if you were to look at them for any length of time, it's self-explanatory. But there's something that sets you a little bit apart. You like to argue about why things work or don't work. You have a passion for doing things right. Not that they don't, but you come at it from a different angle. While they are focused on the actual building of things, you tend to look for the structural weaknesses and what could go wrong. You're not a pessimist... you're a tester :).


Can you count the hours that you and your bandmates spent looking for the optimal way to load and unload your gear, wire up your racks, maximize your wireless gear, and all of the other "refinements" you made while you were playing shows? Testing, my good friend.


I know that a lot of this might seem a little weird, and of course, if I were to say that the reason you ultimately decided to go into testing was entirely temperament, I'd be lying to you. Those traits all pointed the way, but the real reason you decided to make it your life's work was the people who were part of it. People with names like Chuck, Marcia, Lawrence, Jim, Walter, Shannah, Monica, Beverly, Brian, Fredo, Art, Christina, Anthony, Greg, Don and many others. The temperament got you through the door. Your compatriots in arms are the ones who convinced you to stay. You owe a lot to them, too.


I'm going to close this for now, but I just wanted to give you some things to consider as you are plying your trade as a musician and trying to figure out what you want to be when you grow up. I'm not so sure I'm a good authority on the growing up part (by the way, I'm sorry for the broken ankle, lacerated wrist, splintered shoulder tuberosity and broken tibia and fibula you'll have to go through) but seriously, consider the testing avenue. You might be amazed at how much like "home" it feels when you get there.

Sincerely,
The Adult version of you (grown up? Not so much :) ).

Monday, April 2, 2012

The Temperament of a Tester and "The Announcement"


I think that Aaron Scott over at Two Leaf Clover must have been feeling the need to help me make an announcement today (LOL!). It's the only way I can explain how he comes up with certain comics right when I need them.


So, I know some of you have noticed these posts where I say I'm trying to identify "buddies" and that I'm writing letters to myself at 17, and figured that these can't be a coincidence. More to the point, you might want to ask me "why does this seem to coincide with an initiative that was recently written about, one that you initially commented on unfavorably? Surely there has to be a connection!"


Now I can tell you all that, yes, these are all interrelated. To put it plainly, I had some mixed emotions about the initiative called "SummerQAmp", and I commented on it. I suggested that we were doing a disservice by telling kids that they can do QA apprenticeships as a gateway into programming. Don't get me wrong, many people rise up through the testing ranks to become production-level programmers, and I have nothing whatsoever bad to say about that. What I was lamenting was the fact that there is an industry and a craft that is noble unto itself: that of the software tester. Why are we trying to sell kids on it being a stepping stone? Why not encourage those with the aptitude to be testers to actually be testers, and enjoy the process? Like our intrepid friend standing and staring at the cube, he's got the temperament of a tester. The guy on the couch? Not so much. How can we get the guys (and girls) with the instincts to "punch the box" to see that those very skills are so very needed?


I'm happy to report that the people who are leading the SummerQAmp initiative listened, and they reached out to me, and to AST. We have reached an agreement to create content (lesson modules, and possibly other materials) geared towards intern-aged students (those 16 to 24 years old) who might consider a career in the world of software development and would like to try an apprenticeship in software testing. The goal is to give them incentives to consider software testing as a career in its own right.


Sounds like a monster initiative, doesn't it? It is! Sounds like an awful lot for one person to handle. You'd be right again, and there's no way in the world that I could do it on my own... but as I said in a previous blog post, "I know a buddy who..." and I know there are a lot of you out there with experience in many different industries. So what do you say? Are you interested in being someone who encourages the next generation of software testers? If there are areas you wish people had told you about, would you share them? If there are situations and practices you would avoid like the plague, don't you wish someone had warned you when it could have been helpful? Well, here's your chance. Note: all ideas belong to their respective contributors. You create them, you own them, your words will be your own. Your copyright will be yours. Come help me find the people willing to "punch the box". They need us, and frankly, we need them!

Sunday, April 1, 2012

If I Could Write a (Testing) Letter To Me

First, this is not an April Fools Joke. Just wanted to get that out of the way :).

Second, imagine that you had a chance to talk with kids who were in high school or, maybe, a little farther along, who wanted to consider a career path. When I was growing up, there was a variety of experiences to be had in many different areas. There was technology like computers, but most of us who heard about working with computers saw commercials from places like Control Data Institute and the like. We didn't get much exposure to technology in school in the early 80s. The Internet boom was something I just happened to become part of in the early 1990s, and I happened to be positioned with a company that had a strong interaction with the development of the World Wide Web.

Still, I learned a lot about the world of testing in an environment that I just happened to become a part of. I didn't consciously discover that world until I was in my early 20s (23 to be more precise). Looking back, I wonder if I could have had an influence on my teenage self and helped guide him into the world that he works with and enjoys today.

Fans of current country music may recall the title I'm using as the title of a song by Brad Paisley written in 2007. The lead-off line is "If I could write a letter to me / and send it back in time to myself at 17". This is an idea that I want to flesh out over the course of this month... and I'd like to ask the testing world and my friends who test: "If you could write a letter to yourself at 17, and the topic was 'Why should you consider software testing as a career, and if you did, what would you tell yourself about it'?"

For starters, I don't think this is a topic I could talk about in one simple letter. Granted, I don't think it would become as convoluted or long running as the plot of "How I Met Your Mother", but it would likely take a while to build up to it. So I'm going to start with something very basic. I'd ask my former self what they think the world of computers is all about. My guess is, I'd still be thinking in terms of Control Data Institute, with those big massive machines (mainframes with big spools of dot matrix printing paper being generated). I'd take the time to explain that while this is the common image of computing and computers, big behemoths owned and administered by a "templed few", around the corner a whole new world is being created.

Hey, were you aware of the fact that your dad writes programs every day? I'll bet you thought that Commodore 64 and TV hooked up on his desk were just something he used to goof around with. Turns out he's been writing programs for the pediatric ward at his hospital since the 70s, and that he'd been working on computer-related stuff since he worked as a bank clerk in the early 60s while going to medical school. In truth, a lot of what he does in his medical practice is associated with that computer. If you are interested in getting a start in software testing, where genuine and seriously big risks can be found (we're talking potential life or death here), you might want to ask him how important software testing is.

In truth, I didn't realize what my dad did until I was well into my twenties and working as a tester. That's when I started hearing about the make-or-break testing initiatives done with medical software, and the vital importance of combination testing and having dosages calibrated precisely right. My dad often worked in the neonatal ICU, and his homemade programs were used to calibrate the many medications that had to be delivered to premature infants; any mistake could potentially lead to serious complications or even the death of a child. How would you have liked to have had a hand in testing that?!

The me of today looks back at amazing opportunities I could have had, and the challenges that were being faced in what was still a relatively new field. What if I had taken the time in 1984, when that brand new Macintosh computer was delivered to our house a couple of short months after it hit the market? How cool would it have been to have learned more about testing with that device? Instead of just seeing it as a machine for making fliers for my bands, it could have become something much more, and maybe I would have been more a part of what was to become The Cult of Mac... or maybe not. We had an IBM PC in the house as well. In 1984, my dad had a Commodore 64, a Macintosh and an IBM 8088 PC all running at the same time. If I had wanted to, I could have had a great introduction and a potentially great head start on the world of testing. The fact is, though, I didn't know I wanted to be a tester until many years later, when I "fell into" that role after a number of other events in my life. At 17, I wanted to be a fashion model. I wanted to be a musician. I wanted to be a ski instructor. I wonder if I would have even thought twice about being a software tester.

It's with this mindset and set of challenges that I'm now considering, how would I write this letter to myself at 17? How would I frame it? What would I want to say to help convince them that this may be an avenue they are well suited to consider? Why do we have to wait for people to fall into those roles? Couldn't we do a better job of talking about it up front? That's what I'm hoping to do this month, and I'm serious, I'd love to have some of your insights as to how I could frame those letters.

What's more, if you help me frame those letters and help me write them, I will give you credit for helping me do exactly that. Hmmm... it sounds like Michael is up to something. Why do I get the feeling these "letters" are more than hypothetical? It's because they are, but I can't tell you more than that until I get some more official word... and really, this isn't an April Fool's Joke; it just depends on comments from certain quarters coming out first before I can really say what I'm doing. But if you can help me write this "letter to me" (or to you, or anyone else who is in their late teens or early 20s), you'll be helping me take a giant step towards my ultimate goal and plan :).