Thursday, April 12, 2018

Docker and the Path to a Better Staging Environment - a 1 1/2 armed #LiveBlog from #STPCON Spring 2018

Wow, time flies. I'm in the last session of the conference, and while I've had to split my time and focus on work stuff today, I've still been able to participate more than I thought I would. I have a couple of additional entries I'll make later (so they don't really count as "Live Blogs" but I still want to get them down).

My current work environment uses Docker extensively for a number of things. Most notably, we use Docker for massive parallelization of our automated tests and for our CI/CD system. Docker has been around since 2013, and that was when we implemented our parallelization strategy. A fair amount of Docker was still experimental at the time, so we came up with some creative methods to get Docker to do what we wanted it to. We accomplished our goal, but Docker has matured a lot in five years. Thus I wanted to get a better feel for the present state of Docker. Gil Tayar is providing that opportunity :).

To be frank, there is no way I am going to be able to do justice to what Gil is covering here as he is typing fast and furious. I am, however, getting a better appreciation for how Docker actually does what it does. I have wondered what the port mapping actually did and why it was so important. Seeing it live is pretty cool. It's transactionally very fast.

Little things I am learning that are quick hits:

- Kubernetes is a large-scale Docker orchestration tool. Its name is Greek for "helmsman," the person who steers a ship.
- #K8s is short for Kubernetes (the 8 stands for the eight letters between the "K" and the "s"). Now I finally understand what that means :).
- There's a lot of stuff we can do with Docker that my current implementation is not doing. I need to do some digging in the dirt for this.

And with that, I'm going to say "adieu" to old friends, new friends, and to those who I haven't met yet... there's still time :). Please say "hello" before you leave. Just look for the guy with the elbow and wrist brace. I'm pretty sure I'm the only STP-CON attendee that has that distinction :).

Talking About Talking - a 1 1/2 armed #LiveBlog from #STPCON Spring 2018

Any time I attend a conference, I tend to go with 70% new content and about 30% familiar speakers. Over time, I've found it harder to seek out new people, because many of the people I get to know are asked back to present at conferences again and again. With that out of the way: I consider Damian Synadinos a friend, but I picked his session for a specific reason. While I think he intends this to be about public speaking, I'm looking to see how I can apply a public-speaking focus to my own day-to-day company interactions.

Why do I speak at conferences, at meetups, or at events? There are a variety of reasons but if I have to be 100% honest, there are two reasons. The first is wholly professional. I want to develop credibility. I want to show that I walk the talk as well as that I know at least an aspect of something. The second is personal and depending on how well you know me, this is either a revelation or so obvious it's ridiculous. I'm an ex-"RockStar". I used to spend a lot of time entertaining people as a singer and I loved doing it. I get a similar rush from public speaking, especially when people say they actually enjoy what I'm talking about and how I deliver the messages I prepare.

Part of talking in public is the fact that you are putting your ideas out there for others to consider. That can be scary. We own our ideas and our insecurities. As long as we keep them to ourselves, we can't be ridiculed for them. In short, people can't laugh at us for the things we don't put out there. Personally, I think this is a false premise. I have had people laugh at what I presented, but not because what I was saying was foolish or made me look dumb. Instead, it was because what I was talking about was in and of itself funny (actually, more absurd), and people laughed because they could relate. To date, I have not been laughed at. Laughed with, lots of times. If you are afraid people will laugh *at you*, let me reassure you: it's hugely unlikely that will happen.

The biggest reason why I encourage getting out there and speaking is that our ideas deserve to be challenged and we should want to challenge our ideas. Also, we may never aspire to get on a stage and speak, but all of us participate in meetings or presentations at work, in some form or another. By getting out there and speaking, we can improve our ability to function in these meetings. 

Something else to consider for a reason to give a talk or speak in public is what I call the "ignorance cure" for a topic. It's wise to talk about stuff we know about, but once a year or so, I will deliberately pick something I don't know much about or that I could definitely know more about. When I do this, I try to pick a timeline that gives me several months so that I can learn about it in a deeper way. People's mileage may vary with this, but I see a definite benefit from doing this.

Not every talk idea is going to be amazing. Not every talk idea is going to be some revolutionary idea. Truth be told, I'm lousy at revolutionary things. I'm highly unlikely to be creating the next big anything. However, I am really good at being a second banana to take an idea someone else has and running with it. Don't be afraid that something you want to talk about isn't new. We aren't born with ideas, and most of the time, we stand on the shoulders of giants.

My recommendation to anyone who has any interest in public speaking, no matter how small, is to borrow from Morrissey... "Sing Your Life". Talk about your experiences, as they will always be true and real. You may not be an expert on a topic, but you are absolutely an expert on *your experiences* with that topic. Also, if anyone wants to get up and talk, let me know. I'd be happy to help :).

Release is a Risky Business - a 1 1/2 armed #LiveBlog from #STPCON Spring 2018

Good morning. Let me get this out of the way now: I'm going to be splitting my mental energy between attending STP-CON and the fact that a new candidate release dropped last night, and I need to see how close to shippable this one will be. Splitting my brain is an everyday occurrence, but it may mean I miss a session or two. I'm not missing this first one, though ;).

Simon Stewart is probably well known to those who read my blog regularly: the WebDriver guy, and a bunch more. We're talking about the changes and the way that "release" has morphed into faster releases, along with a greater push toward automation. Honestly, a lot of the stuff that fell by the wayside in that change is stuff I don't miss. There's a lot of busywork that I am glad has been taken over by a CI/CD pipeline.

Outside of software testing, my professional life has become very closely knit into release. Jenkins is my baby. It's an adopted child, maybe a foster child, but it's still my baby. As such, I tend to spend a lot of time fretting over my "problem child", but when it works it is quite nice. Remind me when I am over the PTSD of the past month just how sideways CI/CD can go, but needless to say, when a tester takes over release management, they go from almost invisible to ever present.

Release is risky, I can appreciate that greatly, especially when we get closer to a proper release. Though I work in an environment where by necessity we release roughly quarterly, in our development and staging environment, we aim to be much "Agiler" and closer to an actual continuous environment (And yes, I checked "Agiler" is a real word ;) ).

Simon points out, quite rightly, that release is really less about quality and more about risk mitigation. For that matter, testing is less about quality than it is about risk mitigation. Staging environments do not really give you any level of assurance, either. Simon makes the point that staging environments are a convenient fiction, but a fiction all the same. My experiences confirm this. About the only thing a staging environment tells you is whether your feature changes play well with others. Beyond that, most staging environments bear little to no resemblance to a proper production environment. If you think Simon is encouraging releasing and testing in production, you would be correct. Before you have your heart attack: this is not a massive release, a big push of a lot of stuff into production, and all bets are off. If you are going to do frequent releases and test in production, you have to think small, get super granular, and minimize the odds of any one push being catastrophic. Observability and monitoring help make that possible.
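To make the "think small, get super granular" idea concrete, here is a minimal sketch of one common technique for it: a deterministic percentage-based rollout. Nothing here is from Simon's talk; the user ID, feature name, and percentages are invented for illustration.

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: int) -> bool:
    """Deterministically bucket a user into 0-99 and enable the
    feature only for the first `percent` buckets."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

# The same user always lands in the same bucket, so a 5% rollout
# stays stable while you watch monitoring before widening it.
enabled = in_rollout("user-123", "new-checkout", 5)
```

Because the bucketing is a pure function of user and feature, you can widen the rollout from 5% to 25% to 100% in small, observable steps, and roll back by just lowering the number.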

There's a lot that can go wrong with a release and there's a lot that can go right with it, too. By accepting the risk and doing all you can to mitigate those risks, you can make it a little less scary.


Wednesday, April 11, 2018

The Use and Abuse of Selenium - a 1 1/2 armed #LiveBlog from #STPCON Spring 2018

I realized that the last time I heard Simon speak was at Selenium Conf in San Francisco in 2011. I've followed him on Twitter since then, so I feel I'm pretty well versed in what he's been up to, but the title intrigued me so much, I knew I had to be here.

Selenium has come a long way since I first set my hands on it back in 2007. During that time, I've become somewhat familiar with a few implementations and with bringing it up in a variety of environments. I've reviewed several books on the tool, and I've often wondered why I do what I do and whether what I do with it makes any sense whatsoever.

Simon is explaining how a lot of environments are set up:

test <-> selenium server <-> grid <-> driver executable <-> browser 

The model itself is reasonable, but scaling it can be fraught with disappointment. More often than not, though, how we do it is the reason it's fraught with disappointment. A few interesting tangents spawned here, but basically, I heard "Zalenium is a neat fork that works well with Docker," and I now know what I will be researching tonight after the Expo Reception when I get back to my evening accommodations.

Don't put your entire testing strategy in Selenium! Hmmm... I don't think we're quite that guilty, but I'll dare say we are close. Test the happy path. Test your application's actual implementation of its core workflows.

Avoid "Nero" testing: what's Nero testing? It's running EVERYTHING, ALL THE TIME. ALL THE TESTS ON ALL THE BROWSERS IN ALL THE CONFIGURATIONS! Simon says "stop it!" Yeah, I had to say that. Sorry, not sorry ;).

Beware of grotty data setup: First of all, I haven't heard that word since George Harrison in "A Hard Day's Night" so I love this comment already, but basically it comes down to being verbose about your variables, having data that is relevant to your test, and keeping things generally clean. Need an admin user? Great, put it in your data store. DO NOT automate the UI to create an Admin user!
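To make the "put it in your data store" point concrete, here's a minimal sketch of a data fixture, using SQLite as a stand-in for whatever store your application actually uses. The schema and names are invented for illustration; the point is that the admin user exists before the UI test starts, with no Selenium involved.

```python
import sqlite3

def seed_admin(conn: sqlite3.Connection) -> None:
    """Create the admin user directly in the data store,
    instead of driving the signup UI to create one."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES (?, ?)", ("admin", "admin"))
    conn.commit()

# Run as test setup: the UI test can now log in as this user
# without first clicking through the account-creation flow.
conn = sqlite3.connect(":memory:")
seed_admin(conn)
```

The same idea applies whether your store is Postgres, an API seeding endpoint, or a fixtures file: setup goes in at the data layer, and the UI automation only exercises the thing the test is actually about.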

Part of me is laughing because it's funny but part of me is laughing because I recognize so many things Simon is talking about and how easy it is to fall into these traps. I'm a little ashamed, to be honest, but I'm also comforted in realizing I'm not alone ;).

Modern Testing Teams and Strategies - a 1 1/2 armed #LiveBlog from #STPCON Spring 2018

One of the fun parts of being a "recidivist conferencist" is that we get to develop friendships and familiarity with speakers we see over several years. Mark Tomlinson and I share a ridiculous amount of history, both in the testing world and in personal endeavors, so I always enjoy seeing what he is up to and what he will talk about at any given event. This go-around, it's "Testing Teams and Strategies," so here we go...

Does it seem common that the people who decide what you do and how you do it have NO IDEA what it is you actually do? I'm fortunate that that is not so much the issue today, but I have definitely lived that reality in the past. It's annoying, to be sure, but often it comes down to the fact that we allow ourselves to be pigeonholed. The rate of change is insane, and too often we feel that we are being thrown into the deep end without a say in what happens to us. If we don't take some initiative, that will continue to happen to us.

I've had the opportunity over the past (almost) three decades to work in small teams, big teams, distributed teams, solo, and freelance. Still, in most of my experiences, I've been part of what I call "the other" organization. That's because I've worked almost exclusively as a tester in those three decades (my combined time as a cable monkey, administrator, and support engineer adds up to less than four total years, and even in those capacities I did a lot of testing). Point being, I've spent most of my time as part of another organization that has been siloed. It's a relatively new development that I'm working on a team that's both small enough and focused enough that I'm actually embedded in the development team now. As a point of comparison, my entire development team is seven people: three programmers, three testers, and one manager. Really, that's our entire engineering team. That means there is too much work and not enough people for anyone to be siloed. We all have to work together, and in reality, we do. My role as a tester has changed dramatically, and the list of things I do that fall outside the traditional testing role grows every day.

If I had to put a name on our type of team, I'd probably have to describe us as a blended group of "Ronin," meaning we are a relatively fluid lot with a host of experiences, and we are ultimately "masterless". If something needs a champion, it's not uncommon for any of us to just step up and do what's needed. The funny part is that Mark just put up the "non-team testing team" and basically defined what I just wrote. Ha!!!

OK, so teams can be fluid; that's cool. So how do we execute? Meaning, what is the strategy? To be clear, a strategy means we deliver a long-term return on investment, align resources to deliver, arrange the type and timing of tactics, and make sure that we can be consistent in our efforts. Ultimately, we need a clear objective as well as a strategy to accomplish that objective. Sounds simple, but actually being concrete with objectives and developing a clear method of accomplishing them is often anything but. In my opinion, to execute a strategy, we have to know what we can accomplish and where we need to improve or develop a skill. Therefore a skills analysis is critical as a first step. From there, we need to see how those skills come into play in our everyday activities and apply them, to make sure that we can execute our strategy with what we have and develop what we need so that we can execute in the future.



More than That - a 1 1/2 armed #LiveBlog from #STPCON Spring 2018

I had to step out and take a meeting so I missed probably half of Damian Synadinos' talk. Therefore, if this feels incomplete and rambling, well, that's because it literally is ;).

I am intimately familiar with being asked "what I do" as well as "who I am". The fact is, I am a lot of people. No, I don't mean in a schizophrenic sense (though that's debatable at times). I mean it in the Walt Whitman sense:

"Do I contradict myself? Very well, then I contradict myself, I am large, I contain multitudes."

The point is that we are never just one thing. All of our quirks, imperfections, and contradictions come from our many experiences, histories and active pursuits.

Depending on who you talk to about me, you might get a wildly interesting view of exactly who I am. It might get really interesting depending on what period of my life you ask about, but if I had to guess, these identities might show up:

actor
bass player
bodybuilder
boy scout leader
carpenter
cosplayer
dancer
drummer
father
fish geek
gardener
guitar player
husband
mandolinist
mormon
obsessive music fan
otaku
photographer
pirate
podcast producer
poet
programmer
prose writer
singer
snowboarder
tester
video gamer
yogi

If I had to choose a specific attribute, I'm going to lay claim to "eclectic".

Each of these has informed my life in a variety of ways, and each of them has given me skills, interests, and a number of very interesting people to share this variety of things with. In many ways, it's the people I've interacted with who determined how long or how short a time any of these endeavors/attributes has been part of my life, but all of them are, and all of them have given me skills to do the things I do.

Also, if any of the items on that list have you wondering what they are or how I'm either actively involved in or why I chose to mention them, please be my guest :).

Performance Test Analysis & Reporting - a 1 1/2 armed #LiveBlog from #STPCON Spring 2018

One of the factors of performance testing that I find challenging is going through and actually making sense of the performance issues that we face. It's one thing to run the tests. It's another to get the results and aggregate them. It's still another to coherently discuss what we are actually looking at and how they are relevant.

Mais Tawfik Ashkar makes the case that performance analysis is successful when people actually:

- read the results
- understand the findings
- stay engaged, and most important of all
- understand the context in which these results matter


Also, what can we do with this information? What's next?

Things we need to consider, to make our testing and reporting more effective:

- What is the objective? Why does this performance test matter?
- What determines our Pass/Fail criteria? Are we clear on what it is?
- Who is on the team I'm interacting with? Developers? BAs? Management? All of the above?
- What level of reporting is needed? Does the reporting need to be different for different audiences? (Generic answer: yes ;) )

What happens if we don't consider these? Any or all of the following:

- Reports being disregarded/mistrusted
- Misrepresentation of findings
- Wrong assumptions
- Confusion/frustration of stakeholders
- Raising more questions than providing answers

Mais starts with an analysis methodology. Are my metrics meaningful? Tests pass or fail. Great. Why? Is the application functioning properly when under load/stress? How do I determine what "properly" actually means? What are the agreements we have with our customers? What are their expectations? Do we actually understand them, or do we just think we do?
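As a toy illustration of turning "what does properly mean" into an explicit, agreed-upon criterion, here's a sketch that judges a load-test run against a 95th-percentile response-time budget. The budget and the sample numbers are invented, not anything from the session:

```python
import statistics

def passes_sla(latencies_ms: list[float], p95_budget_ms: float) -> bool:
    """Judge a run by its 95th-percentile response time against
    a budget agreed with stakeholders ahead of time."""
    # quantiles(n=100) returns the 1st..99th percentile cut points.
    p95 = statistics.quantiles(latencies_ms, n=100)[94]
    return p95 <= p95_budget_ms

# A mostly-fast run with a slow tail: averages look fine,
# but the percentile check surfaces the tail.
run = [120, 130, 140, 150, 900] * 20
result = passes_sla(run, p95_budget_ms=500)
```

The design point is that the criterion is written down and computable: nobody has to argue after the fact about whether "fast enough" was met, and an average can't hide a slow tail the way it would if the pass/fail rule were "mean under 500ms".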

By providing answers to each of these questions, we can ensure that our focus is in the right place and that we are able to confirm the "red flags" that we are seeing actually are red flags in the appropriate context.


Tester or Data Scientist - a 1 1/2 armed #LiveBlog from #STPCON

Smita Mishra is covering the topic of "Tester and Data Scientist". Software testing and data science actually have a fair amount of overlap. Yes, there is a level of testing in big data, but that's not the same thing.

A data scientist, at the simplest level, is someone who looks through and tries to interpret information gathered to help someone make decisions.

Website data can tell us what features people engage with, what articles they enjoy reading and by extension, might help us make decisions as to what to do next based on that information.

An example can be seen on Amazon. About 40% of purchases are made based on user recommendations. The data scientist would be involved in helping determine that statistic as well as its validity.

Taking into consideration the broad array of places that data comes from is important. Large parallel systems, databases of databases, distributed cloud system implementations, aggregation tools, all of these will help us collect the data. The next step, of course, is to try to get this information into a format to be analyzed and for us (as Data Scientist wannabes) to synthesize that data into a narrative that is meaningful. I find the latter to be the much more interesting area and for me, that's the area that I'm most interested in learning more about. Of course, there needs to be a way to gather information and pull it down in a reliable and repeatable manner. The tools and the tech are a good way to get to the "what" of data aggregation. Interacting with the "why" is the more interesting (to me) but more nebulous aspect.

So what do I need to know to be a Data Scientist?

- The scientific method is super helpful.
- Math. Definitely know Math.
- Python and R both have large libraries specific to data science.
- A real understanding of statistics.
- Machine learning and the techniques used in the process. Get ready for some Buzzword Bingo. Understanding the broad areas is most important to get started.
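As a tiny example of the "know Math, know statistics" point, here's a Pearson correlation worked out in plain Python, the kind of basic statistic behind questions like "do people who see recommendations buy more?". The views/purchases data is invented for illustration, not anything from the session:

```python
import statistics

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient: +1 is a perfect positive
    linear relationship, -1 a perfect negative one, 0 none."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: recommendation-widget views per session
# vs. purchases made in those sessions.
views = [10, 20, 30, 40, 50]
buys = [1, 3, 2, 5, 4]
r = pearson(views, buys)  # positive: more views, more purchases
```

Correlation is exactly the sort of thing a tester already half-knows how to reason about ("does this input move that output?"); the data-science part is being rigorous about when the relationship is real versus coincidence.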

Recommended site: Information is Beautiful

The key takeaway is that, if you are a tester, you already have many of the core skills to be a Data Scientist. Stay Curious :).

Shifting to Quality Engineering - STP-CON 1 1/2 Handed #LiveBlog

Melissa Tondi is leading off with a keynote discussion about Quality Engineering (QE) and what it specifically means. QE is a mindset and an attitude.

These components help to make this shift:

- explicit and implicit info gathering
- tools and tech to craft a solution
- executing to the solution

Some interesting things to consider:

Every individual owns the quality of the work they produce. They do NOT own the quality of the overall project. In short, we can only guarantee quality in the areas we have direct control over.

Some changes are already happening, and there's a definite debate as to their validity:

- testing is not dead, but it is certainly changing.
- automation will not replace all testing, but it is a huge plus for the mind-numbingly repetitive tasks.
- AI and Machine learning will make testing irrelevant (I'm not holding my breath on this one at all ;) ).

Some of the shifts I have seen in the testing world include moving from an ad-hoc approach to ISO 9000-style standardization. Good or bad, that was a change I lived through, and I found some of it useful and a lot of it needless overhead. Later, I worked in much smaller teams where such an approach was complete overkill. Needless to say, I adapted to what those organizations needed. Technologies that were part of my everyday test environments (virtual machines, scripting languages, etc.) morphed into actual production environments. Agile became a mantra and something everyone said they advocated and championed, but actual implementation was wildly different in each company.

The point is that every year has given me something new to look at and consider. Every year some different approach has taken precedence and grabbed people's attention. I have been an enabler of good things and some dumb things as well. The key is that we can make the choice as to what we put forward and what we want to contribute to the process.

Some areas where we can make improvements and help drive the QE shift are:

- showing our value
- putting ourselves into the circle of influence
- leveraging technologies and taking advantage of them
- locating inefficiencies and figuring out how to reduce them
- owning our own quality and letting other people own theirs
- staying curious, looking for the implicit information, and helping make it more explicit

Another good point Melissa makes, and one I agree with, is that this shift is about more than just changing our work environment; it extends to the broader community we are part of. While it may not be practical for everyone to be a speaker or content creator, everyone can help change and build the narrative at a broader level. Engage with your broader testing community, whatever and wherever it may be.

STP-CON: A One-and-a-Half Handed #LiveBlog

Hello everyone! Greetings from Newport Beach, CA.

This session of the Testhead Live Blog will be a little different this year for a couple of reasons. The fundamental difference is that I am down the significant use of one arm. I had elbow surgery a couple of weeks ago (lateral epicondylitis for those who like specifics). It involved using ultrasound to pulverize a bone spur, perforate a tendon in my elbow so it looks like a slice of Havarti cheese and an insertion of blood serum from my other arm to give the tendon the optimal chance of healing. The net result is that I have a lovely wrist brace and elbow cuff that limits motion and makes typing with my left hand... less than optimal.

My loss is your (potential) gain. It means my trademark verbosity (I am a wordy fellow, let's face it ;) ) will have to be diminished a bit. It also means I get to use some of the accessibility features that I test for and advocate using. Some of these posts will be verbally generated, which should be... interesting.

I'd like to take the chance to first say thank you to the attendees of my workshop yesterday. I taught a session on "Designing and Testing Inclusively," and it gave me a chance to expand on material and approaches that I usually get less than an hour to talk about or demonstrate. It also gave me an opportunity to explain why I chose the tools that I did and what the tradeoffs are for using them, plus an unexpected but nice detour. Toward the end of the session, we had a philosophical discussion about advocating for what can at times be conflicting suggestions. When does our advocacy for one goal diminish someone else's experience? Can we go too far? Additionally, how can we help people care a little more about this?

Today is the first full day of the conference, and I will probably have to take a little more time to compose my thoughts than normal, so the rapid-fire multi-post process I'm known for may not happen this go-around. Still, I'll do my best, so stay tuned :).