Monday, August 12, 2013

Look at Competitor Products and Benchmark the Differences: 99 Ways Workshop #51

The Software Testing Club recently put out an eBook called "99 Things You Can Do to Become a Better Tester". Some of them are really general and vague. Some of them are remarkably specific.

My goal for the next few weeks is to take the "99 Things" book and see if I can put my own personal spin on each of them, and make a personal workshop out of each of the suggestions. 


Suggestion #51: Look at Competitor products and benchmark the differences - Sandeep Maher


One of the heuristic oracles often called upon to determine if a feature is behaving the way it should is "consistent with Comparable products" (that's one of the 'C's in the HICCUPPS mnemonic).

What does it mean to be "consistent with Comparable products"? Does it mean we have to be exactly the same? No, but it means we need some idea of what our competitors in our space are doing, what they do, and what it costs them to do it. Additionally, we need to be aware of what it costs us to do similar actions.


Workshop #51: Take a crash course in competitive analysis. Learn what your competitors are doing, and get a handle on how their environment runs in your space. Compare it to how YOUR environment runs in the same space, with as little modification as possible.


For anyone who has ever read Maximum PC or similar magazines, you are familiar with the "numbers" that are crunched for various components. You've seen the charts that explain which CPU, which motherboard, which RAM or which GPU is faster when running certain programs. This process is referred to as "benchmarking", and it tries to compare components in systems while keeping as much else the same as humanly possible.

Example: 

  1. Take a GPU card and plug it into a PC.
  2. Run the necessary software so that the system recognizes the card (load drivers, etc.).
  3. Run a set of test scripts.
  4. Examine the results.
  5. While changing as little as possible, unplug that GPU card and plug in another GPU card.
  6. Install the appropriate set of drivers for that card as well, but try not to change anything else.
  7. Run the same set of tests.
  8. Examine the results.
  9. Note the deltas.
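To make steps 3 through 9 concrete, here's a minimal sketch in Python of what a timing harness for this kind of comparison might look like. The run_graphics_tests.sh script name, the run count, and the configuration labels are placeholders, not anyone's real tooling; the idea is simply to run an identical test load against each configuration and save the raw numbers:

import json
import statistics
import subprocess
import sys
import time

def benchmark(command, runs=5):
    """Run the same test command repeatedly and collect wall-clock timings."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(command, shell=True, check=True)
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    # The label identifies the configuration under test, e.g. "card-a" or "card-b".
    label = sys.argv[1]
    timings = benchmark("./run_graphics_tests.sh")  # hypothetical test script
    print(f"{label}: mean {statistics.mean(timings):.2f}s, "
          f"stdev {statistics.stdev(timings):.2f}s over {len(timings)} runs")
    # Save the raw numbers so runs on different configurations can be compared.
    with open(f"{label}-timings.json", "w") as f:
        json.dump(timings, f)

Run it once per hardware configuration (swap the card and drivers in between), then compare the saved timing files to get your deltas.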


That's a grossly simplified description, but that process of benchmarking and comparison is the basis of "competitive analysis". See what Factor A does when running, then compare it to Factor B, while varying any other factors as little as possible.

What can this information tell us? It can show us where we may be lagging in comparison with another application. It can also show us where we excel compared to another application. 

Performing competitive analysis can give us some data points to consider, but it's important to make sure that we are keeping all things as consistent as possible. If our app has twice the physical disk space footprint of another app, or if our app uses resources another app doesn't use, then that will certainly skew the results in certain tests. Also, statistics can easily be skewed when we try to do a direct comparison of "apples and oranges". If an application is running "close to the metal", that's going to give us different performance than if we are offering a rich user interface to perform the same steps.

There's also the additive and multiplicative effect. Comparing one or two users is going to be very different than comparing five hundred or two million. At that point, elements outside of our application's control are likely to be the bottlenecks (network throughput, etc.). 
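If you want to see that multiplicative effect in miniature, here's a rough sketch (the localhost URL is a stand-in for whatever endpoint or operation your app exposes) that times the same request at increasing levels of concurrency:

import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://localhost:8080/health"  # placeholder endpoint for the app under test

def timed_request(_):
    start = time.perf_counter()
    urlopen(URL).read()
    return time.perf_counter() - start

# The same operation, measured at increasing concurrency. Watch for where
# response times stop scaling linearly -- that knee is often a bottleneck
# outside the application itself (network throughput, connection limits, etc.).
for users in (1, 10, 100, 500):
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(timed_request, range(users)))
    print(f"{users:>4} concurrent users: "
          f"median {statistics.median(latencies) * 1000:.1f} ms, "
          f"max {max(latencies) * 1000:.1f} ms")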

When in doubt, try to compare similar processes within an application. Compare file open times. Compare save processes. Compare import and export of data. Mostly, try not to stress over things that are imperceptible. Start with the eyeball and patience tests. See what feels quick or slow just through your personal interaction. If you feel that certain functions take longer with one app vs. another, try to see if you can quantify it in "personal time" first. If you can perceive the product responding faster (or slower), chances are a customer will, too. From there, more formal benchmarking numbers and absolute data values can be examined.
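Once the eyeball test says "this feels slower", quantifying it can be as simple as timing the same user-visible operation in each product. Here's a sketch, assuming both apps can be driven from the command line (the command names and the sample file are hypothetical):

import statistics
import subprocess
import time

# Placeholder commands: the same user-visible operation in our app and in a
# competitor's, e.g. opening the same large file from the command line.
CANDIDATES = {
    "our-app": ["our-app", "--open", "large-sample.dat"],
    "their-app": ["their-app", "--open", "large-sample.dat"],
}

def time_operation(command, runs=10):
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(command, check=True)
        timings.append(time.perf_counter() - start)
    return timings

for name, command in CANDIDATES.items():
    t = time_operation(command)
    print(f"{name}: median {statistics.median(t):.3f}s over {len(t)} runs")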


Bottom Line:


Getting a feel for how other applications in our space perform is important, but we need to be careful not to over-analyze or make comparisons that are not legitimate. It also helps us bolster our case when we can point to how a competitor's product works. If our approach is significantly slower, or their user experience is richer, we can use that information to make a case that we might need to revisit how we perform those operations.

All in all, it's just a... 'nother tool in the tester's toolkit ;).
