Tuesday, October 12, 2021

The Testing Team Have Requirements Too with Moira Tuffs (#PNSQC2021 Live Blog)

 

All too often we see requirements framed around development, customer feature needs, business goals, and so on. To use examples from Moira's company: 

"speed improvements of 30%" 
"user should see plain English error messages." 

A big question right here... how do we actually confirm or verify that we meet these requirements? In short, this hearkens back to one of my previous presentations... "Is This Testable?" 

We expect a 30% speed enhancement... how do we determine that this 30% improvement has actually happened? 30% faster than what, exactly?

Basically, we as testers need to know what the baseline is and how to measure it. We need to have the initial hypotheses and understand them (it seems "hypotheses" is going to be my favorite word this conference, LOL!). 
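
To make that "know the baseline" point concrete, here's a minimal sketch (my own illustration, not anything from Moira's talk) of how I'd pin down what "30% faster" even means. The measure_response_time helper and the sleep-based stand-ins for the old and new builds are purely hypothetical:

```python
# A minimal sketch: establish a baseline, measure the candidate the same way,
# and only then talk about percentage improvement.
import statistics
import time

def measure_response_time(operation, runs=20):
    """Time the given operation over several runs and return the median,
    so one noisy sample doesn't skew the comparison."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Hypothetical callables standing in for the old and new builds.
baseline_seconds = measure_response_time(lambda: time.sleep(0.10))
candidate_seconds = measure_response_time(lambda: time.sleep(0.07))

improvement = (baseline_seconds - candidate_seconds) / baseline_seconds * 100
print(f"baseline:    {baseline_seconds:.3f}s")
print(f"candidate:   {candidate_seconds:.3f}s")
print(f"improvement: {improvement:.1f}% (requirement: >= 30%)")
```

The point isn't the code itself; it's that without an agreed-on baseline and a repeatable way to measure it, "30% faster" is not a testable claim.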




I can feel this deeply in my own organization. My current test role has me as the "newbie" of the team. I've worked with them for over 18 months now, and yet it seems every story has some new area I need to know about that I had never even heard of. My team has been together and working in our area of expertise (data transformations) for more than a decade. Yes, my team members, outside of myself and our scrum master, have a decade-plus of working with each other. That means there is a tremendous amount of explicit and implicit knowledge locked up in those brains that gets taken for granted, and I am the one who has to figure out which piece of implicit knowledge is missing. 




This is interesting, as Moira is hitting a lot of the areas that I remember from my "Is This Testable?" talk. I am in full agreement that log files can be an absolute gem in helping define and highlight implicit requirements or areas we should be paying attention to. It may take some time to understand what information is relevant and interesting, and there may be many log files to interact with. Here's where I recommend firing up your favorite terminal multiplexer and splitting a screen window to tail a variety of log files. This gives us a chance to see whether we are actually seeing the data that is relevant to the transaction at hand. 
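
For those who would rather script it than juggle tmux panes, here's a rough sketch of the same idea in Python: follow a handful of log files and surface only the lines that mention the transaction we're currently testing. The file paths and the transaction ID are hypothetical placeholders, and in practice `tail -f` in split panes does the job just as well:

```python
# Follow several log files at once (tail -f style) and print only the lines
# that mention the transaction under test, tagged with the file they came from.
import time

LOG_FILES = ["app.log", "db.log", "worker.log"]   # hypothetical paths
TRANSACTION_ID = "txn-12345"                      # hypothetical marker

def follow(path):
    """Generator that yields new lines appended to a file, or None if idle."""
    with open(path, "r") as handle:
        handle.seek(0, 2)  # jump to the end; we only care about new activity
        while True:
            line = handle.readline()
            yield line.rstrip() if line else None

tails = {path: follow(path) for path in LOG_FILES}

while True:
    for path, tail in tails.items():
        line = next(tail)
        if line and TRANSACTION_ID in line:
            print(f"[{path}] {line}")
    time.sleep(0.5)  # poll politely instead of spinning
```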



Moira is using an example that has both a software and a hardware focus. The software can be squishy, but if the hardware doesn't work right, there are real issues that may need more direct intervention. It might be as simple as clearing a nozzle so the material can flow through (the example hardware here is a 3D printer). Other situations are different in that the software is specifically the differentiator. This is something I often deal with in my home studio with my audio interfaces. I have multi-input systems with multiple outputs that can be routed, so for me to be effective, I have to have a firm understanding of the control software that lets me create signal routes on the fly. Since these input and output routing values are not always going to be in place, there's a lot of interaction, setup, and teardown to route effectively for different sessions and purposes. I'm mentioning this mainly to say that I feel for my testing kin over at Focusrite (and thank you for what you all do to make this less odious than it often can be :) ).

