Wednesday, November 13, 2019

Getting Practical with PractiTest: A Broad Sweep at a Broad Product

Before I get started, full disclosure: I was asked by friends at PractiTest if I would be willing to have a look at their test management software and talk about it on TESTHEAD. I said yes. Beyond that, they have no influence on what I say and post here. I have given them permission to use my comments as they choose, so long as they reference this original article.
And with that out of the way... :).
I must confess, I have done fewer product reviews on this blog over the years than I thought I might. Part of that comes down to a fundamental issue: what I use a product for may be very different from how others might use it. Still, I think it's safe to say that many of us have similar goals and attitudes about the tools we use and how we might take advantage of them. On the whole, I am a fan of products that do their best to guide the user, don't bury arcane but useful features under layers of hard-to-reach workflows, and reward regular and frequent interaction. It is with that idea and approach that I'm going to look at what can be either one of the most helpful or one of the most frustrating areas of software testing: test case management.

I have worked for a variety of organizations over the years, and the methods of test case management have varied radically. Some places literally used only index cards that got shuffled around (no joke: there were no permanent, electronically documented procedures, or very few if I'm being totally honest). Great from a free and exploratory point of view, terrible in that cards just plain got misplaced and lost (and yes, that happened more times than I want to admit, best intentions notwithstanding).
On the opposite end, I have also worked with very top-heavy documentation systems that were a chore to use and so burdensome that it felt like most of my testing time was spent writing up test cases, leaving precious little time to actually test and explore systems. Wouldn't it be nice to have something in between? Something that allows for a less rigid and formalized structure when you don't need one, but has the power of documentation and traceability when you do?
PractiTest aims to do that. I aim to see just how well they pull it off ;).



What is PractiTest?


First off, PractiTest is an online SaaS offering: you access the product via the web and interact with it through your browser. It may be easier, though, to think of PractiTest as an onion with layers of functionality (cue Shrek jokes in 3, 2, 1...). That layer metaphor is quite useful, in my opinion. By looking at each level and layer, it's fairly easy to get from one part to another and actually have a handle on how everything interrelates.



The system is structured around several key areas: you can define your test requirements, store and maintain the tests you develop, compile and follow test runs, track issues, and create reports, as well as customize the system and integrate it with other products if desired. For today, let's just focus on the core product and getting productive with it.

Our Sample Test Flow
Let's take a simple example of logging into the system...

Side tangent: have you ever noticed that most tutorials tend to focus on a login screen? That stands to reason, since logging in is often the first thing anyone does with a product (unless no login is required). Does it make sense to start there? Maybe not, but it's easier to describe than walking through an entire installation script or breaking down every component of a search test.

In this case, there are a few interesting wrinkles to logging in with the product I am testing. We have a standalone login, an LDAP login, and a single sign-on built around iPaaS. Each of those exercises a different path in the code and, in certain scenarios, takes us to different places.

Step one: let's create a requirement (or several). It can be as simple or as complex as we want. In my case, I just want to define what I hope to do and how I intend to accomplish it (in this case, covering three login types):

- standalone
- LDAP
- iPaaS

Each of these would warrant its own test case, but a single requirement would suffice to cover all three. Fortunately, the requirements section is the first thing we interact with, and adding information is quick and direct.
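To make that mapping a bit more concrete, here is a rough sketch (outside of PractiTest itself) of how one requirement can fan out into three test cases in an automated suite. Everything in it is a hypothetical stand-in for illustration; the `attempt_login` helper and the fake credential stores aren't part of PractiTest or of any real application, they just show one requirement covering three distinct code paths.

```python
# Hypothetical sketch: one "user can log in" requirement fanning out into
# three test cases, one per login path. Everything here is a stand-in for
# the real application; only the structure mirrors the PractiTest layout.
import pytest

# Fake back ends standing in for the three authentication paths.
FAKE_AUTH_BACKENDS = {
    "standalone": {"tester": "local-secret"},
    "ldap": {"tester": "ldap-secret"},
    "ipaas_sso": {"tester": "sso-token"},
}

def attempt_login(method: str, user: str, secret: str) -> bool:
    """Pretend login check; replace with calls into the system under test."""
    return FAKE_AUTH_BACKENDS.get(method, {}).get(user) == secret

@pytest.mark.parametrize("method,secret", [
    ("standalone", "local-secret"),   # test case 1
    ("ldap", "ldap-secret"),          # test case 2
    ("ipaas_sso", "sso-token"),       # test case 3
])
def test_login_paths(method, secret):
    # All three cases trace back to the single login requirement.
    assert attempt_login(method, "tester", secret)
```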


By adding requirements, we can capture what we want to cover in a little or a lot of detail. From there, we can go to test cases and define what we want to test. PractiTest has two modes that allow for scripted or exploratory testing. I appreciate that exploratory tests are built in as an option; in many environments there are questions about what exploratory testing is or means, and a bias develops against it because exploratory sessions are not as readily documented. PractiTest handles that nicely. And if you need to get into the details, you can do that as well with the scripted tab.




Additionally, if there is a need to record procedural and specific test cases, those can also be listed in the scripted section.





Another nice touch is the ability to pull in test cases that have already been written. Do you have steps you find yourself performing over and over again? Don't write them out again and again; reference them where needed. This also helps as you start to consider which tests can be automated. As you get the hang of writing things out, you notice where you can simplify or spot repetitive steps. By saving those as what I refer to as "test snippets," you can call them up and reuse them as often as needed.
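To illustrate the same "write once, reuse everywhere" idea outside of the tool, here's a small, hypothetical sketch of a shared login step captured once and called from several automated tests. The helper and fixture names are invented for illustration and aren't part of PractiTest; they're just the automation analogue of a test snippet.

```python
# Hypothetical sketch of the "test snippet" idea in automation terms:
# capture a repetitive step once, then reuse it wherever it's needed.
import pytest

def login_step(session: dict, user: str) -> dict:
    """Reusable step: pretend to authenticate and mark the session."""
    session["user"] = user
    session["authenticated"] = True
    return session

@pytest.fixture
def logged_in_session():
    # The shared "snippet": every test that needs a logged-in user calls this.
    return login_step({}, "tester")

def test_profile_page(logged_in_session):
    assert logged_in_session["authenticated"]

def test_search(logged_in_session):
    # The login steps aren't rewritten here; they're pulled in via the fixture.
    assert logged_in_session["user"] == "tester"
```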

Connecting the Dots
As I work my way across the toolbar, I see that I can step into each area and examine what I am testing and writing down as I move from left to right. This flow is straightforward and doesn't bog the user down in a lot of details or needless repetition. If I had to highlight one great strength of PractiTest, it's that it doesn't force me to be needlessly repetitive. Additionally, it allows test sets and test runs to be as specific or as general as desired. Is the testing exploratory? Set a timer and go, then capture and post what you find. Is the test a procedural one? Great. Run it and then see how long it took (the timer runs automatically in the background). When all is said and done, you can get a compilation of test runs: what was accomplished, what was recorded, and which requirements those tests are associated with. Found issues? Log them in the dedicated issue tracker and trace them back to tests, which in turn trace back to requirements (again, right to left on the board). If I were a test manager, I would find this easy to understand and easy to track, and it wouldn't take me forever to figure out what was going on.
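To make that right-to-left traceability a little more concrete, here is a small, hypothetical sketch of the relationships as a data model. The class names are mine, not PractiTest's; the point is simply that an issue traces back to a test run, which traces back to a test, which traces back to a requirement.

```python
# Hypothetical sketch of the traceability chain described above.
# None of these names come from PractiTest; they just model the idea
# that issues -> runs -> tests -> requirements stay linked.
from dataclasses import dataclass

@dataclass
class Requirement:
    name: str

@dataclass
class TestCase:
    name: str
    requirement: Requirement

@dataclass
class TestRun:
    test: TestCase
    status: str            # e.g. "passed", "failed"
    duration_seconds: int  # the timer that runs in the background

@dataclass
class Issue:
    summary: str
    found_in: TestRun

# Walking the chain from a found issue back to the requirement it covers.
req = Requirement("Users can log in (standalone, LDAP, iPaaS SSO)")
test = TestCase("LDAP login", req)
run = TestRun(test, status="failed", duration_seconds=340)
bug = Issue("LDAP login rejects valid credentials", run)

print(bug.found_in.test.requirement.name)
```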

Overall Thoughts

An impression I keep getting as I use PractiTest is that the company values testers' time and appreciates that the best thing a tool can do is provide the necessary feedback when it's needed and get out of the way when it isn't. Short of being macro'd directly into a product, where a right-click could let a user track what they are doing in real time within an application, PractiTest does a nice job of staying out of my way, and I mean that as a compliment. Granted, this is a first blush and there's a lot more I could talk about (if you're interested, I can go on, believe me ;) ). On the whole, however, of the variety of test management tools I have used to date, I think PractiTest is clean, unobtrusive, and flows nicely with how I particularly like to work.
