Let's start with a bit of a metaphysical question: what do our customers want? Do we know? Do we really know? We want to say that we know what our customers want, but truly, how do we know? Are we asking them? Really asking them? If we are not looking at analytics and trying to make sense of them, the truth is, no, we don't. We may know what we think they want or what we think is important to them. Analytics tell us what they really want and what they really do.
There are lots of neat tools that can help with this: Google Analytics, Adobe Analytics, and CoreMetrics, of course. I have experience with Pendo as well. Pendo is interesting in that it flags when customers actually use a particular page, function, or feature. It's a valuable tool for seeing which functions and features are really being used.
Let's look at the idea that analytics should be added to a site after it launches. On the surface, that seems logical, but how about implementing them at the beginning of development? There's a lot of critical information you can discover, and development you can help, by examining your analytics not just when a site is live but also as you are putting it together. Which development work influences your most critical areas? Your analytics may be able to tell you.
Another thing to realize is that analytics do not actually tell you anything by themselves. You may need to do some analysis over time, and some aggregating, to get the real picture. One day's data may not be enough to draw a correct conclusion. Analytics are data points; you may need to do some specific analysis to determine what actually matters.
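To make that aggregation idea concrete, here's a minimal sketch (the daily numbers and the page-view framing are made up for illustration) of rolling noisy daily counts up by week, so that one odd day is seen in context rather than in isolation:

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical daily page-view counts, two weeks' worth.
# Note the single huge spike on day four.
daily_views = {
    date(2020, 3, 2) + timedelta(days=i): n
    for i, n in enumerate([120, 95, 110, 3400, 105, 98, 130,
                           115, 102, 99, 125, 118, 96, 104])
}

# Aggregate by ISO week number; the spike now shows up as one
# unusually heavy week instead of fourteen disconnected numbers.
weekly_totals = defaultdict(int)
for day, views in daily_views.items():
    weekly_totals[day.isocalendar()[1]] += views

for week, total in sorted(weekly_totals.items()):
    print(f"week {week}: total={total}, daily avg={total / 7:.0f}")
```

Only after a roll-up like this can you ask the useful question: was that spike a trend, or a one-off worth investigating?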
So how can we look at analytics from a tester's perspective? Amanda suggests using Charles Proxy or Fiddler, or one of a variety of browser plugins that help you examine the data your analytics collect. These can look really spiffy, and it's fun to see what does what, and when. However, there are a variety of ways this data may be misleading. My blog has statistics and analytics that I look at on occasion (I've learned to spread out when I look at them; otherwise I get weirdly obsessed with what is happening when). Also, when I live blog, my site engagement goes through the roof. It's not because everyone suddenly loves me, it's because I just posted twelve-plus posts in the last two days (LOL!).
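As a small example of what a tester might do with a beacon captured in Charles Proxy or Fiddler: analytics hits are usually just HTTP requests with querystring parameters, so you can parse and assert on them. This sketch uses parameter names from Google Analytics' (Universal) Measurement Protocol; the tracking ID and page path are made up:

```python
from urllib.parse import urlsplit, parse_qs

# A captured analytics beacon, as you might see it in a proxy tool.
# tid (tracking ID) and dp (document path) here are invented values.
beacon = ("https://www.google-analytics.com/collect"
          "?v=1&tid=UA-12345-6&cid=555&t=pageview&dp=%2Fcheckout")

# Break the querystring into a dict of parameter -> list of values.
params = parse_qs(urlsplit(beacon).query)

# A tester can then assert that the hit carries what it should:
assert params["t"] == ["pageview"], "expected a pageview hit"
assert params["dp"] == ["/checkout"], "expected the checkout page path"
print("beacon looks sane:", {k: v[0] for k, v in params.items()})
```

Checks like these can catch a tracking call that fires with the wrong page path or event type long before anyone notices the dashboards look strange.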
One of the most infuriating things is to go back and notice a huge spike in my data. If it corresponds with a live blog streak, that's understandable. What's not is a huge spike when I haven't been posting anything. What the heck happened? What was so interesting? Often it means that something I wrote was mentioned by someone, and then BOOM, engagement when I'm not even there. That happens a lot more often than I'd like to admit. I'd love to say I can account for every data spike on my site, but much of the time I can't, partly because it happened when I wasn't paying attention and partly because it's not necessarily my site doing the work; it's someone else somewhere else causing it (usually through a Tweet or a share on another platform like LinkedIn, Instagram, or Facebook).
Again, analytics are cool and all, but they are just data. Cold, unfeeling, dispassionate data. However, that cold, dispassionate data can tell you a lot if you analyze it and look at what the words and numbers actually mean (and you may not get the "actually" right the first few times). Take some time to look through the details the data represents. Run experiments based on it. See what happens if you roll out a feature to one group versus another (A/B testing is totally irrelevant if metrics are not present).
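The "one group versus another" idea can be sketched concretely. One common approach (the experiment name and user IDs here are hypothetical) is to bucket users deterministically by hashing an ID, so the same user always sees the same variant and the metrics for each bucket stay comparable:

```python
import hashlib

def bucket(user_id: str, experiment: str = "new-checkout") -> str:
    """Deterministically assign a user to bucket A or B by hashing their ID."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Same user, same bucket, every time -- no flip-flopping between variants.
users = [f"user-{i}" for i in range(1000)]
groups = [bucket(u) for u in users]
print("A:", groups.count("A"), "B:", groups.count("B"))  # roughly 50/50
```

The point of the determinism is exactly the point of the paragraph above: without stable buckets and metrics attached to them, the "experiment" gives you nothing you can actually analyze.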
Analytics can be nifty. They can give you insights, and you can make decisions based on what they provide, but analytics by themselves don't really do anything for you. They are just data points. It's the analysis, consideration, and critical thinking performed on those data points that really matters.