Friday, May 6, 2022

Performance-Driven Development: An #InflectraCON Live Blog

As was once written, all good things must come to an end. Since I have to do some interesting maneuvering to make sure I don't arrive late for my flight, this is the last talk I will be attending and my last missive for InflectraCON. It's been a lot of fun being here, and here's hoping they'd like to have me back again next year :).

For the last talk, I'm listening to Mark Tomlinson talk about Performance (What? Shocker! (LOL!)). Specifically, he's talking about Performance-Driven Development. Sounds a bit like Test-Driven Development? Yeah, that's on purpose.

The idea behind Test-Driven Development (TDD) (Beck 2003; Astels 2003) is "test-first". You write a test, you then write just enough production code to PASS that test, then refactor the code. Just as testing has "shifted left", performance is likewise shifting left.
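To make that rhythm concrete, here's a minimal sketch of the red/green/refactor cycle in Python, pytest style. The slugify function and its expected behavior are a made-up example of mine, not anything from Mark's talk:

    # A minimal sketch of the test-first cycle (hypothetical example).
    # Step 1 (red): write the test first; it fails because slugify doesn't exist yet.
    def test_slugify_lowercases_and_hyphenates():
        assert slugify("Hello World") == "hello-world"

    # Step 2 (green): write just enough production code to pass the test.
    def slugify(text):
        return text.lower().replace(" ", "-")

    # Step 3 (refactor): clean up the code while the test keeps passing.
    # (In real life the test and the production code live in separate files.)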

Can the adaptive concepts used in TDD be applied to performance engineering and scalability? The answer is yes, and the result is "Performance-Driven Development" (PDD).

In short, we need to think through the non-functional requirements, the system design, and all of those "ilities" before we write any code. In other words, as we develop our features, we need not just to pass tests but to have the performance constraints defined and confirmed as development progresses.
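As a sketch of what that might look like in practice, here's a hypothetical pytest-style check where a latency budget is written down before the feature counts as "done". The build_report function, the sample data, and the 200 ms budget are all assumptions of mine for illustration, not anything from the talk:

    import time

    # Hypothetical feature under development; stands in for real production code.
    def build_report(orders):
        return sorted(orders)

    def test_build_report_stays_within_latency_budget():
        # The 200 ms budget is an assumed non-functional requirement,
        # defined up front rather than measured after the fact.
        orders = list(range(10_000))
        start = time.perf_counter()
        build_report(orders)
        elapsed = time.perf_counter() - start
        assert elapsed < 0.200, f"latency budget exceeded: {elapsed:.3f}s"

The point is less the exact number and more that the performance constraint fails the build the same way a broken functional test would.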

Intriguing? Yes, but I'm wondering how this can be effectively applied. Part of the challenge I see is that most of the shops I have seen use TDD tend to do so in a layered approach. We start with small development systems and then expand outward to demo, staging, and then production (at least where I work). It would be interesting to see how PDD would scale at each tier. Is there a point where performance looks fine on one tier, and then, as we jump from demo to staging or staging to production, issues start to appear (or do we see issues immediately)?

I confess that in many cases the performance enhancements happen after we've delivered the features in question. Often, we have realized after the fact that an update has had a performance hit on the system(s). The next question is where we would be able to put performance testing into the initial development. I recall from many years ago how Selenium and JMeter can be an interesting one-two punch when developed in tandem (a rough sketch of the idea follows below), so it's definitely doable. Whether or not concurrent Selenium and JMeter development makes sense, I'd have to say "your mileage may vary", but it is something I can at least wrap my head around :).
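For the curious, here's roughly what I mean by that one-two punch, sketched in Python: JMeter applies background load in non-GUI mode while Selenium times a real user journey in a browser. The test plan name, the URL, and the 3-second threshold are all placeholders of my own invention:

    import subprocess
    import time

    from selenium import webdriver

    # Kick off JMeter in non-GUI mode with a pre-built test plan.
    # load_plan.jmx and results.jtl are placeholder names.
    load = subprocess.Popen(
        ["jmeter", "-n", "-t", "load_plan.jmx", "-l", "results.jtl"]
    )

    try:
        # While the synthetic load runs, time a real user flow in a browser.
        driver = webdriver.Chrome()
        start = time.perf_counter()
        driver.get("https://staging.example.com/login")  # placeholder URL
        elapsed = time.perf_counter() - start
        driver.quit()
        assert elapsed < 3.0, f"page took {elapsed:.2f}s under load"
    finally:
        load.terminate()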

This seems like something we might be able to address. I can only imagine my manager's face when I bring this up on our regular call next week. I can picture him just shaking his head and face-palming with "oh no, what is Michael going on about now?!", but hey, I'm willing to at least see if anyone else might be interested in playing along. Time will tell, I guess.

And with that, it's off to see just a little bit more of D.C. as I make my way back to National. Thanks InflectraCON and everyone who attended and helped make it possible. It's been fun and I've enjoyed participating. 

Until we meet again :)!!!
