Performance is a Feature

What do I mean when I say “performance is a feature”?

For a long time, I got this wrong. When I explained myself, I’d say that performance was as important as any other feature, deserved as much time as any other feature, and shouldn’t be traded away lightly, just as you wouldn’t trade away any other feature lightly.

The thing is, especially on a small team, you might not come back to any particular feature for a few months. So, would this mean you only come back to performance every few months?

Thanks to Mike Brittain at Etsy, I’ve figured out just how wrong I was.

Performance is a feature, and just like any other feature, it must be continuously monitored and tested. What happens when a test breaks or a regression is found in any other feature? Regressions are usually considered top-priority bugs. The same must be true for performance regressions. Just because your small development team isn’t focusing on a particular feature at the moment doesn’t mean it’s OK to break it.
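
One lightweight way to make a performance regression break the build like any other failing test is to put a budget directly into the test suite. The sketch below is illustrative, not prescriptive: render_homepage() and the 200 ms budget are hypothetical stand-ins for one of your own hot paths and whatever budget your team agrees on.

```python
# A minimal sketch of a performance regression test. render_homepage() and the
# 200 ms budget are hypothetical; the point is that the check runs in CI like
# any other test, so a regression surfaces immediately instead of months later.
import statistics
import time


def render_homepage() -> str:
    # Stand-in for the real code path under test.
    return "<html>...</html>"


def test_homepage_render_within_budget():
    BUDGET_SECONDS = 0.200   # agreed-upon budget for this feature
    SAMPLES = 25             # repeat to smooth out noise

    timings = []
    for _ in range(SAMPLES):
        start = time.perf_counter()
        render_homepage()
        timings.append(time.perf_counter() - start)

    # Compare the median so a single noisy run doesn't flip the result.
    assert statistics.median(timings) < BUDGET_SECONDS


if __name__ == "__main__":
    test_homepage_render_within_budget()
    print("homepage render within budget")
```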

Performance testing isn’t exactly like most feature testing, but that doesn’t matter. One of my favorite statements from a QA lead boiled down to “don’t get hung up on the tool, focus on what you’re assuring.” Use whatever tools you need, whether that’s graphs or external and on-site performance monitoring, to make sure performance doesn’t regress.
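
As one concrete, deliberately tool-agnostic sketch of on-site monitoring: time the hot paths and send those timings wherever you already graph things. The timed() decorator and emit_timing() hook below are hypothetical names; in practice emit_timing() would forward to your real metrics client (StatsD, Graphite, a hosted APM, and so on).

```python
# A minimal sketch of on-site performance monitoring, assuming a hypothetical
# emit_timing() hook. Swap the logging call for your real metrics client.
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("perf")


def emit_timing(metric: str, milliseconds: float) -> None:
    # Placeholder: forward the measurement to whatever system you graph.
    log.info("%s %.1fms", metric, milliseconds)


def timed(metric: str):
    """Decorator that reports how long the wrapped function takes."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                emit_timing(metric, elapsed_ms)
        return wrapper
    return decorator


@timed("checkout.render")
def render_checkout_page() -> str:
    # Stand-in for a real page-rendering code path.
    return "<html>...</html>"


if __name__ == "__main__":
    render_checkout_page()
```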