[Image: a railway bridge supported by brick arches, surrounded by lush green jungle]
Unrelated image from pexels.com to make this post look nicer in social media shares.

So I was browsing workplace.stackexchange.com questions late one night, as you do, and came across an interesting one: “How to explain business priorities to a programmer”.

That one didn’t go at all the way I thought it would from the title. It turns out the business priorities actually did make sense (the business had this wild idea that queries, even the especially complicated ones, should take less than 4 hours to run) and the programmer was the one being unreasonably rigid about following “best practices”.

But weirdly, one of the answers said that both parties were wrong – that the programmer shouldn’t have been so attached to having nice-looking code and the manager shouldn’t have insisted on using an anti-pattern. What? The anti-pattern (using raw SQL instead of the ORM) improved performance by multiple orders of magnitude! If performance matters at all, you cannot possibly tell me that massively improving it was the wrong thing to do.

Okay, there is some nuance there :) If your ORM performs that badly, it’s likely not configured right for your situation, so it might be better in the long term to fix your config than to keep writing custom SQL for every query that’s the least bit complicated. On the other hand, if you don’t have an expert handy who can tune your ORM config, it might be easier to ditch the ORM, at least for the more complicated queries, and just write some SQL. Or hey, how about a compromise? Use a stored procedure: that’s what they’re for! And if it’s a query you run a lot, it might even make sense to make a view for it.
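To make that concrete, here’s a minimal sketch of the “escape hatch” approach: keep the ORM for routine work and hand-write SQL only for the query that’s actually slow. I’m using SQLAlchemy with an invented orders schema, so treat all the table and column names as assumptions, not anything from the original question.

```python
# A minimal sketch, assuming SQLAlchemy; the "orders" schema is invented.
# The idea: keep the ORM for routine queries, and drop to hand-written SQL
# only for the one report whose generated SQL is pathologically slow.
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///:memory:")  # stand-in for your real database

with engine.begin() as conn:
    conn.execute(text("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)"))
    conn.execute(text(
        "CREATE TABLE order_items (order_id INTEGER, quantity INTEGER, unit_price REAL)"))

# The complicated report, written by hand so you control the join and
# aggregation yourself instead of hoping the ORM generates something sane.
REVENUE_BY_ORDER = text("""
    SELECT o.id, SUM(oi.quantity * oi.unit_price) AS revenue
    FROM orders o
    JOIN order_items oi ON oi.order_id = o.id
    WHERE o.status = 'paid'
    GROUP BY o.id
""")

with engine.connect() as conn:
    for order_id, revenue in conn.execute(REVENUE_BY_ORDER):
        print(order_id, revenue)
```

And if that report runs constantly, the SELECT above could live in the database as a view (or a stored procedure, on engines that have them), shrinking the application side to a one-liner.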

No matter what you end up doing, it’s still not wrong to use a supposed “anti-pattern” if you have a good reason to do it. It’s the context that makes something a bad idea; even the best idea can turn bad if you use it the wrong way. The only way to know whether your “anti-pattern” is actually a bad idea for your particular situation is… to look at your particular situation! You can’t just declare something an anti-pattern without knowing what the person using it is trying to accomplish.

Let’s look at another database-y example. It’s generally considered a bad idea to denormalize your database; every database design course will tell you so. Normalizing your database keeps all of your data consistent by isolating everything that can change independently, so each fact is stored in exactly one place and can never contradict itself. And since you don’t usually bother keeping data at all unless you care that it’s accurate, you almost always want your data nicely normalized.
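Here’s a toy sketch of what “normalized” means in practice. The player/purchase schema is entirely invented (plain sqlite3 for brevity): each fact lives in exactly one place, and the cost is that reading one player’s state takes a join.

```python
# A toy sketch of a normalized design: sqlite3, invented tables.
# Every fact lives in exactly one place, so the data can't contradict itself.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE players (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE purchases (
        id          INTEGER PRIMARY KEY,
        player_id   INTEGER NOT NULL REFERENCES players(id),
        item        TEXT NOT NULL,
        price_cents INTEGER NOT NULL
    );
""")
conn.execute("INSERT INTO players VALUES (1, 'ada')")
conn.execute("INSERT INTO purchases VALUES (1, 1, 'sword', 499)")

# Loading one player's state costs a join: the price of storing
# each fact only once.
rows = conn.execute("""
    SELECT p.name, pu.item, pu.price_cents
    FROM players p JOIN purchases pu ON pu.player_id = p.id
    WHERE p.id = 1
""").fetchall()
print(rows)  # [('ada', 'sword', 499)]
```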

But sometimes correctness isn’t the absolute most important thing. Take browser or mobile games, for instance. If your game takes too long to load, your players will just go do something else, and they might never come back. For a game like that, being fast matters more to your revenue than being perfect. And if something does go wrong and a player gets mad because their data didn’t save correctly, you can just give them game currency until they’re happy again.
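Here’s the denormalized counterpart to the sketch above, with the same caveats (invented names, sqlite3 for brevity): the whole game state is inlined into one row, so loading a player is a single primary-key read instead of a fan of joins.

```python
# The denormalized counterpart: one row per player, whole game state
# inlined as a JSON blob. Loading is a single primary-key lookup.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE player_state (
        player_id INTEGER PRIMARY KEY,
        state     TEXT NOT NULL  -- entire game state as one JSON document
    )
""")
state = {"name": "ada", "gold": 250, "inventory": ["sword", "potion"]}
conn.execute("INSERT INTO player_state VALUES (?, ?)", (1, json.dumps(state)))

# One read, no joins: fast to load. The trade-off is that nothing in the
# schema stops fields inside the blob from drifting out of sync.
(raw,) = conn.execute(
    "SELECT state FROM player_state WHERE player_id = 1").fetchone()
print(json.loads(raw))
```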

In a situation where performance is more important than correctness, normalization is the anti-pattern. Like I said, it’s all about context!

To be clear, there are some things that are just a terrible idea no matter what you’re doing. I don’t care how clever you think you are, your variable names need to make sense. And no matter how well it performs, if your system is so confusing that no one can make updates, then it’s a bad system. But for the most part, whether or not something is an anti-pattern depends more on what you’re trying to accomplish than on the “anti-pattern” itself.

Everybody likes code that makes sense to them, and everybody likes to feel like they’re doing things the right way, but rigidly following “best practices” just isn’t good enough. If a practice doesn’t help you achieve the greater goals of your application (i.e., not just gold-plating your code, but providing value to your users or customers), it’s just not a good idea, no matter how many experts call it a best practice.

And yes, that means you can’t know if your code is good or bad without knowing exactly what you’re trying to do. Welcome to professional programming!