Comment by hetman a day ago

I agree with this sentiment. I find that attempts to create these kinds of universal rules are often the result of a programmer doing one specific, consistently repeating type of data transformation or processing. In that context the rules often make a lot of sense... but try to apply them in a different context and you might end up with a mess. It can also lead to a reactionary style of coding, where we eliminate a bad pattern by taking such an extreme opposite position that the code becomes just as unreadable, only for totally different reasons.

This is not to say we shouldn't be having conversations about good practices, but it's really important to also understand and talk about the context that makes them good. Those who have read The Innovator's Solution will be familiar with a parallel concept. The author introduces the topic by suggesting that humanity achieved powered flight not by blindly replicating the wing of the bird (and we know how many such attempts failed because they applied a good idea to the wrong context) but by understanding the underlying principle and how it manifests within a given context.

The recommendations in the article smell a bit of premature optimisation if applied universally, though I can think of contexts in which they would be excellent advice. In other contexts they can add a lot of redundancy and be error-prone when refactoring, all for little gain.

Fundamentally, clear programming is often about abstracting code into "human brain sized" pieces. What I mean by that is that it's worth understanding how the brain is optimised, how it sees the world. For example, human short term memory can hold about 7±2 objects at once, so write code that takes advantage of that, maintaining a balance without going to extremes. Holy wars about whether OO or functional style is always better, for example, often miss the point that everything can have its place depending on the constraints.
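As a rough illustration of the "brain sized pieces" idea (the names and numbers here are invented, not from the article): a function that takes seven loose values forces the reader to hold all seven in mind at once, while grouping related values into a few named chunks brings the count back toward the 7±2 range.

```python
from dataclasses import dataclass

# Before: price(unit_cost, qty, discount_pct, base, per_kg, weight_kg, tax_rate)
# forces a reader to juggle seven independent values. Grouping them into
# two named chunks plus one scalar leaves only three "objects" to track.

@dataclass
class LineItem:
    unit_cost: float
    qty: int
    discount_pct: float  # e.g. 0.10 for 10%

@dataclass
class Shipping:
    base: float
    per_kg: float
    weight_kg: float

def price(item: LineItem, shipping: Shipping, tax_rate: float) -> float:
    """Three chunks (item, shipping, tax) instead of seven loose numbers."""
    goods = item.unit_cost * item.qty * (1 - item.discount_pct)
    ship = shipping.base + shipping.per_kg * shipping.weight_kg
    return round((goods + ship) * (1 + tax_rate), 2)

total = price(LineItem(10.0, 3, 0.10), Shipping(5.0, 2.0, 1.5), 0.20)
# goods = 27.0, ship = 8.0, total = 42.0
```

The point isn't the dataclasses themselves; it's that the chunking matches how the reader's short-term memory works, which is exactly the kind of context-dependent judgement that a universal rule can't capture.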