patwolf 4 hours ago

These rules apply equally well to system architecture. I've been trying to talk our team out of premature optimization (a Redis cluster) and fancy algorithms (Bloom filters) to compensate for poor data structures (the database schema) before we know whether performance is going to be a problem.
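A toy illustration of the "fix the data structure first" point (the table and column names here are invented, using SQLite only as a stand-in): a membership check that a Bloom filter might be proposed to accelerate is often just a missing index on the underlying table.

```python
import sqlite3

# Hypothetical schema: a users table that is queried heavily by email.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany(
    "INSERT INTO users (email) VALUES (?)",
    [(f"user{i}@example.com",) for i in range(1000)],
)

# Without an index, a membership check is a full table scan --
# the situation an auxiliary probabilistic layer is often proposed to paper over.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT 1 FROM users WHERE email = ?",
    ("user500@example.com",),
).fetchone()
print(plan[-1])  # e.g. a SCAN over the whole table

# Fixing the data structure itself is one line:
conn.execute("CREATE INDEX idx_users_email ON users (email)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT 1 FROM users WHERE email = ?",
    ("user500@example.com",),
).fetchone()
print(plan[-1])  # e.g. a SEARCH using idx_users_email
```

The exact plan text varies by SQLite version, but the scan-versus-index-search distinction is the point: the simpler fix lives in the schema, not in an extra layer of architecture.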

Even when we know with 100% certainty that performance will be subpar, requirements change frequently enough that it's often not worth the cost of adding architectural complexity too early.

bob1029 4 hours ago

> Even knowing with 100% certainty that performance will be subpar

I think there is value in attempting to do something the "wrong way" on purpose, to some extent. I have walked into many situations where I was convinced the performance of something would suck, only to be corrected harshly by the realities of modern computer systems.
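As a toy illustration of that point (invented numbers, standard library only): a brute-force scan that looks hopeless on paper is frequently fast enough in practice, because a tight linear pass over contiguous data is exactly what modern hardware is good at.

```python
import time

# Hypothetical "wrong way": linear-scan membership over a million records,
# instead of building an index or a Bloom filter up front.
records = list(range(1_000_000))

start = time.perf_counter()
# Two of these probes force a full scan (999_999 is last, -1 is absent).
hits = sum(1 for needle in (3, 999_999, -1) if needle in records)
elapsed = time.perf_counter() - start

print(f"{hits} hits in {elapsed * 1000:.1f} ms")
```

On a typical machine this finishes in a few tens of milliseconds; whether that is "subpar" only matters once real requirements say so.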

Framing things as "yes, I know the performance is definitely not ideal in this iteration" puts that monkey in a proper cage until the next time around. If you don't frame it this way up front, you might be constantly baited into chasing the performance monkey around. Its taunts can be really difficult to ignore.