▲ | bbkane 5 days ago |
"optimized for change" really only works well if you can predict the incoming changes. Common tools used for this "optimization" often raise the complexity and lower the performance of the system. For example, a db with a single table with just a key and a value is very flexible and "optimized for change" but it offers lower performance (in most cases) and is harder to reason about. I also frequently see people (me too) prematurely make abstractions (interfaces, extra tables, etc) because they're "optimizing for change". Then that part never changes OR it changes in a way that their abstraction doesn't abstract over OR they figure out a better abstraction later on when the app has matured a bit. Then that part of the code is at best wasted space (usually it needs to be rewritten yet no one gets time to do that). Of course, it's also foolish to say "never abstract". I almost always find it worth it to abstract over I/O, just so I can easily add logging, dual writes, or mock it. And when a change is obviously coming down the line it makes sense to plan for it. But usually I'm served best by trying to keep most of my computation pure functions (easy to test), doing as little as possible in the I/O path (it should just persist or print stuff so I can mock it) and otherwise write obvious "deletable" code that does one thing so I can debug it and, only if necessary, replace with a better abstraction if I need to. | ||
▲ | ninetyninenine 5 days ago | parent [-] |
>"optimized for change" really only works well if you can predict the incoming changes. functional programming is the paradigm most optimized for modularity and therefore change. It's the best we have but it's limited in scope. |