rzwitserloot | 2 days ago
Yes. And one of those vaunted differences between 'senior' and 'medior' is knowing the difference. Because I can confirm what you said: both experiences are real. Brewing it up yourself can blow up in your face. Reaching for external deps that solve the problem can blow up in your face. Knowing which choice to make is _tricky_, and it's hard to confirm you got it right. It doesn't sit well with programmers; either solution will _work_ (you can't write a unit test that 'fails' if you made the wrong choice here), and even if you're willing to accept highly suspect, Goodhart's-law-susceptible metrics such as LOC, you still can't get anywhere, because you're trading off more code you have to write and maintain without help from a larger community against having fewer lines _in total_ as part of the system.

I do not know of any way to do it right other than to apply a ton of experience. And it's really hard to keep yourself honest. Even if you're willing to wait 5 years and then spend some time looking back, how do you really know? Anybody with a bunch of experience has seen enough homebrew stuff asplode in their face to be able to paint a picture of how utterly badly that choice could go. If you chose the 'build it on external deps' route, you can easily tell yourself you did it right by painting a terrible picture of how it would have gone if you had made the other choice. But the reverse is just as true.

I think I'm really good at it. But, writing about it here, I don't have any real basis to make that claim. I look around at other dev shops that make products of similar complexity, and it feels like they need 10x to 100x more resources, have more downtime, and have far larger dev teams. But no doubt bias is creeping in there too, and no 2 software products are 100% comparable in this sense.

I naturally trust homebrewers more because they tend to understand complex technical things better. Someone who can just glue libraries together is lost when I ask them to fire up a debugger and figure out why some interaction is not working. A hopeless NIH sufferer needs to be 'supervised' and their choices about what to write need to be questioned, but that's doable with supervision. "Just git gud and be technically proficient" not so much.

But then maybe that's bias too: homebrewing leads to a codebase that is more easily navigated when you're familiar with debuggers and with reading code to understand it. Reaching for third-party deps a lot leads to a codebase that is more easily navigated when you're familiar with docs and tutorials. These are self-fulfilling prophecies.
vanitywords | 2 days ago
Been in software 25 years and did board design for telecom before that. Vaunted is a vanity word. I get what you mean based upon experience, and I still prefer custom systems that eschew layers of syntax sugar. EE was way harder than managing a code base. The syntax patterns are of a finite set of values.

I won’t re-roll encryption libs, but there’s a lot of “tooling” packages that just add syntax sugar to parse and cart around. They came about in prior eras of sneakerware software and are baked into development habits that no longer make sense.

I for one am excited about using ML to streamline the code base that comprises my preferred Linux system while still building to the usual runtime system. There’s a lot of duplication in code that models can help remove, and a few tools can help unpack into machine state. EE brain informs me there’s no “code” in a running system. Just electricity.

There’s way too much syntax sugar in the software ecosystem that exists just for parsing/marshaling/transpiling between other syntax sugars. Bleh. It’s a big dumb monolith of glyph art that needs to be whacked back like a prairie that needs a controlled fire.