wheybags 3 days ago

This is a sentiment that I've seen expressed in comment sections many times. I've been programming professionally now for 10 years, and it just doesn't resonate with my experience. Problems with build systems for external dependencies, package managers, and underfeatured / overcomplicated / buggy third-party dependencies have been by far the worst issues in my career, compared to problems with homebrewed systems.

I'm not saying you're wrong; I don't doubt that many people have the opposite experience. It just makes me feel a bit alien when I read comments like this.

MobiusHorizons 2 days ago | parent | next [-]

Thanks for saying this, I feel this way all the time even though I know it’s against the prevailing wisdom.

My experience is that in the pursuit of not reinventing the wheel, I am frequently told to use a dependency that doesn't allow us to solve the whole problem, or prevents us from making the user experience fast, or cannot be made to understand our data model. It's all well and good to use a tool that exists, but using the wrong tool just because it exists is madness. Even worse is when dependencies are deprecated or our use cases become unsupported. Honestly, I would prefer to just build everything above the database layer in house; that way we at least know what we can and can't deliver, and have some chance of fixing things when they break.

PartiallyTyped 2 days ago | parent | next [-]

I am practically having this conversation at work. There's a sister team with a great tool for benchmarking what they are working on, but it is not convenient for our needs, and I am told to "just do the plumbing to make it work for our needs". Reality is that there are far, far easier ways to achieve what we need than doing all that plumbing, adding more layers of abstraction on top of what is a side project of an adjacent team.

anal_reactor 2 days ago | parent | prev | next [-]

The problem with being smarter than the average programmer is that your insights will rarely be considered, even if they're correct, because they're new and controversial. From the perspective of an average programmer, a bad programmer who doesn't know what they're doing and a programmer using techniques so advanced that they cannot be understood are effectively indistinguishable, so an average team will treat geniuses and morons the same way.

I feel like the collapse of the tech bro coincided with the masses going into programming, which changed the culture from promoting innovation and development into simply following whatever best practices someone had already written, turning programming from a creative job into yet another repetitive office job. This is also, in my opinion, the true reason why salaries collapsed. Most businesses don't need creative specialists, they need code monkeys, and most people aren't creative specialists, they're code monkeys. So why would anyone even discuss paying the salary a creative specialist deserves?

Viliam1234 a day ago | parent | next [-]

There is a legitimate concern that if the smarter programmer quits the job, the remaining average programmers will not be able to maintain the code.

I think a smart solution would be to teach the average programmers the new concepts. Many of them would probably be happy to learn, and the company would benefit from having everyone know a bit more and use better solutions. But for some reason, this usually doesn't happen.

anal_reactor a day ago | parent [-]

>Many of them would probably be happy to learn

No. Most people don't like being told that they're wrong, drama ensues. They say they want to learn, but in reality, they don't.

throwaway2037 2 days ago | parent | prev [-]

To summarise your first paragraph: This programming meme? https://pbs.twimg.com/media/FiMbeF_XoAAYAQb.jpg

And your second paragraph sounds like sour grapes. I have no idea what "the collapse of the tech bro coincided with..." means. Most programmers are working on CRUD apps. How creative do you need to be?

solatic 2 days ago | parent | prev [-]

When dependencies don't deliver 100% of what you want, you should be able to work with upstream to get what you want added.

If upstream won't cooperate with you, then fork. It's still usually better to start from a battle-hardened codebase than it is from complete scratch.

dylan604 3 days ago | parent | prev | next [-]

Not targeting you, but the industry in general. In every other industry I've been in outside of software dev, 10 years is not considered elder. You're just now becoming not a greenhorn. You're just now getting your sea legs. It's amazing what additional experience happens after year 10.

To that effect, Rust (2015) is 9 years old, and Go and Node are 15 years old, while Python (1991) is 33 years old. Just putting things in a different perspective.

shermantanktop 2 days ago | parent | next [-]

I’ve been in this game for 30 and I agree with GP. “I won’t build that simple thing from scratch, I’ll just import this thing that does approximately what I want.”

We should banish the word “import” in favor of “take a dependency on someone else’s code, including the stability of the API, the support model, willingness to take patches, testing philosophy…”

Reputation is a rough proxy; inspecting the code can help. But when the thing you built your house of cards on falls over, you often can’t fix your house, and have to build a new house.

Obviously this applies more to utility code than it does to entire languages. But even there, Apple has broken their Swift syntax enough to release tools that upgrade your code for you…and that’s the best case scenario.

hnlmorg 2 days ago | parent | prev | next [-]

I’ve been in the industry for > 20 years and if anything, I think most people are too scared or lazy to reinvent code.

I'm not suggesting the earlier argument about NIH (not invented here) syndrome doesn't exist. But I've certainly never seen it on the scale the earlier poster claimed. If anything, I see people getting less inclined to reinvent things because there's so much code already out there.

But maybe this is a domain specific problem? There does seem to be a new JavaScript frontend framework released every week.

YZF 2 days ago | parent [-]

I've been in the industry for >30 years ;)

I'm not sure what the proposal is.

- Don't use an OS. Write your own. Linux? Boring.

- Design your own CPU.

- ext3 or xfs? Nah write your own.

- Write your own database.

- Ethernet. Too boring. Create a new physical layer for networking.

- Naturally create your own programming language. That'll make it much easier to find people when you have to expand the team.

Seriously, build vs. buy and NIH have always been with us. There's a time to build and there's a time to buy/reuse. Where are you adding value? What's good enough for your project? What's strategic vs. tactical? How easy is it to change your mind down the road? What are the technical capabilities of the team? How do different options impact your schedule/costs? How do they impact quality? In the short term? In the medium term? In the long term?

ATMLOTTOBEER 2 days ago | parent | next [-]

I’ve been in software for over 40 years (yes I’m that old ), and in my humble opinion it’s always correct to build. It keeps things fresh.

YZF 2 days ago | parent [-]

The reality is there is no way to build everything. You want to do scientific computing: do you use libraries that have been optimized for 50 years, or do you write your own? You want to do cryptography: do you build your own? Pretty much everyone working on LLMs today is leveraging things like NCCL, CUDA, PyTorch, and job scheduling frameworks.

Let's face it. Nobody builds everything from scratch. The closest is companies like Google who due to sheer scale benefit from building everything from hardware to languages and even for them it's not always clear whether that was the right thing for the business or something they could afford to do because they had lots of money.

Build the things that add value. Don't rebuild something that already works. That's why we have the old saying: don't reinvent the wheel. If you have a working wheel, re-inventing it might be fun, but it's usually not the best use of time. With the time you've saved, build cool things that add value.

tehjoker 2 days ago | parent [-]

gotta say, having written some scientific computing code, the libraries out there do not always cover the exact operation you need and are not always using the best algo in the literature. i was able to beat the existing ecosystem 6x head to head and thousands of times faster for my use case. ymmv ofc depending on the problem.

that said, it was not easy!!

YZF a day ago | parent [-]

I worked on some proprietary video/image encoding application. In that context we hand wrote things like colour space conversions, wavelet transforms, arithmetic coders, compression algorithms, even hashing functions, in SIMD, and we got better performance than anything off the shelf. We still used some off-the-shelf code where we could (e.g. Intel's hand written libraries). The thing is that this was the core of our business, and our success depended on how performant these pieces were. That was also some time back; maybe today the situation is different. In this sort of situation you should absolutely put in the effort.

But that typically accounts for some small % of the overall software you're going to be a user of. This is really just another variation of the premature optimization statement: "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%". So if you're in the 3% then by all means go for it (you gotta). But if you're in the 97% it's silly to not use what's out there (other than for fun, learning etc.)

spookie 2 days ago | parent | prev | next [-]

Let's be honest, nobody is saying to rebuild the world from scratch.

The stance for in-house built tools and software is a much more balanced act than that. One that prioritises self-reliance and fosters institutional knowledge while assessing the risks of building one more thing in-house. It promotes a culture where employees stay, because they know they might be able to create great impact. It also has the potential to cut a lot of the fat out of money being spent on third parties.

Let's be real, most companies have built Empire State Buildings out of cards. Their devs spend most of their time fixing obscure problems they don't understand, and I'm not talking about programming, but about their build processes and dependency hell.

It's no wonder that the giants of today, who have survived multiple crises, are the ones who took the risk of listening to those "novice" enthusiastic engineers.

Don't kill the enthusiasm, tame it.

YZF 2 days ago | parent [-]

Sure. We should harness enthusiasm and channel it in the right direction.

I'm not sure I agree the giants of today are built on the work of enthusiastic novices. Amazon and Microsoft have always had a ton of senior talent. Meta started with novices but then a lot got reworked by more experienced people.

You might get by with sheer enthusiasm and no experience but often that leads you to failure.

spookie a day ago | parent [-]

Fair.

ConspiracyFact 2 days ago | parent | prev | next [-]

This is a bit disingenuous, don't you think? There's technically a spectrum from low-level to high-level code, but in practice it's not too difficult to set a limit on how far down the stack you're willing to go while building. Writing a new testing framework is qualitatively different from writing a new OS or filesystem, and you know it just as well as everyone else does.

YZF 2 days ago | parent [-]

I was trying to make a point... Apparently not very well ;)

But let's take on the testing framework question. When I work in Go or in Python I use the testing frameworks that are part of that ecosystem. When I worked with C++ I'd use the Boost testing framework. Or Google's open source testing framework. Engineers I work with that do front-end development use Playwright (and I'm sure some other existing framework for unit tests).

Can't you do your own thing? Sure. You'd be solving a lot of problems that other people have already solved, and over time the thing you did on your own will require more work. You need to weigh all of that vs. using something off the shelf. Nine times out of ten, most people should use tooling that exists and focus on the actual product they want to build.

That said I work for a large company where we build a lot of additional in-house tooling. It makes sense because: a) it's a specialized domain - there's nothing off the shelf that addresses our specific needs. b) we are very large and so building custom tools is an investment that can be justified over the entire engineering team. We still use a ton of stuff off the shelf.

I don't really see what the parent is describing. I think most of the time people choose to use existing bits for good reasons. When I started my career you pretty much had to do most stuff yourself. These days you can almost always find a good off-the-shelf option for most "standard" things you're trying to do. If you want to write your own testing framework for fun, go for it. If you're trying to get something else done (for business or other reasons) it's not something that usually makes sense. That said, it's not like we have a shortage of people trying to do new things or revisit old things, for fun or profit. We have more than ever (simply because we have more people than ever doing software in general).

yellowapple 2 days ago | parent | prev [-]

"If you wish to make an apple pie from scratch, you must first invent the universe."

Cthulhu_ 3 days ago | parent | prev | next [-]

It just feels different in software development because things have moved very fast, I'd say especially since GitHub rose to prominence. The number of software developers on the market has also increased exponentially since then, so the share of (relatively) junior developers is much higher than that of those with 15-20+ years of experience.

jltsiren 3 days ago | parent [-]

The number of software developers has maybe doubled in the last 20 years. The number of senior developers has "always" been low, because the field suffers from unusually high attrition. Many people find that software is not for them, many switch fields after losing their jobs in an economic downturn, some move to management, and some make too much money to bother continuing until retirement age.

samatman 2 days ago | parent | next [-]

I'm reasonably sure that this estimate is far off the mark. The numbers I've seen suggest that the number of new software developers entering the industry has doubled every five years since at least the mid 90s. That's not the same metric as total number of developers, but it may as well be, and it definitely doesn't add up to a mere doubling of the total in twenty years.

mrkeen 3 days ago | parent | prev [-]

Has there actually been attrition? Exponential growth is enough to explain "many more juniors than seniors" at any time in the past, present or future.

Also for attrition to be the cause, you'd need a lot more seniors dropping off than juniors.

Retric 3 days ago | parent | next [-]

None of my friends who graduated with me are still software developers and I’m several years from retirement age.

There's a bunch of filters. Many people quickly realize they don't enjoy development; next is openings in management. One of the big ones is that at ~40 you're debt free, have a sizable nest egg, and start wondering if you really want to do this for the next 20 years.

A part of this is the job just keeps getting easier over time. Good developers like a challenge, but realize that the best code is boring. Tooling is just more robust when you’re doing exactly the same things as everyone else using it, and people can more easily debug and maintain straightforward code. So a project that might seem crazy difficult at 30 starts to just feel like a slog through well worn ground.

Having significant experience in something also becomes a trap as you get rewarded for staying in that bubble until eventually the industry moves on to something else.

lifeisstillgood 2 days ago | parent | next [-]

I recently hit thirty years of professional software development, in companies large and small, profit and non-profit, proprietary and FOSS. I have led teams of forty and sat in a corner as the only developer, and one thing I know in my bones: I love making software, and money just means I get to code what I want instead of what The Man wants.

In fact I already have my retirement planned - a small near flat in Argostolli, a walk down to the coffee bars on the harbour and a few hours adding code and documentation to a foss project of my choice before heading to the beach with grandkids.

Now, affording retirement might be interesting, but not having coding in it would be like not having reading and writing.

oblio 2 days ago | parent | prev [-]

You're probably from a privileged environment such as working in the US (probably in a top location) and probably from a top university or you were there at the right time to join a top company as it grew rapidly.

The first paragraph probably applies to 1-10% of developers worldwide...

Retric 2 days ago | parent | next [-]

The only part of that that applies to my friends is living in the US. Programming pays well just about anywhere relative to the local market, even if the absolute numbers are less extreme.

I also don't mean early retirement. Still, combining minimal schooling, high demand, reasonable pay, and the basic financial literacy that comes from working with complex systems adds up over time.

throwaway2037 2 days ago | parent | prev [-]

Yeah, there is no way that the majority of EU-based developers can retire at 45 on their meagre salaries.

jltsiren 3 days ago | parent | prev [-]

The exponential growth has been something like 3-4%/year, or 2x in 20 years. Though it's hard to find useful statistics that take different job titles and changing nature of the industry properly into account.

If you had asked me in 2010, I would have said that the median software developer lasts 5-10 years in the industry. A lot of people left the field after the dot-com bubble burst. The same happened again on a smaller scale in the late 2000s, at least in countries where the financial crisis was a real-world event (and not just something you heard about in the news). But now there has been ~15 years of sustained growth, and the ratio of mid-career developers to juniors may even be higher than usual.

WalterBright 2 days ago | parent | prev | next [-]

It often takes 10 years or more of use before you discover that your technique is execrable.

For example, most use of macros, especially when you invent your own language implemented as macros.

WalterBright 2 days ago | parent [-]

A lot of people disagree with me on that. Wait till they try to get other people to understand their code :-/

Me, I've eliminated nearly all the use of macros from my old C code that is still around. The code is better.

I suspect that heavy use of macros is why Lisp has never become mainstream.

kazinator 2 days ago | parent [-]

Heavy use of macros could be why C went mainstream.

Macros gave C efficient inline functions without anything having to be done in the compiler.

Doing things like "#define velocity(p) (p)->velocity" would instantly give a rudimentary C compiler with no inline functions a performance advantage over a crappy Pascal or Modula compiler with no inline functions, while keeping the code abstract.

And of course #if and #ifdef greatly help with situations where C does not live up to its poorly deserved portability reputation. In languages without #ifdef, you would have to clone an entire source file and write it differently for another platform, which would cause a proliferation due to minor differences (e.g. among Unixes).

Ah, speaking of which: C's #ifdef allowed everyone to have their own incompatible flavor of Unix, with its own different APIs and header files, yet get the same programs working.

An operating system based on a language without preprocessing would have hopelessly fragmented if treated the same way, or else stagnated due to discouraging local development.

Thanks in part to macros, Lisp people were similarly able to use each other code (or at least ideas) in spite of working on different dialects at different sites.

WalterBright 2 days ago | parent [-]

You're quite right in that early C was a primitive compiler, and adding a macro processor was a cheap and easy way to add power.

Using the macro preprocessor to work around some fundamental issues with the language is not what I meant.

I meant devising one's own language using macros. The canonical example:

    #define BEGIN {
    #define END }
We laugh about that today, but in the '80s people actually did that. Today's macros are often just more complicated attempts at the same thing.

The tales I hear about Lisp are that a team's code is not portable to another team, because each team invents its own macro language in order to be able to use Lisp at all.

lor_louis a day ago | parent | next [-]

To be fair, I'd rather type BEGIN instead of ??< or whatever the trigraph is supposed to be. We tend to forget that a lot of computers didn't have the keys to type "mathematical" symbols.

WalterBright a day ago | parent [-]

EBCDIC was already dead in the 1980s. Nobody ever used the trigraphs except for one company that hung on for years until even the C++ community decided enough was enough and dumped them.

Besides, what people wrote was:

    #define BEGIN {
not:

    #define BEGIN <<?
kazinator a day ago | parent | prev [-]

Stephen Bourne used such macros in the Bourne Shell sources to make the code resemble Algol.

The source is very clear and readable.

WalterBright a day ago | parent [-]

Have you adopted his macros in your own projects?

kazinator a day ago | parent [-]

No because even if I could identify a benefit to these macros (which I can't in the contexts in which I work) there's a cost to using them.

Macros which simply transliterate tokens to other tokens without performing a code transformation do not have a compelling technical benefit, only a non-technical benefit to a peculiar minority of users.

In terms of cost, the readability and writability are fine. What's not fine is that the macros will confuse tooling which processes C code without necessarily expanding it through the preprocessor: tooling like text editing modes, identifier cross-referencers, and whatnot.

I've used C macros to extend the language with constructs like exception handling. These have a syntax that harmonizes with the language, making them compatible with all the tooling I use.

There's a benefit because the macro expansions are too verbose and detailed to correctly repeat by hand, not to mention to correctly update if the implementation is adjusted.

jrk 3 days ago | parent | prev | next [-]

Rust was started in 2006 and launched publicly, I believe, in 2009, the same year as Go. The point stands that these are still fairly new, but it’s not nearly that new.

ChrisSD 3 days ago | parent | next [-]

Rust 1.0 was released in 2015 making it almost ten years old.

Rust, unlike Go, was largely developed in public. It also changed significantly between its initial design and 1.0, so it feels like "cheating" to count pre-release versions.

Still, a decade is a significant milestone.

ternaryoperator 2 days ago | parent [-]

That's right. One of the knocks on those early versions was that every new release broke previous code in significant ways. Which is one reason that v. 1.0 was so important to the community. They could finally commit code using a stable language.

cmrdporcupine 3 days ago | parent | prev [-]

Early Rust was a very different beast.

But could say the same about Python pre-1995 or so.

My biggest problems with Rust, though, are Cargo and Crates.io, not the language.

dhosek 2 days ago | parent | next [-]

Weird, Cargo and crates.io are why I ended up deciding on Rust for developing finl rather than C++. The lack of standardized build/dependency management in C++ was a major pain point.

anacrolix 3 days ago | parent | prev | next [-]

crates and cargo are better than Rust actual

cmrdporcupine 2 days ago | parent [-]

... for you

gary_0 2 days ago | parent | prev [-]

Cargo and crates.io are something C/C++ developers would kill for (which is why cargo is what it is, I think).

cmrdporcupine 2 days ago | parent [-]

Just because there is an absolute shitshow for C/C++ build systems doesn't automatically make Cargo & Crates.io good.

There is a fundamental philosophical disagreement I have with the NPM style of package management and this method of handling dependencies. Like NPM, Crates.io is a chaotic wasteland, destined for a world of security & license problems and transitive dependency bloat.

But honestly I'm sick of having this out on this forum. You're welcome to your opinion. After 25 years of working with various styles of build and dependency management, I have mine.

gary_0 2 days ago | parent [-]

I wasn't disagreeing with you. My comment was implying that cargo (and arguably rust itself to some extent) was kind of a knee-jerk response to the insane parts of C/C++, for better and also for worse.

cmrdporcupine 2 days ago | parent [-]

Ok that's fair, sorry to be defensive.

But I actually think it's more inspired by people coming from the NodeJS ecosystem than people coming from C++.

roenxi 2 days ago | parent | prev [-]

Well, "elders" are the people who have been around the longest, so if the industry has >30-year veterans wandering around, then the elders will have around that much experience. But learning in an industry is generally logarithmic: most of the lessons get picked up in the first 1-3 years, and after that there are only occasional new things to pick up.

If anything software seems to be unusually favourable to experience because the first 5 years of learning how to think like a computer is so punishing.

marcosdumay 3 days ago | parent | prev | next [-]

I've been there, on both sides, with homebrew ideas pushed from up and down, some that worked nicely, and some that were complete disasters...

And I agree with you. The problems with third party dependencies are way worse than any in-house complete disaster.

But that happens almost certainly because everybody is severely biased into adding dependencies. Make people biased into NIH again, and the homebrew systems will become the largest problems again.

eitland 3 days ago | parent | next [-]

In the two last projects I have worked on I have been lucky to work with great younger developers that neither invent things from scratch nor insist on pulling in exotic dependencies.

We have used only mainstream technologies like Quarkus or Spring Boot and plain React with TypeScript, plus the absolute bare minimum of dependencies.

I have worked with a number of good devs over the years but it is amazing how productive these teams have been. (Should probably also mention that we were also lucky to have great non technical people on those teams.)

lukan 3 days ago | parent | prev [-]

"because everybody is severely biased into adding dependencies"

When I ask ChatGPT to show me an example of something with JavaScript and Node, it always brings me a solution that needs an external library. So I have to add "without external dependencies", and then it presents a nice and clean solution without all the garbage I don't need.

So apparently adding yet another library seems normal to many people; otherwise this behavior would not be replicated in an LLM.

sgarland 2 days ago | parent | next [-]

Same. My standing system prompt for Claude is “do not suggest any 3rd party libraries unless I ask for other options.”

Python perhaps isn’t quite as bad as JS in this regard, but people still have a tendency to pull in numpy for trivial problems that stdlib can easily solve, and requests to make a couple of simple HTTP calls.

fragmede 2 days ago | parent [-]

Fascinating. What sort of web requests are you doing that it's not just easier to use requests? I use requests pathologically, though I use things in the stdlib when it comes to it. Personally I'm more surprised that it isn't in the stdlib by this point.

sgarland a day ago | parent [-]

It's not a matter of easier, it's that I'm against adding dependencies when they're not meaningfully adding value. It's not that hard to use urllib.request if you just need to pull down some file in a script, or fire a web hook, etc.

If you need connection pooling, keep-alive, or any of the other features that requests adds, then sure, add it.

OtomotO 3 days ago | parent | prev | next [-]

It simply depends on what you need.

I am gladly writing my own left-pad.

I am gladly using something like three.js if I need it.

The problem is the extreme stances. "Don't add any dependency" is as stupid as pulling in dependencies for every second line.

lukan 3 days ago | parent | next [-]

My point was that in the js/node universe, the extreme stances seem to be the default already.

(Unless the LLM is heavily biased towards certain code, maybe because blog entries promoting a library got too much weight. But looking at a random js source of a webpage, they do mostly ship tons of garbage.)

wizzwizz4 3 days ago | parent | prev [-]

Please don't write your own left-pad. It's built into the standard library, under the name (String.prototype.)padStart.

shermantanktop 2 days ago | parent | next [-]

That example was not picked at random.

OtomotO 2 days ago | parent | prev [-]

Exemplum docet

wizzwizz4 2 days ago | parent [-]

I have seen this argument made many times, but none of the examples used illustrated it properly. Past a certain point, one becomes suspicious of the argument itself.

chipsrafferty 2 days ago | parent [-]

Have you ever wondered why padStart is part of the standard library?

You are unaware of a core part of JavaScript history, which is why you don't understand why "I'm not importing a library to do left pad" is not only a proper example, but THE BEST example.

String.padStart was added in 2017.

This happened in 2016 https://en.m.wikipedia.org/wiki/Npm_left-pad_incident

wizzwizz4 a day ago | parent [-]

The left-pad incident was a problem with the build toolchain, not a problem with using a dependency. String padding is one of those fiddly things that you have to spend a couple of minutes on, and write 4–5 tests for, lest you get an off-by-one error. It makes perfect sense to bring in a dependency for it, if it's not available in the standard library, just as I might bring in a dependency for backprop (15 lines: https://github.com/albertwujj/genprop/blob/master/backprop.p...). My personal style is to reimplement this, but that doesn't mean it's foolish or unjustified to bring in a dependency.

It is, however, almost never justified to bring in a dependency for something that's in the standard library. The correct solution for that, in JavaScript-for-the-web, is a shim. left-pad is not a suitable example.

A better example would be https://www.npmjs.com/package/ansi-red:

  /*!
   * ansi-red <https://github.com/jonschlinkert/ansi-red>
   *
   * Copyright (c) 2015, Jon Schlinkert.
   * Licensed under the MIT License.
   */
  
  'use strict';
  
  var wrap = require('ansi-wrap');
  
  module.exports = function red(message) {
    return wrap(31, 39, message);
  };
But while this makes a point, does it really make the original point? This ansi-red library has at least two footguns, right off the bat.
pca006132 3 days ago | parent | prev [-]

This depends on the language. For C/C++, without a package manager, people hesitate to add external dependencies (especially when high performance is needed, e.g. in game engines).

lukan 3 days ago | parent [-]

It surely does, but I think in most languages it is a bad idea to add an external dependency where it isn't needed. Like in the cases I mentioned, it was just standard stuff, already covered by the browser standard libraries.

So when I just need a simple connection to a WebSocket, I don't need that external baggage. But for high performance graphics, yeah, I won't write all the WebGL code by hand, but use Pixi (with the option of writing my own shader where needed).
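The "simple connection" case really is only a handful of lines. A sketch assuming the standard browser-style WebSocket API, with the socket factory injectable so the wiring can be checked without a network (the URL, handler names, and retry interval here are all illustrative):

```javascript
// Bare-bones connection with reconnect-on-close. `socketFactory` defaults
// to the real WebSocket constructor but can be swapped for a fake.
function connect(url, onMessage, socketFactory = (u) => new WebSocket(u)) {
  const ws = socketFactory(url);
  ws.onmessage = (event) => onMessage(event.data);
  ws.onclose = () => setTimeout(() => connect(url, onMessage, socketFactory), 1000);
  return ws;
}

// Exercising the wiring with a fake socket, no network involved.
class FakeSocket {
  constructor(url) { this.url = url; }
}
let received;
const ws = connect('wss://example.test', (data) => { received = data; },
                   (u) => new FakeSocket(u));
ws.onmessage({ data: 'hello' });
```

That's the whole "library": one function you can read, debug, and extend with backoff or heartbeats when you actually need them.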

eitally 3 days ago | parent | prev | next [-]

I think this depends a lot on whether you're already using high level languages and lots of external libraries vs doing lower level programming using something like C/C++. I managed a large dev team in a Microsoft shop and it would never have occurred to anyone to ever create their own compiler. Even the most experienced programmers would have just continued to brute force things atop .Net's compiler until it eventually "worked". The result, combined with esoteric and poorly understood business requirements, was fragile spaghetti code few could parse for bugs or updates, but it was still several layers above the compiler.

This attitude is by far the most common among "enterprise developers", and one of the big differences between people building things from preexisting building blocks vs -- as witnessed from my 8 years at Google later -- people who think they're smart enough to build everything from the ground up, and do so, using primitive blocks and custom compilers created by similarly hubristic engineers who came before them.

Ymmv, but this has been my experience over the past 25 years.

neonsunset 2 days ago | parent [-]

To be fair, expression trees offer a nice capability: write your mini-DSL, map it to expressions, and then compile it.

It’s just an uncommon attitude in most enterprise teams; it has less to do with the language and more to do with the part of the industry. I wish more teams knew the tools they already have at their disposal.
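The .NET feature being described is expression trees (System.Linq.Expressions), but the shape of the idea transfers: a small AST gets "compiled" into an ordinary function you can call repeatedly. A toy sketch of that move, in JavaScript to match the thread's running example (node shapes and field names are all made up):

```javascript
// Toy expression-tree compiler: each AST node compiles to a closure,
// composed bottom-up, so evaluation pays no interpretation cost per call.
function compile(node) {
  switch (node.type) {
    case 'const': return () => node.value;
    case 'field': { const n = node.name; return (row) => row[n]; }
    case 'add':   { const l = compile(node.left), r = compile(node.right);
                    return (row) => l(row) + r(row); }
    case 'mul':   { const l = compile(node.left), r = compile(node.right);
                    return (row) => l(row) * r(row); }
    default: throw new Error('unknown node type: ' + node.type);
  }
}

// price + qty * 2, built once, then usable as a plain function.
const total = compile({
  type: 'add',
  left:  { type: 'field', name: 'price' },
  right: { type: 'mul',
           left:  { type: 'field', name: 'qty' },
           right: { type: 'const', value: 2 } },
});
```

In .NET the equivalent tree would be lowered to IL via `Expression.Lambda(...).Compile()`; the closure composition above is the low-tech version of the same trick.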

segfaltnh 2 days ago | parent | prev | next [-]

I've spent most of my career in the infrastructure space and I agree with this so much. These days the prevailing wisdom is just to use 20 off-the-shelf open source components and spend your entire day debugging YAML integrations. I think we've lost our minds a bit because of this prevailing wisdom that building a simple wheel that does the 10% of this you actually need is somehow self-indulgent or negligent or both.

tazjin 3 days ago | parent | prev | next [-]

It's people trying to generalise some rule over the wrong thing. The right thing is that, in both directions, how the project goes is simply a skill question.

You have unskilled, sloppy developers? The homebrew project AND the third-party integration will turn out a mess.

jesse__ 3 days ago | parent | prev | next [-]

Strong agree here. I tend to try as hard as I can to write as much as I can in house so that when shit hits the fan, I have a great chance of being able to do something about it.

Shelling out to an AST parsing library that happens to be slow? Well, shit, that sucks. Guess your compiler's just slow now.

cmrdporcupine 3 days ago | parent | prev | next [-]

The argument is not between NIH and external deps. The argument is over needless complexity and brittle unreliable bits (which can come through either channel) vs keeping things simple.

In my experience, younger developers will push both (in-house and external) directions at once, actually. Building out complex edifices with sharp corners over a maze of transitive dependencies that few understand.

It's the same thing: A fantasy that a framework will solve the problem, combined with a fantasy that they can develop said framework. It's an urge we all suffer from but some of us have learned the hard way to be careful about. (And others who are great at self-promotion have been rewarded for it by naive investors and managers.)

Finding simple solutions takes humility and time.

quantadev 3 days ago | parent | prev | next [-]

This kind of thing admittedly isn't as pervasive in the last decade as it was in the two before, so if you've only been a dev since 2014 you may not have seen it. The old people like me will get it tho.

rzwitserloot 2 days ago | parent | prev | next [-]

Yes. And one of those vaunted differences between 'senior' and 'medior' is knowing the difference.

Because I can confirm what you said: Both experiences are real.

Brewing it up yourself can blow up in your face.

Reaching for external deps that solve the problem can blow up in your face.

Knowing which choice to make is _tricky_ and is hard to confirm. It doesn't sit well with programmers; either solution will _work_ (you can't write a unit test that 'fails' if you made the wrong choice here), and even if you're willing to accept highly suspect, Goodhart's law-susceptible metrics such as LOC, you still can't get anywhere because it's trading off more code you have to write and maintain without help from a larger community against having fewer lines _in total_ as part of the system.

I do not know of any way to do it right other than to apply a ton of experience. And it's really hard to keep yourself honest. Even if you're willing to wait 5 years and then spend some time looking back, how do you really know?

Anybody with a bunch of experience has seen enough homebrew stuff asplode in their face to be able to paint a picture with how utterly badly that choice could go. If you chose the 'build it on external deps' route you can easily tell yourself you did it right by painting a terrible picture of how it would have gone if you made the other choice.

But the reverse is just as true.

I think I'm really good at it. But, writing about it here, I don't have any real basis to make that claim. I look around at other dev shops that make products of similar complexity and it feels like they need 10x to 100x more resources, have more downtime, and have far larger dev teams. But no doubt bias is creeping in there too, and no 2 software products are 100% comparable in this sense.

I naturally trust homebrewers more because they tend to understand complex technical things better. Someone who can just glue libraries together is lost when I ask them to fire up a debugger and figure out why some interaction is not working. A hopeless NIH sufferer needs to be 'supervised' and their choices about what to write needs to be questioned, but, that's doable with supervision. "Just git gud and be technically proficient" not so much. But then maybe that's bias too - that leads to a codebase that is easier navigated when you're familiar with debuggers and reading code to understand it. Reaching for third party deps a lot leads to a codebase that is easier navigated when you're familiar with docs and tutorials. These are self fulfilling prophecies.

vanitywords 2 days ago | parent [-]

Been in software 25 years and was board design for telecom before that.

Vaunted is a vanity word.

I get what you mean based upon experience and I still prefer custom systems that eschew layers of syntax sugar. EE was way harder than managing a code base. The syntax patterns are of a finite set of values.

I won’t re-roll encryption libs but there’s a lot of “tooling” packages that just add syntax sugar to parse and cart around that came about in prior eras of sneakerware software that are baked into to development habits that no longer make sense.

I for one am excited about using ML to streamline the code base that comprised my preferred Linux system yet still builds to the usual runtime system. There’s a lot of duplication in code that models can help remove and a few tools can help unpack into machine state. EE brain informs me there’s no “code” in a running system. Just electricity. There’s way too much syntax sugar in the software ecosystem that’s just for parsing/marshaling/transpiling between syntax sugars. Bleh. It’s a big dumb monolith of glyph art that needs to be whacked back like a prairie that needs a controlled fire.

kaba0 3 days ago | parent | prev | next [-]

Well, foreign projects communicating with each other is always grounds for a mess, but this is not an either-or question.

Also, your mileage may vary based on the niche you are working in - in the case of, say, Java, the initial setup of the build system may not be "fun", but it will just work from then on.

RangerScience 3 days ago | parent | prev [-]

I've seen both, although rarely for either.

The worst trash fires were the homebrewed systems, but maybe that's because I could dig in and see how bad they were.

But I'd actually agree with you - as bad as those were, I'd rather them than a shitty 3rd party something. At least I can theoretically do something about the in-house one, and, all the ones I've seen were smaller in scope than any SaaS product.