| ▲ | dylan604 3 days ago |
| Not targeting you, but the industry in general. In every other industry I've been in outside of software dev, 10 years is not considered elder. You're just now becoming not a greenhorn; you're just now getting your sea legs. It's amazing what additional experience happens after year 10. To that end, Rust (2015) is 9 years old, and Go and Node are 15 years old, while Python (1991) is 33 years old. Just putting things in a different perspective. |
|
| ▲ | shermantanktop 2 days ago | parent | next [-] |
| I’ve been in this game for 30 and I agree with GP. “I won’t build that simple thing from scratch, I’ll just import this thing that does approximately what I want.” We should banish the word “import” in favor of “take a dependency on someone else’s code, including the stability of the API, the support model, willingness to take patches, testing philosophy…” Reputation is a rough proxy; inspecting the code can help. But when the thing you built your house of cards on falls over, you often can’t fix your house, and have to build a new house. Obviously this applies more to utility code than it does to entire languages. But even there, Apple has broken their Swift syntax enough to release tools that upgrade your code for you…and that’s the best case scenario. |
|
| ▲ | hnlmorg 2 days ago | parent | prev | next [-] |
| I’ve been in the industry for > 20 years and if anything, I think most people are too scared or lazy to reinvent code. I’m not suggesting the earlier argument about NIH (not invented here) syndrome doesn’t exist. But I’ve certainly never seen it on the scale that the earlier poster claimed. If anything, I see people getting less inclined to reinvent things because there’s so much code already out there. But maybe this is a domain-specific problem? There does seem to be a new JavaScript frontend framework released every week. |
| |
| ▲ | YZF 2 days ago | parent [-] | | I've been in the industry for >30 years ;) I'm not sure what the proposal is:
- Don't use an OS. Write your own. Linux? Boring.
- Design your own CPU.
- ext3 or xfs? Nah, write your own.
- Write your own database.
- Ethernet? Too boring. Create a new physical layer for networking.
- Naturally, create your own programming language. That'll make it much easier to find people when you have to expand the team.
Seriously, build vs. buy and NIH have always been with us. There's a time to build and there's a time to buy/reuse. Where are you adding value? What's good enough for your project? What's strategic vs. tactical? How easy is it to change your mind down the road? What are the technical capabilities of the team? How do different options impact your schedule/costs? How do they impact quality? In the short term? In the medium term? In the long term? | | |
| ▲ | ATMLOTTOBEER 2 days ago | parent | next [-] | | I’ve been in software for over 40 years (yes, I’m that old), and in my humble opinion it’s always correct to build. It keeps things fresh. | | |
| ▲ | YZF 2 days ago | parent [-] | | The reality is there is no way to build everything. You want to do scientific computing: do you use libraries that have been optimized for 50 years, or do you write your own? You want to do cryptography: do you build your own? Pretty much everyone working on LLMs today is leveraging things like NCCL, CUDA, PyTorch, and job scheduling frameworks. Let's face it: nobody builds everything from scratch. The closest is companies like Google, who due to sheer scale benefit from building everything from hardware to languages, and even for them it's not always clear whether that was the right thing for the business or something they could afford to do because they had lots of money. Build the things that add value. Don't rebuild something that already just works. That's why we have the old saying "don't reinvent the wheel." If you have a working wheel, while re-inventing it might be fun, it's usually not the best use of time. In the time you've saved, build cool things that add value. | | |
| ▲ | tehjoker 2 days ago | parent [-] | | gotta say, having written some scientific computing code, the libraries out there do not always cover the exact operation you need and are not always using the best algo in the literature. i was able to beat the existing ecosystem 6x head to head and thousands of times faster for my use case. ymmv ofc depending on the problem. that said, it was not easy!! | | |
| ▲ | YZF a day ago | parent [-] | | I worked on some proprietary video/image encoding application. In that context we hand wrote things like colour space conversions, wavelet transforms, arithmetic coders, compression algorithms, even hashing functions, in SIMD and we got better performance than anything off the shelf. We still used some off the shelf code where we could (e.g. Intel's hand written libraries). The thing is that this was the core of our business and our success depended on how performant these pieces were. That was also some time back, maybe today the situation is different. In this sort of situation you should absolutely put in the effort. But that typically accounts for some small % of the overall software you're going to be a user of. This is really just another variation of the premature optimization statement: "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%". So if you're in the 3% then by all means go for it (you gotta). But if you're in the 97% it's silly to not use what's out there (other than for fun, learning etc.) |
|
|
| |
| ▲ | spookie 2 days ago | parent | prev | next [-] | | Let's be honest, nobody is saying to rebuild the world from scratch. The stance for in-house built tools and software is a much more balanced act than that. One that prioritises self-reliance, and fosters institutional knowledge while assessing the risks of making that one more thing in-house. It promotes a culture where employees stay, because they know they might be able to create great impact. It also has the potential to cut down the fat of a lot of money being spent on third parties. Let's be real, most companies have built Empire State Buildings out of cards. Their devs spend most of their time fixing obtuse problems they don't understand, and I'm not talking about programming, but in their build processes and dependency hell. It's no wonder that the giants of today, who have survived multiple crises, are the ones who took the risk of listening to those "novice" enthusiastic engineers. Don't kill the enthusiasm, tame it. | | |
| ▲ | YZF 2 days ago | parent [-] | | Sure. We should harness enthusiasm and channel it in the right direction. I'm not sure I agree the giants of today are built on the work of enthusiastic novices. Amazon and Microsoft have always had a ton of senior talent. Meta started with novices but then a lot got reworked by more experienced people. You might get by with sheer enthusiasm and no experience but often that leads you to failure. | | |
| |
| ▲ | ConspiracyFact 2 days ago | parent | prev | next [-] | | This is a bit disingenuous, don't you think? There's technically a spectrum from low-level to high-level code, but in practice it's not too difficult to set a limit on how far down the stack you're willing to go while building. Writing a new testing framework is qualitatively different from writing a new OS or filesystem, and you know it just as well as everyone else does. | | |
| ▲ | YZF 2 days ago | parent [-] | | I was trying to make a point... Apparently not very well ;) But let's take on the testing framework question. When I work in Go or in Python I use the testing frameworks that are part of that ecosystem. When I worked with C++ I'd use the Boost testing framework, or Google's open-source testing framework. Engineers I work with who do front-end development use Playwright (and I'm sure some other existing framework for unit tests). Can't you do your own thing? Sure. You'd be solving a lot of problems that other people have already solved, and over time the thing you did on your own will require more work. You need to weigh all of that vs. using something off the shelf. 9 times out of 10, most people should use tooling that exists and focus on the actual product they want to build. That said, I work for a large company where we build a lot of additional in-house tooling. It makes sense because: a) it's a specialized domain - there's nothing off the shelf that addresses our specific needs; b) we are very large, so building custom tools is an investment that can be justified over the entire engineering team. We still use a ton of stuff off the shelf. I don't see the phenomenon the parent was describing. I think most of the time people choose to use existing bits for good reasons. When I started my career you pretty much had to do most stuff yourself. These days you can almost always find a good off-the-shelf option for most "standard" things you're trying to do. If you want to write your own testing framework for fun, go for it. If you're trying to get something else done (for business or other reasons) it's not something that usually makes sense. That said, it's not like we have a shortage of people trying to do new things or revisit old things, for fun or profit. We have more than ever (simply because we have more people than ever doing software in general). |
| |
| ▲ | yellowapple 2 days ago | parent | prev [-] | | "If you wish to make an apple pie from scratch, you must first invent the universe." |
|
|
|
| ▲ | Cthulhu_ 3 days ago | parent | prev | next [-] |
| It just feels different in software development because things have moved very fast, I'd say especially since GitHub rose to prominence. The number of software developers on the market has also increased exponentially since then, so the number of (relatively) junior developers is much higher than the number with 15 or 20+ years of experience. |
| |
| ▲ | jltsiren 3 days ago | parent [-] | | The number of software developers has maybe doubled in the last 20 years. The number of senior developers has "always" been low, because the field suffers from unusually high attrition. Many people find that software is not for them, many switch fields after losing their jobs in an economic downturn, some move to management, and some make too much money to bother continuing until retirement age. | | |
| ▲ | samatman 2 days ago | parent | next [-] | | I'm reasonably sure that this estimate is far off the mark. The numbers I've seen suggest that the number of new software developers entering the industry has doubled every five years since at least the mid 90s. That's not the same metric as total number of developers, but it may as well be, and it definitely doesn't add up to a mere doubling of the total in twenty years. | |
| ▲ | mrkeen 3 days ago | parent | prev [-] | | Has there actually been attrition? Exponential growth is enough to explain "many more juniors than seniors" at any time in the past, present or future. Also for attrition to be the cause, you'd need a lot more seniors dropping off than juniors. | | |
| ▲ | Retric 3 days ago | parent | next [-] | | None of my friends who graduated with me are still software developers and I’m several years from retirement age. There’s a bunch of filters. Many people quickly realize they don’t enjoy development; next is openings in management. One of the big ones is at ~40 you’re debt free, have a sizable nest egg, and start wondering if you really want to do this for the next 20 years. A part of this is the job just keeps getting easier over time. Good developers like a challenge, but realize that the best code is boring. Tooling is just more robust when you’re doing exactly the same things as everyone else using it, and people can more easily debug and maintain straightforward code. So a project that might seem crazy difficult at 30 starts to just feel like a slog through well-worn ground. Having significant experience in something also becomes a trap, as you get rewarded for staying in that bubble until eventually the industry moves on to something else. | |
| ▲ | lifeisstillgood 2 days ago | parent | next [-] | | I recently hit thirty years of professional software development, in companies large and small, profit and non-profit, proprietary and FOSS. I have led teams of forty, sat in a corner as the only developer, and one thing I know in my bones: I love making software, and money just means I get to code what I want instead of what The Man wants. In fact I already have my retirement planned - a small flat in Argostolli, a walk down to the coffee bars on the harbour, and a few hours adding code and documentation to a FOSS project of my choice before heading to the beach with grandkids. Now, affording retirement might be interesting, but a retirement without coding would be like one without reading and writing. | |
| ▲ | oblio 2 days ago | parent | prev [-] | | You're probably from a privileged environment such as working in the US (probably in a top location) and probably from a top university or you were there at the right time to join a top company as it grew rapidly. The first paragraph probably applies to 1-10% of developers worldwide... | | |
| ▲ | Retric 2 days ago | parent | next [-] | | The only part of that that applies to my friends is living in the US. Programming pays well relative to the local area just about anywhere, even if the absolute numbers are less extreme. I also don’t mean early retirement. Still, combining minimal schooling, high demand, reasonable pay, and the basic financial literacy that comes from working with complex systems adds up over time. |
| ▲ | 2 days ago | parent | prev | next [-] | | [deleted] | |
| ▲ | throwaway2037 2 days ago | parent | prev [-] | | Yeah, there is no way that the majority of EU-based developers can retire at 45 on their meagre salaries. |
|
| |
| ▲ | jltsiren 3 days ago | parent | prev [-] | | The exponential growth has been something like 3-4%/year, or 2x in 20 years. Though it's hard to find useful statistics that take different job titles and the changing nature of the industry properly into account. If you had asked me in 2010, I would have said that the median software developer lasts 5-10 years in the industry. A lot of people left the field after the dot-com bubble burst. The same happened again on a smaller scale in the late 2000s, at least in countries where the financial crisis was a real-world event (and not just something you heard about in the news). But now there has been ~15 years of sustained growth, and the ratio of mid-career developers to juniors may even be higher than usual. |
|
|
|
|
| ▲ | WalterBright 2 days ago | parent | prev | next [-] |
| It often takes 10 years or more of use before you discover that your technique is execrable. For example, most use of macros, especially when you invent your own language implemented as macros. |
| |
| ▲ | WalterBright 2 days ago | parent [-] | | A lot of people disagree with me on that. Wait till they try to get other people to understand their code :-/ Me, I've eliminated nearly all the use of macros from my old C code that is still around. The code is better. I suspect that heavy use of macros is why Lisp has never become mainstream. | | |
| ▲ | kazinator 2 days ago | parent [-] | | Heavy use of macros could be why C went mainstream. Macros gave C efficient inline functions without anything having to be done in the compiler. Doing things like "#define velocity(p) (p)->velocity" would instantly give a rudimentary C compiler with no inline functions a performance advantage over a crappy Pascal or Modula compiler with no inline functions, while keeping the code abstract. And of course #if and #ifdef greatly help with situations where C does not live up to its poorly deserved portability reputation. In languages without #ifdef, you would have to clone an entire source file and write it differently for another platform, which would cause a proliferation due to minor differences (e.g. among Unixes). Ah, speaking of which: C's #ifdef allowed everyone to have their own incompatible flavor of Unix with its own different APIs and header files, yet get the same programs working. An operating system based on a language without preprocessing would have hopelessly fragmented if treated the same way, or else stagnated due to discouraging local development. Thanks in part to macros, Lisp people were similarly able to use each other's code (or at least ideas) in spite of working on different dialects at different sites. | |
| ▲ | WalterBright 2 days ago | parent [-] | | You're quite right in that early C was a primitive compiler, and adding a macro processor was a cheap and easy way to add power. Using the macro preprocessor to work around some fundamental issues with the language is not what I meant. I meant devising one's own language using macros. The canonical example: #define BEGIN {
#define END }
We laugh about that today, but in the '80s people actually did that. Today's macros are often just more complicated attempts at the same thing. The tales I hear about Lisp are that a team's code is not portable to another team, because they each invent their own macro language in order to be able to use Lisp at all. | |
| ▲ | lor_louis a day ago | parent | next [-] | | To be fair, I'd rather type BEGIN instead of <<? Or whatever the trigraph is supposed to be. We tend to forget that a lot of computers didn't have the keys to type "mathematical" symbols. | | |
| ▲ | WalterBright a day ago | parent [-] | | EBCDIC was already dead in the 1980s. Nobody ever used the trigraphs except for one company that hung on for years until even the C++ community decided enough was enough and dumped them. Besides, what people wrote was: #define BEGIN {
not: #define BEGIN <<?
|
| |
| ▲ | kazinator a day ago | parent | prev [-] | | Stephen Bourne used such macros in the Bourne Shell sources to make the code resemble Algol. The source is very clear and readable. | | |
| ▲ | WalterBright a day ago | parent [-] | | Have you adopted his macros in your own projects? | | |
| ▲ | kazinator a day ago | parent [-] | | No, because even if I could identify a benefit to these macros (which I can't in the contexts in which I work) there's a cost to using them. Macros which simply transliterate tokens to other tokens without performing a code transformation do not have a compelling technical benefit. Only a non-technical benefit to a peculiar minority of users. In terms of cost, the readability and writeability are fine. What's not fine is that the macros will confuse tooling which processes C code without necessarily expanding it through the preprocessor. Tooling like text-editing modes, identifier cross-referencers and whatnot. I've used C macros to extend a language with constructs like exception handling. These have a syntax that harmonizes with the language, making them compatible with all the tooling I use. There's a benefit because the macro expansions are too verbose and detailed to correctly repeat by hand, not to mention to correctly update if the implementation is adjusted. |
|
|
|
|
|
|
|
| ▲ | jrk 3 days ago | parent | prev | next [-] |
| Rust was started in 2006 and launched publicly, I believe, in 2009, the same year as Go. The point stands that these are still fairly new, but Rust is not nearly as new as that. |
| |
| ▲ | ChrisSD 3 days ago | parent | next [-] | | Rust 1.0 was released in 2015, making it almost ten years old. Rust, unlike Go, was largely developed in public. It also changed significantly between its initial design and 1.0, so it feels like "cheating" to count pre-release versions. Still, a decade is a significant milestone. | |
| ▲ | ternaryoperator 2 days ago | parent [-] | | That's right. One of the knocks on those early versions was that every new release broke previous code in significant ways. Which is one reason that v. 1.0 was so important to the community. They could finally commit code using a stable language. |
| |
| ▲ | cmrdporcupine 3 days ago | parent | prev [-] | | Early Rust was a very different beast. But you could say the same about Python pre-1995 or so. My biggest problems with Rust, though, are Cargo and Crates.io, not the language. | |
| ▲ | dhosek 2 days ago | parent | next [-] | | Weird, cargo and crates.io is why I ended up deciding on Rust for developing finl rather than C++. The lack of standardized build/dependency management in C++ was a major pain point. | |
| ▲ | anacrolix 3 days ago | parent | prev | next [-] | | crates and cargo are better than Rust actual | | | |
| ▲ | gary_0 2 days ago | parent | prev [-] | | Cargo and crates.io are something C/C++ developers would kill for (which is why cargo is what it is, I think). | | |
| ▲ | cmrdporcupine 2 days ago | parent [-] | | Just because there is an absolute shitshow for C/C++ build systems doesn't automatically make Cargo & Crates.io good. There is a fundamental philosophical disagreement I have with the NPM style of package management and this method of handling dependencies. Like NPM, Crates.io is a chaotic wasteland, destined for a world of security & license problems and transitive dependency bloat. But honestly I'm sick of having this out on this forum. You're welcome to your opinion. After 25 years of working, with various styles of build and dependency management: I have mine. | | |
| ▲ | gary_0 2 days ago | parent [-] | | I wasn't disagreeing with you. My comment was implying that cargo (and arguably rust itself to some extent) was kind of a knee-jerk response to the insane parts of C/C++, for better and also for worse. | | |
| ▲ | cmrdporcupine 2 days ago | parent [-] | | Ok that's fair, sorry to be defensive. But I actually think it's more inspired by people coming from the NodeJS ecosystem than people coming from C++. |
|
|
|
|
|
|
| ▲ | roenxi 2 days ago | parent | prev [-] |
| Well "elders" are the people who have been in the industry the longest, so if the industry has >30-year veterans wandering around, then the elders will have around that much experience. But learning in an industry is generally logarithmic: most of the lessons get picked up in the first 1-3 years, and after that there are only occasional new things to pick up. If anything, software seems unusually favourable to experience, because the first 5 years of learning how to think like a computer are so punishing. |