omnicognate 16 hours ago

> Software development jobs must be very diverse if even this anti-vibe-coding guy thinks AI coding definitely makes developers more productive.

As a Professor of English who teaches programming to humanities students, the writer has had an extremely interesting and unusual academic career [1]. He sounds awesome, but I think it's fair to suggest he may not have much experience of large-scale commercial software development, or be particularly well placed to predict what will or will not work in that environment. (Not that he necessarily claims to, but it's implicit in strong predictions about what the "future of programming" will be.)

[1] https://stephenramsay.net/about/

godelski 14 hours ago | parent | next [-]

Hard to say, but backing his claim that he's been programming since the '90s, his CV shows him working on things clearly beyond basic undergraduate skill level since the early 2000s. I'd be willing to bet he has more years under his belt than most HN users. I mean, I'm considered old here, in my mid 30s, and this guy has been programming for most of my life. Though that doesn't explicitly imply experience, or more specifically, experience in what.

That said, I think people really underappreciate how diverse programmers actually are. I started in physics and came over when I went to grad school. While I wouldn't expect a physicist to do super well on leetcode problems, I've seen those same people write incredible code optimized for HPC systems, and they're really good at tracing bottlenecks (a skill that translates from physics really, really well). Hell, the best programmer I've ever met got that way because he was doing his PhD in mechanical engineering. He's practically the leading expert in data streaming for HPC systems, and he gained that skill because he needed more performance for his other work.

There are a lot of different types of programmers out there, but I think it's too easy to assume the field is narrow.

mikewarot 12 hours ago | parent | next [-]

>I'm considered old here, in my mid 30's

I'm 62, and I'm not old yet, you're just a kid. ;-)

Seriously, there are some folks here who started on punch cards and/or paper tape in the 1960s.

wombatpm 8 hours ago | parent | next [-]

I played with punch cards and polystyrene test samples from the Standard Oil Refinery where my father worked in the early '70s, and my first language after BASIC was Fortran 77. Not old either.

freeopinion 9 hours ago | parent | prev | next [-]

30 years ago my coworkers called me Grandpa, so I get it both ways.

godelski 11 hours ago | parent | prev [-]

Thanks. I meant it more in a joking way, poking fun at the community. I know I'm far too young to have earned a gray beard, but I hope to in the next 20-30 years ;-) I've still got a lot to learn till that happens.

Aeolun 11 hours ago | parent [-]

You wish; that gray beard sometimes appears in your late thirties.

godelski 10 hours ago | parent [-]

Maybe. But also, what I thought was a gray beard in my early 20s is very different from what I think a gray beard is now. The number of people I've considered wizards has decreased, and I think this should be true for most people. It's harder to differentiate experts as a novice, but as you get closer, the resolution increases.

jader201 5 hours ago | parent [-]

The more I know, the more I know I don’t know.

popcorncowboy 4 hours ago | parent [-]

...and the more I know you don't know. [On the disappearance of wizards as you age]

godelski 42 minutes ago | parent [-]

Both definitely contribute. But at the same time, the people who stay wizards (and the people you realize are wizards but hadn't previously) only appear more magical than ever.

Some magic tricks are unimpressive once you know how they're done. But that's not true for all of them. Some only become more and more impressive, and can only truly be appreciated by other masters. The best magic tricks don't just impress an audience; they impress an audience of magicians.

pjmlp 20 minutes ago | parent | prev | next [-]

My first home computer was bought in 1986, before that the only electronics at home were Game & Watch handhelds, like Manhole.

I guess I am reaching Gandalf status then. :)

anthk 2 hours ago | parent | prev | next [-]

38 here. If you didn't suffer Win9x's 'stability', then editing X11 config files by hand, getting mad with ALSA/Dmix, and writing new ad-hoc drivers for weird BTTV tuners by reusing old known ones for $WEIRDBRAND, you didn't live.

groovy2shoes 11 minutes ago | parent [-]

The anxiety that I might fry my monitor by setting the wrong scan rate haunts me to this day.

AceJohnny2 13 hours ago | parent | prev | next [-]

> I mean I'm considered old here, in my mid 30's

sigh

bojo 12 hours ago | parent | next [-]

I feel like a grandpa after reading that comment now.

jjgreen 12 hours ago | parent | prev | next [-]

I've got a coat older than that (and in decent nick).

LgWoodenBadger 11 hours ago | parent [-]

I used to tell the “kids” that I worked with that I have a bowling ball older than them.

wombatpm 8 hours ago | parent | next [-]

I was greeted with blank stares by the kids on my team when they wanted to rewrite an existing program from scratch and I said that would work about as well as it did for Netscape. Dang whippersnappers.

anthk 2 hours ago | parent | prev [-]

I own '90s comic books and video games older than most Gen-Z users on HN.

godelski 12 hours ago | parent | prev [-]

But am I wrong? I am joking, but good jokes have an element of truth...

omnicognate 11 hours ago | parent | next [-]

Depends what you mean by "old". If you mean elderly, then obviously you're not. If you mean "past it", it might reassure you to know the average expectant mother is now in her 30s (in the UK). Even if you just mean "grown up", recent research [1] on brain development identifies adolescence as typically extending into the early thirties, with (brain) adulthood running from there to the mid sixties before entering the "early aging" stage.

For my part, I'm a lot older than you and don't consider myself old. Indeed, I think prematurely thinking of yourself as old can be a pretty bad mistake, health-wise.

[1] https://www.nature.com/articles/s41467-025-65974-8

godelski 10 hours ago | parent [-]

FWIW I doubt I'd consider you old were I to know your actual age. I still think I'm quite young

AceJohnny2 9 hours ago | parent [-]

"inside every old person there is a young one wondering what happened."

xupybd 7 hours ago | parent | prev | next [-]

I assume you're on the younger end

godelski 7 hours ago | parent [-]

No need to assume, I already told everyone my age

AceJohnny2 11 hours ago | parent | prev [-]

It'd be interesting to know the median age of HN commenters.

I guess the median age of YCombinator cohorts is <30?

assimpleaspossi 11 hours ago | parent | prev | next [-]

>As a Professor of English who teaches programming to humanities students

That is the strangest thing I've heard today.

jaimie 10 hours ago | parent [-]

The world of the Digital Humanities is a lot of fun (and one I've been a part of, teaching programming to historians and philosophers of science!). It uses computation to provide new types of evidence for historical or rhetorical arguments and data-driven critiques. There's an art to it as well, such as showing evidence for multiple interpretations of a text through the stochasticity of various text extraction models.

From the author's about page:

> I discovered digital humanities (“humanities computing,” as it was then called) while I was a graduate student at the University of Virginia in the mid-nineties. I found the whole thing very exciting, but felt that before I could get on to things like computational text analysis and other kinds of humanistic geekery, I needed to work through a set of thorny philosophical problems. Is there such a thing as “algorithmic” literary criticism? Is there a distinct, humanistic form of visualization that differs from its scientific counterpart? What does it mean to “read” a text with a machine? Computational analysis of the human record seems to imply a different conception of hermeneutics, but what is that new conception?

https://stephenramsay.net/about/

moron4hire 11 hours ago | parent | prev | next [-]

That was such a strange aspect. If you will excuse my tortured analogy of comparing programming to woodworking: there is a lot of talk about hand tools versus power tools, but among people who aren't in a production capacity--not making cabinets or furniture for a living--you see people choosing to exclusively use hand tools because they just enjoy it more. There isn't pressure along the lines of "you must use power tools or else you're in self-denial about their superiority." Well, at least among people who actually practice the hobby; you'll find plenty of armchair woodworkers in the comments section on YouTube. But I digress. For someone who claims to enjoy programming for the sake of programming, it was a very strange statement to make about coding.

I very much enjoy the act of programming, but I'm also a professional software developer. Incidentally, I've almost always worked in fields where subtly wrong answers could get someone hurt or killed. I just can't imagine either giving up my joy in the former case or abdicating my responsibility to understand my code in the latter.

And this is why the woodworking analogy falls down. The scale at which damage can occur due to the decision to use power tools over hand tools is, for most practical purposes, limited to just myself. With computers, we can share our fuck-ups with the whole world.

Kostchei 9 minutes ago | parent | next [-]

So what you're saying is that for production we should use AI, and hand-code for hobby. Got it. Lemme log back into the VPN and set the agents on the Enterprise monorepo /jk

unsungNovelty 10 hours ago | parent | prev [-]

Nicely put. The woodworking analogy does work.

ngc248 2 hours ago | parent | prev [-]

Exactly. I don't think people understand anymore why programming languages even came about. Lots of people don't understand why a natural language is not suitable for programming, and, by extension, for prompting an LLM.
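
As a toy illustration (the list and names are made up, just a sketch): the English instruction "add 1 to the list" has two perfectly reasonable readings, and code forces you to commit to one.

    nums = [1, 2, 3]

    # Reading 1: append the value 1 to the list
    appended = nums + [1]                 # [1, 2, 3, 1]

    # Reading 2: increment every element by 1
    incremented = [n + 1 for n in nums]   # [2, 3, 4]

    # English doesn't say which one you meant; Python makes you choose.
    print(appended, incremented)

The same ambiguity that a compiler refuses to accept is exactly what an LLM will silently guess at.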