barrell 4 days ago

There are also a bunch of us who do kick the tires very often and are consistently underwhelmed.

There are also those of us who have used them substantially, and seen the damage that causes to a codebase in the long run (in part due to the missing gains of having someone who understands the codebase).

There are also those of us who just don’t like the interface of chatting with a robot instead of just solving the problem ourselves.

There are also those of us who find each generation of model substantially worse than the previous generation, and find the utility trending downwards.

There are also those of us who are concerned about the research coming out on the effects of LLM use on the brain and on cognition.

There are also those of us who appreciate craft, and take pride in what we do, and don’t find that same enjoyment/pride in asking LLMs to do it.

There are also those of us who worry about offloading our critical thinking to big corporations, and becoming dependent on a pay-to-play system that is currently being propped up by artificially lowered prices, with “RUG PULL” written all over it.

There are also those of us who are really concerned about the privacy issues, and don’t trust companies that are hundreds of billions of dollars in debt to some of the least trustworthy individuals with that data.

Most of these issues don’t require much experience with the latest generation.

I don’t think the intention of your comment was to stir up FUD, but I feel like it’s really easy for people to walk away with that from this sort of comment, so I just wanted to add my two cents and tell people they really don’t need to be wasting their time every 6 weeks. They’re really not missing anything.

Can you do more than a few weeks ago? Sure? Maybe? But I can also do a lot more than I could a few weeks ago without using an LLM. I’ve learned and improved myself.

Chances are, if you’re not already using an LLM it’s because you don’t like it, or don’t want to, and that’s really ok. If AGI comes out in a few months, all the time you would have invested now would be out of date anyway.

There’s really no rush or need to be tapped in.

bigstrat2003 4 days ago | parent | next [-]

> There are also a bunch of us who do kick the tires very often and are consistently underwhelmed.

Yep, this is me. Every time people are like "it's improved so much" I feel like I'm taking crazy pills as a result. I try it every so often, and more often than not it still has the same exact issues it had back in the GPT-3 days. When the tool hasn't improved (in my opinion, obviously) in several years, why should I be optimistic that it'll reach the heights that advocates say it will?

barrell 4 days ago | parent [-]

haha I have to laugh because I’ve probably said “I feel like I’m taking crazy pills” at least 20 times this week (I spent a day using Cursor with the new GPT and was thoroughly, thoroughly unimpressed).

I’m open to programming with LLMs, and I’m entirely fine with people using them and I’m glad people are happy. But this insistence that progress is so crazy that you have to be tapped in at all times just irks me.

LLMs are like iPhones. You can skip a couple of versions and be fine; when you do upgrade, you get the new version with all the same functionality as everyone who bought one every year.

energy123 4 days ago | parent [-]

> new GPT

Another sign that staying tapped in is needed.

> AI is exceptional for coding! [high-compute scaffold around multiple instances / undisclosed IOI model / AlphaEvolve]

> AI is awesome for coding! [GPT-5 Pro]

> AI is somewhat awesome for coding! ["gpt-5" with verbosity "high" and effort "high"]

> AI is pretty good at coding! [ChatGPT 5 Thinking through a Pro subscription with Juice of 128]

> AI is mediocre at coding! [ChatGPT 5 Thinking through a Plus subscription with a Juice of 64]

> AI sucks at coding! [ChatGPT 5 auto routing]

bluefirebrand 3 days ago | parent [-]

Yeah, frankly if you have the free time to dig through all of that to find the best models or whatever for your use cases, good on you

I have code to write

libraryofbabel 4 days ago | parent | prev [-]

There are really three points mixed up in here.

1) LLMs are controlled by BigCorps who don’t have users’ best interests at heart.

2) I don’t like LLMs and don’t use them because they spoil my feeling of craftsmanship.

3) LLMs can’t be useful to anyone because I “kick the tires” every so often and am underwhelmed. (But what did you actually try? Do tell.)

#1 is obviously true and is a problem, but it’s just capitalism. #2 is a personal choice, you do you etc., but it’s also kinda betting your career on AI failing. You may or may not have a technical niche where you’ll be fine for the next decade, but would you really in good conscience recommend a juniorish web dev take this position? #3 is a rather strong claim because it requires you to claim that a lot of smart reasonable programmers who see benefits from AI use are deluded. (Not everyone who says they get some benefit from AI is a shill or charlatan.)

barrell 4 days ago | parent [-]

How exactly am I betting my career on LLMs failing? The inverse is definitely true — going all in on LLMs feels like betting on the future success of LLMs. However not using LLMs to program today is not betting on anything, except maybe myself, but even that’s a stretch.

After all, I can always pick up LLMs in the future. If a few weeks is long enough for all my priors to become stale, why should I have to start now? Everything I learn will be out of date in a few weeks. Things will only be easier to learn 6, 12, 18 months from now.

Also, nowhere in my post did I say that LLMs can’t be useful to anyone. In fact I said the opposite. If you like LLMs or benefit from them, then you’re probably already using them, in which case I’m not advocating anyone stop. However, there are many segments of people who LLMs are not for. No tool is a panacea. I’m just trying to nip any FUD in the bud.

There are so many demands for our attention in the modern world to stay looped in and up to date on everything; I’m just here saying don’t fret. Do what you enjoy. LLMs will be here in 12 months. And again in 24. And 36. You don’t need to care now.

And yes I mentor several juniors (designers and engineers). I do not let them use LLMs for anything and actively discourage them from using LLMs. That is not what I’m trying to do in this post, but for those whose success I am invested in, who ask me for advice, I quite confidently advise against it. At least for now. But that is a separate matter.

EDIT: My exact words from another comment in this thread prior to your comment:

> I’m open to programming with LLMs, and I’m entirely fine with people using them and I’m glad people are happy.

saltcured 4 days ago | parent [-]

I wonder, what drives this intense FOMO ideation about AI tools as expressed further upthread?

How does someone reconcile a faith that AI tooling is rapidly improving with the contradictory belief that there is some permanent early-adopter benefit?

bluefirebrand 3 days ago | parent [-]

I think the early-adopter-at-all-costs mentality is being driven by marketing and sales, not any rational reason to need to be ahead of the curve

I agree very strongly with the poster above yours: If these tools are so good and so easy to use then I will learn them at that time

Otherwise the idea that they are saving me time is likely just hype and not reality, which matches my experience