ageitgey 5 hours ago

> that makes claude code or codex accessible to the average user

That's what they aim Claude Cowork at. Every executive/leader I've shown Claude Cowork to has gone from 'what is AI' to 'vibecoding whole apps' in weeks. Then when Claude is down for an hour, they get visibly angry and don't remember how to do anything pre-Claude :)

I understand the impulse to provide a UI to manage codebases, etc. But my observation is that these people just ask Claude to do whatever it is they need done. Codebase needs managing? They just ask Claude to do it. No idea how to deploy an app? They just ask Claude to do it.

Any app built on top of this stack to 'make it easier' is competing with 'I don't care what's happening, just ask Claude to do it'.

darepublic an hour ago | parent | next [-]

I have seen people just generate large docs with Claude Cowork without themselves scrutinizing them or knowing why/how they're useful. It's just kind of impressive in its volume and well-formattedness. And then they dump it in your lap as being helpful.

nicce 5 hours ago | parent | prev | next [-]

> Then when Claude is down for an hour, they get visibly angry and don't remember how to do anything pre-Claude :)

The drug is scary when everyone is depending on it. I wonder what the future will be like.

puchatek 3 hours ago | parent | next [-]

The future is perpetually dealing with the fallout from all the vibe coding as the pool of people who'd have a shot at fixing it gets smaller and smaller. Shitty will be the new normal.

freetanga 3 hours ago | parent | next [-]

I feel like it will be like going back to the 80s, when PCs became the norm and most programmers and hobbyists could code without the need of a University or a Corporation. Thousands of shareware apps you had to navigate, everyone trying to solve the same problems from different angles...

I do agree quality will be missed, and shadow IT will again be a big issue, like in the late 80s and early 90s.

pjmlp 19 minutes ago | parent | next [-]

Coding on 8 and 16 bit home computers still required some skills that most vibe coders certainly lack.

xienze 32 minutes ago | parent | prev [-]

> most programmers and hobbyists could code without the need of a University or a Corporation.

I don't think so. Back then, the pool of people doing such a thing basically self-selected for intelligent, motivated types who were capable of learning on their own. The new "programmers" "programming" via Claude Code are going to be very different from those hobbyists you're talking about.

ElFitz 3 hours ago | parent | prev [-]

> Shitty will be the new normal.

I’ve heard the same from the best devs I’ve known, and from some who thought themselves the best, long before LLMs were ever a thing.

I’m sure others heard the same when JavaScript and Python became near ubiquitous. When PHP emerged. When C supplanted Fortran and COBOL. When these two took over from Assembly. When punch cards went the way of the dodo.

There’s always someone for whom shitty is becoming the new normal. If that makes it a rule, what do we make of that rule?

StilesCrisis 8 minutes ago | parent | next [-]

> I’m sure others heard the same when JavaScript and Python became near ubiquitous. When PHP emerged.

You heard right! Most JavaScript and PHP in the world _is_ profoundly shitty. It's taken 20 years of intense research to make JavaScript compilers that are almost good enough to mostly optimize away the design foibles of the language.

microtonal 3 hours ago | parent | prev | next [-]

There are different magnitudes of shitty.

Also we went from compilers with an IDE that had a debugger, profiler, built-in help and would fit on a 3.5" disk and would load on machines with 640KiB RAM (Turbo Pascal) to chat apps or password managers that are hundreds of megabytes and regularly gobble up more than a gigabyte of memory because they ship with their own browser.

Something is lost along the way.

sersi 3 hours ago | parent | prev | next [-]

To be fair, with how powerful our computers are, it's a pity that Electron apps like Bitwarden and Spotify are so slow and consume so many resources. I do miss the time when a lot of apps were snappy.

freetanga 3 hours ago | parent | prev | next [-]

Maybe it’s a process. Many of the transitions you mentioned did bring shitty apps (not all of them, the ones replacing tech for tech were mostly ok, the ones democratizing dev did come with a quality drop), but eventually Darwinism will take effect and trim the long tail.

Coding per se is not hard. Proper engineering is. I do hope this change brings a change in focus (people train in algorithms, efficiency, solid development patterns) but I am afraid it won’t be the case.

Saline9515 3 hours ago | parent | prev [-]

"With a punchcard at least, I can verify what the input is! Unlike those new 'transistors' that are so unreliable!"

mercanlIl an hour ago | parent | prev | next [-]

I think there are some pretty good ways to understand it now.

When the electricity goes out, (most) people get similarly upset. No electricity means no internet, and all of a sudden everything that people had planned to do can’t be done until the power returns.

tyre 5 hours ago | parent | prev | next [-]

Same as anything else. It’ll go down sometimes, people will take a break and chat, then it will come back up.

Like Slack or GitHub or AWS or whatever. It’s almost always a net positive to wait vs do it yourself.

jappgar 36 minutes ago | parent | next [-]

What about when you work at Anthropic?

lukan 4 hours ago | parent | prev | next [-]

I think the scenario was more: if everyone really depends on Claude, then we'd better hope nothing critical (medical software, aviation, traffic control, ...) breaks while Claude is offline.

andrewl-hn 2 hours ago | parent | next [-]

At least some of the projects in these industries now specify strict no-AI-use policies in contracts. I participate in a few of these, and it’s becoming a bit of a pain, because all dev tool vendors insist on adding AI features, and if there’s no way to turn them off completely we have to migrate away.

However, the temptation of productivity gains is strong, and a few of the customers are looking into relaxing these rules.

jebus989 2 hours ago | parent | prev [-]

The good thing is we've learned this already from cloud. When one AWS region is degraded we all failover to other regions, and then other cloud providers, right? ...right?

vrganj 4 hours ago | parent | prev [-]

I'm more scared at everyone outsourcing their thinking to a private, for-profit company.

What could possibly go wrong.

dns_snek 3 hours ago | parent [-]

Thinking, yes, but also secrets, access and effective control of important services in every country and company worldwide, centralized in the US (or anywhere else) where the NSA can take the driver's seat at any time. "AI" is the ultimate sleeper agent.

nz an hour ago | parent [-]

I have been saying things to this effect for a few years now, and have literally been laughed at. I feel like that guy that suggested that doctors should wash their hands before operating on patients -- they laughed at him too, before they put him in an asylum. What's going to happen, is that everyone who realizes that these policies are a mistake, is going to quietly retcon their own role in that mistake, while scapegoating everyone that they don't like.

Also, would bet money that the derived data from the meeting-summarizers is being sold to hedge-funds, to give them a bit of an edge.

FridgeSeal 24 minutes ago | parent [-]

> Also, would bet money that the derived data from the meeting-summarizers is being sold to hedge-funds, to give them a bit of an edge.

And if it isn't already, you can bet that they're probably about to start.

All those "difficult-to-program but easy-if-time-consuming-for-a-human" tasks will 1000% be farmed out to models at unprecedented scale.

ValentineC 3 hours ago | parent | prev | next [-]

> The drug is scary when everyone is depending on it. I wonder what the future will be like.

I can't wait for a Hollywood blockbuster that'll pretty much be science non-fiction.

dheera 3 hours ago | parent | prev | next [-]

> wonder what the future will be like

Probably "don't do anything to upset AI companies or you will effectively become a handicapped person"

Not that different from life in China: "don't do anything to upset Tencent and AliPay or you will become an outcast"

Or life in the US if you're a content creator: "don't do anything to upset Meta or Youtube or you will not be able to pay your rent"

The future: ToS basically becomes law, and you will be stripped of your own second brain if you violate it or say anything they deem "sensitive"

oulipo2 3 hours ago | parent | prev | next [-]

Full of security holes

BoredPositron an hour ago | parent | prev | next [-]

The same was said about electricity.

safety1st 5 hours ago | parent | prev | next [-]

Seems far less scary to me than, say, building an electrical grid in a cold climate, where if it fails for a few days people start to die. Oh wait...

jappgar 32 minutes ago | parent | next [-]

which is more likely when they start vibe-coding grid managers

M95D 3 hours ago | parent | prev | next [-]

Why would they die in cold climate? I would expect them to die in hot climate (no AC - heat stroke, no refrigerator - food poisoning), not the cold where they would have wood/gas heating.

coldtea 3 hours ago | parent | prev [-]

It's the same, on steroids.

M95D 3 hours ago | parent | prev [-]

Imagine what happens if computers stop working* and you have to go back to pen and paper for a few days.

* ransomware attack, fire in the server room, database HDD crash, car accident takes out the internet connection, ...

ElFitz 3 hours ago | parent | prev | next [-]

> I understand the impulse to provide a UI to manage codebases, etc. […] 'I don't care what's happening, just ask Claude to do it'.

Reading the first part, I was going to say they don’t even care about whether or not there’s a codebase. It doesn’t matter; it could be all gremlins and hamsters in wheels for all they care, and for all they should care. All that matters is the functionality, the value it gives them.

We’re even getting disposable code now. Entire single-use ephemeral web apps, built on the go to enable, visualise, or simplify a specific thing, then thrown away.

Will it all lead to some trouble? Definitely. So did computers, and so did the internet.

Weird times. Fun times.

rahoulb 3 hours ago | parent [-]

When I quit my day job and started Rails freelancing a big chunk of my work was from companies with "that tech guy" who had built a database in Microsoft Access that was vital to the department's operations. And then either left the company - or the app had started to fall apart under its own weight.

I would get called in to rewrite it using a proper database and documented rules, and to ensure it stayed scalable - and everyone would be happy.

These Access "apps" were abominations from a technical point of view - but they got the job done without having to spend a load of money on off-the-shelf or bespoke software. And the "tech guy" made a valuable contribution to the company. It was only at a certain point that Access started to struggle.

I foresee the exact same thing happening in the near future - except we won't be building the replacement apps ourselves - we'll just know how to give the coding agents well-specified prompts and tell them when they're making a mistake.

mattmanser 3 hours ago | parent [-]

But at least you could basically follow their logic.

I think what a lot of us are concerned about is that the vibe-coded stuff bloats fast. It's so verbose and all over the place that picking it apart will be a huge job, and relying on an AI to pick apart work that an AI already failed to maintain seems like wishful thinking.

It's literally "The AI is failing! Don't worry I'll just use AI to fix the AI!".

rahoulb 3 hours ago | parent | next [-]

The worst I would ever get was "here's our Access database - can you rewrite it". That was utterly useless to me.

What I needed to do was sit with a user (not a manager/the person buying my services) and ask them to show me the different things they did with the software. Then I could write a spec for the actual _feature_ and would only need to look at the existing codebase if they needed data transferring across[1]. I don't see why our new LLM-based future would be any different.

[1] Of course this meant I would leave out edge-cases and/or weird quirks of the system - often this was actually a bonus as they were either no longer relevant or worked that way because that was the only way they knew how to do it

sersi 3 hours ago | parent | prev [-]

Yes, as long as context sizes increase and LLMs improve, at least there's a way out through using AI. But once the progress stops...

Ucalegon 3 hours ago | parent | prev | next [-]

>Every executive/leader I've shown Claude Cowork to has gone from 'what is AI' to 'vibecoding whole apps' in weeks.

Do you, and those executives, own the risks associated with that practice? Are those risks actually indemnified?

It's neat that 'anyone can do anything', but if they don't actually know the risk to the business or to third parties, why is this a good thing, especially in the enterprise, where there are actors explicitly looking for this type of environment to exploit?

ageitgey 2 hours ago | parent | next [-]

These are largely friends and peers, so they ultimately own their own risks. But I'm not saying it is good or bad. I'm just telling you what is happening in the real world. Every senior person I know, whether a high tech exec or a solo coffee bean importer, is vibing to some degree. Some will be more successful than others.

I've been working in tech since the late 90s. This is the biggest and most sudden change in company behavior I've ever seen. The only thing that comes close was the web 1.0 world in the 90s where everything suddenly became websites.

That creates tons of risks and opportunities. Good and bad. Maybe a great time to start a security company. But maybe a terrible time to be a small time web app developer when your clients can get 'good enough' in minutes for dollars on their own.

Ucalegon 2 hours ago | parent [-]

>But I'm not saying it is good or bad.

Wait, you exposed people to a technology, taught them how to use it, then you are not going to own the implications of that action without teaching them about the risks or telling them how they need to ensure they don't shoot themselves in the face or violate their duty of care?

Do you understand what you are saying and the implications of that in the real world relative to the insurance contracts that they have?

Your company is associated with HIPAA, you should have a much higher standard than this.

tclancy 2 hours ago | parent | next [-]

Play the ball, not the man, dude. Hectoring people on the Internet because you're stressed out about something isn't going to magically fix how you feel. Digging into their profile to make it personal is three steps too far.

Ucalegon an hour ago | parent [-]

We are talking about one person's introduction of a technology to people, and the implications of that action within the framework of enterprise governance and risk; it is one and the same. If anything, who a person is has relevancy: their knowledge of the domain, and of the implications their actions have on it, matters. Someone who is ignorant of the implications may get more grace than someone who has the experience to know better. The passive lack of accountability or responsibility relative to that does matter, given the context.

tclancy 3 minutes ago | parent | next [-]

What we are talking about is the conclusion you leapt to from 20 seconds of looking for evidence to suit a conclusion. Nothing in their comment "These are largely friends and peers, so they ultimately own their own risks" insists these are all people working in or on healthcare. Friends could be ... friends? Like the kind outside of work. And if someone is a peer (again, we have to assume the "at work" part), there isn't much you can do to prevent them from doing what they will. Educating them about trigger safety may be the best thing you can do.

foobar10000 39 minutes ago | parent | prev | next [-]

I think the one thing you are not taking into account is that the investors, on average, fundamentally don’t care. Scale arbitrage means that small companies are fundamentally about velocity; if they get sued due to regulations that do not pierce the corporate veil, they just fold. And the ones that did not get sued make money for the VC, and figure out later how to be HIPAA etc. compliant. Basically, what I’ve been seeing over the last 10 years is that VCs do not care about insurance or corporate liability; the sink rate is so high it is irrelevant.

For big corps this is different. But modulo HIPAA, this is why they are gung ho about binding arbitration: they are trying to match velocity to some degree, and mostly failing…

Ucalegon 15 minutes ago | parent [-]

VCs and investors are a massive issue (ironic to say that here), but once you get into contracts with other businesses, it changes things for the business and for the leadership within, who do carry liability when things go wrong, especially when they have made attestations.

dumfries an hour ago | parent | prev | next [-]

You have to understand that people like you, who keep talking about enterprise governance and risk, should be facilitating business users to do these things securely. This should always have been the case, but somehow it has ended up more about restricting than facilitating. Hopefully tools like Claude Code will prove the value-add more easily, changing everything I hate about corp IT.

Ucalegon 42 minutes ago | parent [-]

I appreciate the feeling but this isn't so much driven by principle but by business risk through contract liability or other liability that exists within whatever place you happen to be doing business.

'Adding value' is a very interesting statement and way to judge the worth of something. Adding value to whom? And if that value-add also causes massive harm, how do we reconcile that? Say you build a brand-new app which does all of the things your total addressable market wants, but it also exposes all of the IP of your existing clients; does that mean you will be able to achieve that TAM?

Corp IT does not exist in a vacuum. Understanding the why of that isn't a 'you should just accept this' but more 'how can we make this better and avoid mistakes already made by others'. I will always point to aviation and 'bold text is written in blood' as a great model to understand all of this not as a blocker but, instead, as a building block.

criley2 an hour ago | parent | prev [-]

There is no way to facilitate untrained users in the healthcare space to vibe code real applications touching patient data. There is no magic policy, firewall, or "facilitation technique" which can make vibe coded software reliably meet contractual and regulatory obligations with a high degree of security in the healthcare space.

If you care about data privacy, especially your own protected health information, that sentence should give you a lot of comfort.

In a HIPAA environment, people who are sufficiently trained on how to develop regulated software securely are called "software engineers".

In my opinion, agents will replace the majority of the rest of businesses before they are good enough at agentic engineering to be able to autonomously develop software that safely and reliably can manage PHI without a single mistake.

It goes without saying: never trust your PHI to any company who is vibe coding in production.

infecto 33 minutes ago | parent [-]

You guys have jumped to so many conclusions it’s amazing.

ageitgey 2 hours ago | parent | prev [-]

You are assuming like 12 things that aren't true in this response.

Ucalegon 2 hours ago | parent [-]

Explicitly name them then.

baxtr 2 hours ago | parent | prev | next [-]

What kind of risk do you see?

Ucalegon 2 hours ago | parent [-]

Depends on what types of apps are being built, what data they touch, and what those apps are exposed to from a network perspective, i.e. all of the fundamentals of information/network security. Generally speaking, most executives do not have an information/network security background but do have privileged access to extremely valuable information, even if an attacker just has access to their email.

ninjagoo an hour ago | parent [-]

> most executives do not have an information/network security background but do have privileged access to extremely valuable information, even if an attacker just has access to their email.

In a properly structured organization, of which there are many, and which are required to be so by regulations and/or best practices, senior executives tend to have need/role-based access to information, just like everyone else in the organization. So they may have access to strategic business information, but not patient records or payroll. They may have access to planning data, but not the financial records of individuals or clients. Etc., etc.

Smaller or newer orgs may not have this compartmentalization, but in general I think the principle holds true for orgs over a certain number of folks in size.

Ucalegon 28 minutes ago | parent [-]

I do not disagree with anything you said.

Generally, the 'privileged' information in an executive's inbox is business information and trust relationships, not a specific user's PII/PHI. I was doing a terrible job of trying to impart that even the most benign-seeming access may have major consequences, even if it is not a total compromise of everything, given the massive scope of 'what could happen' with executives vibe-coding applications: something managing their inbox past their EA, or something else trivial-seeming.

infecto 25 minutes ago | parent | prev [-]

I found the Microsoft guy!

morpheuskafka 4 hours ago | parent | prev | next [-]

> Any app built on top of this stack to 'make it easier' is competing with 'I don't care what's happening, just ask Claude to do it'.

To put it another way, the customers of these frontier models are implicitly being competed against by the model itself.

bandrami 2 hours ago | parent | prev [-]

Yeah I'm realizing now how many of you guys work in industries with no data security/protection requirements

senexox an hour ago | parent | next [-]

Exactly. The tools aren't the rate-limiting factor for me. I could automate an entire department right now with Claude, but I can't because of regulations and audits. Basically, it would turn an error-prone manual process into a probabilistic process that Claude would, in the end, handle far more accurately than what we do now. But the process wouldn't be "repeatable" by the letter of the regulation, so it would open the company up to automated regulatory violations and existential fines. The technical issues for me are trivial; the regulations are insurmountable. The bubble is in the TAM. My work is exactly who "Claude for Small Business" would be aiming at, but we can't do anything with these tools because of regulation. That is a huge % of the economy.

bandrami an hour ago | parent [-]

For me the much bigger problem is the data (and God knows what else) going to a third party. But yeah the non-repeatability doesn't pass the DoD audits either.

newsclues 2 hours ago | parent | prev [-]

There are requirements; they just don’t get enforced enough to matter.