Skyy93 3 hours ago

This article makes no real sense to me.

>You think of something new and express it - through a prompt, through code, through a product - it enters the system. Your novel idea becomes training data. The sheer act of thinking outside the box makes the box bigger.

This was the same before: if you had a novel idea and made a product out of it, others followed. Especially with LLMs, which (so far) are not learning on the fly: Claude Opus 4.6's knowledge cutoff was August 2025, so every idea you type in after that date may end up in future training data but isn't available to the model yet. You only have to be fast enough, and LLMs/AI agents like Claude give you exactly the speed you need to bring out something new.

The next thing is that we also have open-source and open-weight models that anyone with a decent consumer GPU can fine-tune and adapt, so it's not only in the hands of a few companies.

>We will again build and innovate in private, hide, not share knowledge, mistakes, ideas.

Why should this happen? The moment you make your idea public, anyone can build it. This leads to greater proliferation than before, when the artificial barrier of having to learn to code prevented people from getting what they wanted or creating what they envisioned.

RajT88 3 hours ago | parent | next [-]

> This was the same before: if you had a novel idea and made a product out of it, others followed.

You've almost captured the full picture of it.

If you have a great idea, it won't be self-evidently great until you've proved it can make money. That's the hard part, and it comes at great personal, professional, and financial risk.

Algorithms are cheap. Sure, they could use your LLM history to figure out what you did, or the LLM could just reason it out, and that might save them some work.

But again - the hard part is not cloning the product, it's stealing your customers. People don't seem to be focused on the hard parts.

hrimfaxi 2 hours ago | parent | next [-]

> But again - the hard part is not cloning the product, it's stealing your customers. People don't seem to be focused on the hard parts.

Big companies seem to be bad at innovating but really, really good at enterprise sales.

zar1048576 25 minutes ago | parent [-]

I don't know if that's necessarily true. A big part of enterprise sales involves building a comprehensive solution that works well within the customer's ecosystem. Start-ups tend to build point products, which have value but are still missing functionality (even if that functionality isn't scintillating) that customers need in order to easily deploy and maintain solutions. Customers also care about things like vendor stability and the level of available support.

oh_my_goodness 2 hours ago | parent | prev [-]

Yeah, and the big guys can't steal your customers. What a crazy idea.

RajT88 2 hours ago | parent [-]

The point is, they're going to do that anyway if they want to. Owning the LLM platforms only makes it marginally cheaper.

It's not the risk it's being made out to be.

oh_my_goodness an hour ago | parent [-]

Absolutely. The fact that they know your app better than you do, and that they can revoke your ability to develop it at any moment, those are just details. Those things won't change the game at all.

satvikpendem an hour ago | parent [-]

Unless you're using their API (in which case there's always platform risk, same as before), this is not an issue. There are lots of half-assed implementations of ideas by the big companies that smaller companies run circles around; The Innovator's Dilemma was literally written about this.

oh_my_goodness 4 minutes ago | parent [-]

In my opinion Christensen wasn't talking about outsourcing your entire development process to a competitor with much deeper pockets, giving them the ability to turn off your development at will [1], and then running rings around them. I'm sure you're familiar with his story about Dell and Asus.

[1] Unless you're assuming that you maintain control over your technology while outsourcing most of the development thinking to a rented AI?

middayc 2 hours ago | parent | prev [-]

> This was the same before: if you had a novel idea and made a product out of it, others followed. Especially with LLMs, which (so far) are not learning on the fly: Claude Opus 4.6's knowledge cutoff was August 2025, so every idea you type in after that date may end up in future training data but isn't available to the model yet. You only have to be fast enough, and LLMs/AI agents like Claude give you exactly the speed you need to bring out something new.

You have a point about the update intervals and the higher speed they give developers. But you are talking about now, and I was making a thought experiment about a potential future. LLMs are not learning on the fly, but I suspect providers do log the conversations and their responses, and could also deduce from further interaction whether a particular response was satisfactory to the user. In a world where available training data is drying up, nobody is throwing all this away. Gemini even has direct upvote/downvote buttons on responses. Algorithms will probably improve, and the intervals will probably shorten.
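To make the idea concrete, here's a toy sketch of what that kind of logging could look like: record each turn, and when no explicit vote exists, guess satisfaction from whether the user's next prompt rephrases the same request. Every name and the overlap heuristic are invented for illustration; real providers presumably do something far more sophisticated.

```python
import time

def log_turn(log, prompt, response, vote=None):
    """Append one conversation turn; an explicit up/down vote may or may not exist."""
    log.append({"ts": time.time(), "prompt": prompt,
                "response": response, "vote": vote})

def infer_satisfaction(log):
    """Crude heuristic: if the next user turn heavily rephrases the previous
    prompt (3+ shared words), treat the previous response as unsatisfactory."""
    for prev, nxt in zip(log, log[1:]):
        if prev["vote"] is None:
            overlap = (set(prev["prompt"].lower().split())
                       & set(nxt["prompt"].lower().split()))
            prev["vote"] = "down" if len(overlap) >= 3 else "up"
    return log
```

A retry like "no, write the fizzbuzz function in python with comments" right after "write a fizzbuzz function in python" would mark the first response as a "down", while an unrelated follow-up leaves it an "up".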

Given the detailed information that all this back-and-forth generates, I think it's not hard to use similar technology to track underlying trends, collect all the problems associated with them and the whole solution space being discussed, and generate the solution before even the people who thought of it release it. Theoretically :)
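The trend-tracking part of that thought experiment can be sketched in a few lines: aggregate logged prompts and surface the most recurrent problem terms. This is a deliberately naive word-count stand-in (stopword list and all names are made up); a real system would cluster embeddings rather than count words.

```python
from collections import Counter

STOPWORDS = frozenset({"the", "a", "an", "to", "in", "how", "do", "i", "my", "with"})

def trending_problems(prompts, top_n=3):
    """Count recurring non-stopword terms across user prompts to surface
    the most common problem topics (toy stand-in for real trend detection)."""
    counts = Counter(
        word
        for prompt in prompts
        for word in prompt.lower().split()
        if word not in STOPWORDS
    )
    return counts.most_common(top_n)
```

Feed it a pile of prompts like "how to parse csv in python" and the dominant topics float to the top, which is the whole point: the aggregate of everyone's questions reveals what's being built before any of it ships.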

I think open development will become less open. I don't like it, but I think it's already happening. First all the blogs and forums moved to specialized platforms (SO, Discords, ...), and now even some of those are d(r)ying. If people (in extreme cases) don't even read the code they produce, why would they read about the code or discuss the code? It isn't even their concern anymore. And that's without the theoretical fear of the global Borg slurping up everything they write.