MountDoom a day ago

Ubiquity depends less on the AI getting much better than on the computational cost coming down (i.e., better hardware plus software optimizations). Once you can run a ChatGPT-class model locally on every desktop or phone, people will use it even if the accuracy or safety isn't quite there.

Just look at how people are using Grok on Twitter, or how they're pasting ChatGPT output to win online arguments, or how they're trusting Google AI snippets. This is only gonna escalate.

That said, this is probably not the future Sam Altman is talking about. His vision has to justify OpenAI's sky-high valuations, and cheap ubiquity of non-proprietary tech runs counter to that. So his "ubiquity" is some special, qualified ubiquity that is 100% dependent on his company.

beeflet a day ago | parent

>When you can put a ChatGPT-class model locally on every desktop or phone, people will use it even if the accuracy or safety isn't quite there.

Will they though?

>Just look at how people are using Grok on Twitter, or how they're pasting ChatGPT output to win online arguments, or how they're trusting Google AI snippets. This is only gonna escalate.

But will they though?