ijk 14 hours ago

Public control over AI models is a distinct thing from everyone having access to an AI server (not that national AI would need a 1:1 ratio of servers to people, either).

It's pretty obvious that the play right now is to lock down the AI as much as possible and use that to facilitate control over every system it gets integrated with. Right now there are too many active players to shut out random developers, but there's an ongoing trend of companies backing away from releasing open weight models.

ben_w 14 hours ago | parent

> It's pretty obvious that the play right now is to lock down the AI as much as possible and use that to facilitate control over every system it gets integrated with. Right now there are too many active players to shut out random developers, but there's an ongoing trend of companies backing away from releasing open weight models.

More the opposite, despite the obvious incentive to do as you say in order to have any hope of a return on investment. OpenAI *tried* to make that a trend with GPT-2, on the grounds that it's irresponsible to hand out a power tool when there's no idea what "safety tests" would even mean in that context, but lots of people mocked them for it, and it looks like only they and Anthropic take such risks seriously. Or possibly just Anthropic, depending on how cynical you are about Altman.