revolvingthrow 3 hours ago

If OpenAI / Anthropic / Google were the only game in town, then yeah, we'd already be paying 5x as much as we do. But local models are so close to SOTA that it just isn't going to happen. If I'm a lawyer being billed $500k/yr against $600k profit, I'd rather buy a chonky server and run a model that's 90% as good, make my money back in 2 years, and after that pay only ~$5k/yr in electricity on that $600k profit.
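The break-even claim above can be sanity-checked with back-of-the-envelope math. All numbers are hypothetical: the $500k/yr bill and ~$5k/yr electricity come from the comment, and the ~$990k server price is just the figure implied by a two-year payback, not a real quote:

```python
def payback_years(server_cost: float, api_cost_per_year: float,
                  electricity_per_year: float) -> float:
    """Years until a one-time server purchase pays for itself
    versus an ongoing API bill."""
    annual_savings = api_cost_per_year - electricity_per_year
    return server_cost / annual_savings

# Hypothetical numbers from the comment: $500k/yr API bill, ~$5k/yr power.
# A ~$990k server would break even in about two years:
print(round(payback_years(990_000, 500_000, 5_000), 1))  # 2.0
```

After the payback point, the running cost drops from $500k/yr to the electricity bill alone, which is the whole argument.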

Nobody will successfully lobby for banning local models either; it just isn't going to happen when the rest of the world will happily avoid paying 80% of their profits to some US bigco for the privilege of existing.

cactusplant7374 31 minutes ago | parent [-]

Could you really build something sophisticated with a local model? Say, a Linux kernel?

realusername 27 minutes ago | parent [-]

I'm using Codex with the Linux kernel and I discard maybe 80% of what it produces. This isn't an area that even the top models have solved.