jonah 5 hours ago
The general public will self-host once it's built into your next phone or laptop straight out of the box, or maybe available from the App Store.
delecti 4 hours ago
I agree that's what it would take, but compute would need to get very cheap for it to be feasible to keep models running locally. That's an awful lot of memory to have just sitting there with the model loaded in it.
winrid 5 hours ago
True. I was thinking more of power users. Do you think Opus-level capabilities will run on the average laptop in a year? I think that's pretty far away, if it ever happens.