cbsmith 4 days ago:
So the idea is that it SHOULD cost OpenAI a trillion dollars to do what you can accomplish with a potato? | ||||||||
giancarlostoro 4 days ago (parent):
No, and I'm not sure how you arrived at that conclusion. The idea is that there are models out there that can run on small amounts of VRAM. If all it costs is charging your phone, as opposed to a subscription to some overvalued AI company, people will choose 'free' first. We have models that can google things now; they only need to know so much when online, and a specific subset when offline.