chii 6 days ago
> They need to be significantly better to shift consumer habits.

I am hoping that a device-local model will eventually be possible (maybe a beefy home setup, plus an app that connects to your home server from mobile devices for use on the go). Currently, hardware restrictions prevent this type of home setup (not to mention that the open-source/free models aren't quite there, and that it's difficult for non-tech users to actually set up). However, I choose to believe the hardware issues will get solved; it's merely a matter of time. The software/model issue, on the other hand, is harder to see solved. I pin my hopes on DeepSeek, but maybe Meta or some other company will surprise me.
dangus 5 days ago | parent
I think you're super wrong about the local-model issue, and that's a huge risk for companies like OpenAI. Apple products, as an example, have an excellent architecture for local AI: extremely high-bandwidth unified memory. If you run an OSS model like gpt-oss on a Mac with 32GB of RAM, it's already very similar to a cloud experience.
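To make that concrete, here is a minimal sketch of what local inference can look like, using the ollama Python client (this assumes the Ollama runtime is installed and running, and that a gpt-oss build has been pulled; the "gpt-oss:20b" tag and the prompt are just illustrative):

    # Minimal local-inference sketch via the ollama Python client.
    # Assumes the local Ollama daemon is running and a gpt-oss build has
    # already been pulled, e.g. with `ollama pull gpt-oss:20b` (tag assumed).
    import ollama

    response = ollama.chat(
        model="gpt-oss:20b",  # assumed model tag; use whatever build you pulled
        messages=[
            {"role": "user",
             "content": "Why does high-bandwidth unified memory help local LLM inference?"}
        ],
    )
    print(response["message"]["content"])  # generation happens entirely on-device

Everything here runs on the local machine and nothing leaves your network, which is exactly the property that makes this a threat to cloud-only providers.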