dangus 5 days ago
I think you're super wrong about the local model issue, and that's a huge risk for companies like OpenAI. Apple products, for example, have an excellent architecture for local AI: extremely high-bandwidth RAM. If you run an OSS model like gpt-oss on a Mac with 32GB of RAM, the experience is already very similar to the cloud.
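A rough back-of-envelope sketch of why 32GB is plausibly enough: model weights dominate memory use, and at ~4-bit quantization even a ~21B-parameter model (the reported size of gpt-oss-20b) fits with room to spare. The parameter count and bits-per-weight below are illustrative assumptions, not exact figures.

```python
def weight_memory_gib(n_params: float, bits_per_weight: float) -> float:
    """Approximate RAM needed for model weights alone, in GiB."""
    return n_params * bits_per_weight / 8 / (1024 ** 3)

# Assumed: ~21B total params, ~4.25 bits/weight (MXFP4-style quantization).
approx = weight_memory_gib(21e9, 4.25)
print(f"~{approx:.1f} GiB for weights")
```

That leaves plenty of headroom on a 32 GiB machine for the KV cache and the OS, which is why the local experience can feel close to the cloud one.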
chii 4 days ago | parent
I don't have the hardware to run or try them, but judging from the Hugging Face discussion forums, gpt-oss seems to be pretty heavily censored. I wouldn't consider it a viable self-hosted LLM except for the very narrowest of domains (like coding, for example).