gigatexal | 9 days ago
M3 Max 128GB here, and it's mad impressive. I'm spec'ing out a Mac Studio with 512GB of RAM because I can window-shop and wish, but I think the trend for local LLMs is getting really good. Do we know WHY OpenAI even released them?
diggan | 9 days ago
> Do we know WHY OpenAI even released them?

Regulations, and trying to earn back the goodwill of developers who run local LLMs, goodwill that has been slowly eroding, since it's been a while (GPT-2, 2019) since they last released weights to the public.
Epa095 | 8 days ago
If the new GPT-5 is actually better, then this OSS version isn't really a threat to OpenAI's income stream, but it can be a threat to their competitors.
lavezzi | 8 days ago
> Do we know WHY OpenAI even released them?

Enterprises can now deploy them on AWS and GCP.