blitzar · 6 hours ago
> tacit admission that Google is just better at this kind of thing

Yet at the same time Google has the worst offering of all the major players (all starting up out of thin air) in this space.

It doesn't really matter anyway: the LLM is a commodity piece of tech, and the interface is what matters. Apple should focus on building that rather than worry about scraping the entire internet for training data and spending a trillion on GPUs.
lxgr · 3 hours ago | parent
> Yet at the same time google have the worst offering of all the major players (all starting up out of thin air) in this space.

Is that so? Gemini models (including Nano Banana), in my experience, are very good, and are kneecapped only by Google's patronizing guardrails. (They will regularly refuse all kinds of things that GPT and Claude don't bat an eye at, and I can often talk them out of the refusal eventually, which makes no sense at all.)

That's not something Apple necessarily has to replicate in their implementation (although if there's one company I'd trust to go above and beyond on that, it's Apple).