mft_ 2 hours ago
I suspect one possible future for local models is extreme specialisation: you load a Python-expert model for Python coding, do your shopping with a model focused on just that task, run your smart home with a model specialised in speech-to-text plus automation, and so on. This makes sense: running a huge model for a task that uses only a small fraction of its ability is wasteful, and home hardware especially isn't suited to that waste. I'd rather have multiple models with deep, narrow ability in particular areas than one model with a general, wide, but shallow and uncertain ability.

Anyway, is it possible that this is what lies behind Gemma 4's "censoring"? As in, Google made a deliberate choice to focus its training on certain domains, and incorporated the censor to prevent it answering about topics it hasn't been trained on? Or maybe they're just being sensibly cautious: asking even the top models for critical health advice is risky; asking a 32B model is probably orders of magnitude more so.
OutOfHere 2 hours ago | parent
> is it possible that this may be what lies behind Gemma 4's "censoring"

Your explanation would make sense if various other rare domains were also censored, but they aren't, so it doesn't.

> asking even the top models for critical health advice is risky

Not asking, and living in ignorance, is riskier. For high-stakes questions, of course I'd want references that only an online model like ChatGPT or Gemini could find. But if I'm asking a local model for health advice, odds are it's because I'm traveling and temporarily offline, or because I'm preparing off-grid infrastructure. In both cases I absolutely need a best-effort answer, and I need the model to be able to tell when it doesn't know the answer.

Set health advice aside for a moment and consider electrical advice. Imagine I'm putting together electrical infrastructure, and the model gives me bad advice, risking electrocution and/or a serious fire. Why is electrical advice not censored, and what makes it not high-stakes? The logic is the same.

For the record, various open-source Asian models don't have this problem, so I'd rather use them.