dist-epoch 3 hours ago

> Imagine I am putting together electrical infrastructure, and the model gives me bad advice, risking electrocution and/or a serious fire

That's a weird demand to make of models. What next, "Imagine I'm doing brain surgery and the model gives me bad advice", "Imagine I'm a judge delivering a sentencing and the model gives me bad advice", ...
OutOfHere 2 hours ago (parent)
Requesting electrical advice is not a weird ask at all. If writing sophisticated code requires skill, then so does electrical work; neither requires meaningfully more skill than the other. I would expect the top-ranked thinking models to be wholly capable of giving correct advice on the topic. The problems arise more from the user's failure to supply all the applicable context that can affect the decision and the output. All else being equal, bad electrical work is far more likely to result from not adequately consulting AI than from consulting it. Secondly, the original point was about censorship, not accuracy, so let's not get distracted.
| ||||||||