webkmsyed 5 hours ago

That’s a fair limitation. Legal and medical advice can directly impact someone’s life or safety, so AI tools must stay within ethical and regulatory boundaries. It’s better for AI to guide people toward professional help than pretend to replace it.
eCa an hour ago

> AI tools must stay within ethical and regulatory boundaries. It’s better for AI to guide people toward professional help than pretend to replace it.

Both of those ships have _sailed_. I am not allowed to read the article, but judging from the title, they have no issues giving _you_ advice, but you can’t use it to give advice to another person.
ryandrake 4 hours ago

Add financial advice to it, too. Really, any advice. Why the fuck are people asking a probabilistic plagiarizing machine for advice on anything? This is total insanity!
jesterson 2 hours ago

> Legal and medical advice can directly impact someone’s life or safety, so AI tools must stay within ethical and regulatory boundaries

Knives can be used to cook food and to stab people. By your logic, should knives be forbidden or restricted as well? If people follow ChatGPT’s advice (or any other stupid source, for that matter), that’s not a ChatGPT issue, it’s a people issue.
hsbauauvhabzb 3 hours ago

You had me right up until ‘AI tools must stay within ethical and regulatory boundaries’. I guarantee you any LLM company that cares about ethics is destined to fail, because none of its peers do.