l23k4 an hour ago
We had people acting out like this before LLM chatbots; correlation does not necessarily imply causation.
hansmayer an hour ago | parent
We did... but it was a few here and there. LLMs are making it massive, impacting people on a huge scale.
embedding-shape an hour ago | parent
> correlation does not necessarily imply causation

I feel like you're missing what you're replying to. Why are you saying this? The article is about a person who "lost grip on reality"; no one is saying LLMs are turning people into pope-wannabes, as far as I can tell. You're reacting against something no one claimed.
nephihaha an hour ago | parent
This is something new. Delusions were around before, certainly, but an LLM offers round-the-clock potential for psychological conditioning, which would not normally be possible without sustained attention from a group of people.