emp17344 3 days ago
But it’s actively unhelpful in explaining the phenomenon, as there is no justification for equating LLM and human behavior. It’s just confusing and misleading.
danielmarkbruce 18 hours ago | parent
This is obviously wrong. LLMs are trained on material humans created. Everything they output derives from human input, even if not directly.