exoverito 18 hours ago
Freedom of speech is just as much about the freedom to listen. The point isn’t that an LLM has rights. The point is that people have the right to seek information. Censoring LLMs restricts what humans are permitted to learn.
blackqueeriroh an hour ago
You can still learn things. What can you learn from an LLM that you can’t learn from a Google search?
II2II 15 hours ago
Take someone who goes to a doctor asking for advice on how to commit suicide. Even if the doctor supports assisted suicide, they are going to use their discretion on whether or not to provide advice. While a person has a right to seek information, they do not have the right to compel someone to give them information.

The people who have created LLMs with guardrails have decided to use their discretion on which types of information their tools should provide. Whether the end user agrees with those restrictions is not relevant. They should not have the ability to compel the owners of an LLM to remove the guardrails.

(Keep in mind, LLMs are not traditional tools. Unlike a hammer, they are a proxy for speech. Unlike a book, there is only indirect control over what is being said.)
| ||||||||||||||||||||||||||||||||