cj | 3 hours ago
I think that's a valid stance to take. IMO it's (unfortunately) the public's responsibility to learn the lesson that LLMs shouldn't be trusted without double-checking the source — the same position Wikipedia was in 10 years ago. "Don't use Wikipedia because it has incorrect information" used to be a major concern, but that has faded now that Wikipedia has found its place and people understand how to use it. I think a similar thing will happen with LLMs.

That doesn't take the responsibility away from LLM providers to keep educating people and reducing hallucinations. I like to think of it as equal responsibility between the LLM provider and the user. Like driving a car: the most advanced safety system won't prevent a bad driver from crashing.
dvrp | 3 hours ago | parent
We're also working on crowdsourcing methods, but it's hard because almost everyone involved in the development of this project is a volunteer who either already works for a company or is a startup founder (me)... so it's very tricky to find time. Also, feel free to check out Jwiki (FKA Jikipedia) at https://jmail.world/wiki