jayd16 7 hours ago
The situation is different. Those sources are people. This is a calculator AND we have the opportunity to fix it.
pixl97 7 hours ago | parent
Less different than you might expect. The same reasons the things listed above are popular may be the reasons the most popular LLM ends up not being the best. People don't tend to buy good things; they very commonly buy the shiniest ones. An LLM that says "you're right" sure seems a lot shinier than one that says "Mr. Jayd16, what you've just said is one of the most insanely idiotic things I have ever heard... Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul."
casey2 6 hours ago | parent
Political parties, social networks, religions: these are all engineered systems. All of them, including AI, involve people. For starters, nobody is going to do the massive amount of work to train a useless AI that is skeptical and cynical. Imagination and agreeability (which cause hallucinations) are a feature, not a bug, in humans and in LLMs.