The greatest threat to AI adoption is hallucinations [video](youtube.com)
1 point by saltysalt 8 hours ago | 2 comments
leakycap 8 hours ago:
Personally, I hope AI never gets to the point where you can use its output without checking it. Hallucinations are one easy way I can tell when someone is regurgitating AI.
allears 8 hours ago:
So-called "hallucinations" aren't a threat to AI, and they aren't hallucinations. That's a marketing term meant to anthropomorphize a statistical machine; they're a mathematical certainty. AI software is simply picking the most statistically likely words and phrases in response to your prompt, based on the patterns in its training data. There's no agency, no concept of "truth" or "facts."
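To illustrate the "most statistically likely words" point, here's a minimal toy sketch of next-token selection. It is not how any real model is implemented, and the vocabulary and probabilities are invented for the example; the point is only that the selection step ranks likelihood and never consults a notion of truth.

```python
# Toy next-token selection over an invented probability distribution.
# Not real model code; just the ranking/sampling idea from the comment above.
import random

def next_token(probs, greedy=True):
    """Pick the next token from a {token: probability} mapping."""
    if greedy:
        # "Most statistically likely" choice: highest probability wins.
        return max(probs, key=probs.get)
    # Sampling: a plausible-sounding but wrong token can still be drawn.
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# Hypothetical distribution after a prompt like "The capital of Australia is"
probs = {"Canberra": 0.55, "Sydney": 0.30, "Melbourne": 0.10, "Auckland": 0.05}

print(next_token(probs))                # -> "Canberra" (greedy pick)
print(next_token(probs, greedy=False))  # may print "Sydney": likely, not true
```

Nothing in that loop checks facts; it only compares numbers, which is why confidently wrong output falls straight out of the mechanism.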