pmontra 4 hours ago:
It is not LLM-specific. The conclusion of the post states:

> The web was already being poisoned for search and link ranking long before LLMs existed.

But it continues:

> We are now plugging generative models directly into that poisoned pipeline and asking them to reason confidently about "truth" on our behalf.

So it's a shift from trusting Google to trusting the AI, which may or may not be more insidious, depending on each of our individual attitudes.
bambax an hour ago | parent:
It's a shift, but a slightly worse one. Checking and auditing search results is easier and more ingrained; even if many people don't do it, everyone has been hit by spam at some point and knows it exists. LLMs are the same thing, but they carry an air of authority that a web search lacks, at least for now.