billypilgrim 11 hours ago

I must say I expected an actual poisoning of the data used to train the LLM and was excited, but the examples indicate that the LLM just searched the web and reported what it found? When you create a website with fake information and search Google for that information, it will of course bring up your site, not because it’s factually correct but because it’s related to what you searched for. What am I missing?

rincebrain 10 hours ago | parent [-]

The point, I think, is that lots of people have historically trusted LLM responses without verification, rather than trying to sort through the dross of Google or Bing search results.

_thisdot 2 hours ago | parent [-]

The problem with this specific instance is that if you asked someone to find out who won this championship without using an LLM, they'd reach the same answer. I'd be much more impressed if someone managed to poison an LLM into answering that the US won the 2023 World Cup.