furyofantares 4 days ago

> Earlier LLMs used to be a goldmine for company secrets (when it learned documents that shouldn't be on public internet).

Sounds fake. LLMs don't usually memorize things that appear only once in their training set, and I haven't heard of any major incident where a model was accidentally trained on a large amount of non-public data.

I can see how someone could believe it's true, though, since LLMs can easily hallucinate output that looks like leaked company secrets.