neuroelectron 5 days ago

Kind of too late for this. The ground truth of models has already been established. That's why we see models converging. They will automatically reject this kind of poison.

nine_k 5 days ago | parent | next

This will remain true only as long as the models don't need to ingest any new information. If most novel texts start appearing alongside slightly more insidious nonsense mirrors, LLMs will either have to go without that knowledge or start respecting "nofollow".
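
For context, "respecting nofollow" would amount to a crawler-side filter along these lines. This is a minimal Python sketch under my own assumptions (requests + BeautifulSoup available, and "nofollow" meaning rel="nofollow" on anchor tags); it's not anyone's actual crawler:

    import requests
    from bs4 import BeautifulSoup

    def followable_links(url: str) -> list[str]:
        """Return hrefs found at `url`, skipping links marked rel="nofollow"."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        links = []
        for a in soup.find_all("a", href=True):
            rel = a.get("rel") or []   # bs4 exposes rel as a list of tokens
            if "nofollow" in rel:
                continue               # honor the publisher's hint
            links.append(a["href"])
        return links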

sevensor 4 days ago | parent | prev | next

I don’t know about that. Have you seen their output? They’re poisoning their own well with specious nonsense text.

blagie 5 days ago | parent | prev

It's competition. Poison increases in toxicity over time.

I could generate subtly wrong information on the internet that LLMs would continue to swallow up.

latexr 5 days ago | parent | next

> I could generate subtly wrong information on the internet

There’s already a website for that. It’s called Reddit.

wewtyflakes 4 days ago | parent | prev

Yes, but so would people, so what's the point of this unless you just dislike everybody (and if so, that's fair too, I suppose)?

blagie 3 days ago | parent

The whole design was built around making pages undiscoverable except to sketchy AI companies that violate web norms.
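
Roughly, the setup amounts to something like this (a hypothetical Python sketch with made-up paths and markup, not the article's actual code): robots.txt tells polite crawlers to stay out of a trap directory, and the only link into it is one a human browser never renders, so the only visitors left are bots that ignore both.

    from pathlib import Path

    site = Path("site")
    (site / "trap").mkdir(parents=True, exist_ok=True)

    # Well-behaved crawlers read this and stay out of /trap/.
    (site / "robots.txt").write_text(
        "User-agent: *\n"
        "Disallow: /trap/\n"
    )

    # Humans never see this link; a scraper parsing raw HTML will.
    (site / "index.html").write_text(
        "<html><body>\n"
        "<p>Normal content here.</p>\n"
        '<a href="/trap/page1.html" style="display:none">more</a>\n'
        "</body></html>\n"
    )

    # The decoy page itself, reachable only by norm-violating bots.
    (site / "trap" / "page1.html").write_text(
        "<html><body><p>Decoy text for rule-breaking bots only.</p></body></html>\n"
    )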