Remix.run Logo
add-sub-mul-div 2 days ago

1. Your chatbot doesn't have its own internet scale search index.

2. You're being given information that may or may not be coming in part from junk sites. All you've done is give up the agency to look at sources and decide for yourself which ones are legitimate.

n1xis10t 2 days ago | parent | next [-]

As for point one, is that true? I thought ChatGPT and Perplexity had their own indexes.

aunty_helen a day ago | parent | prev | next [-]

I’m quite happy trading off the agency of wading through trash to an LLM. In fact, I would say that’s something they’re pretty good at.

lcnPylGDnU4H9OF 16 hours ago | parent | next [-]

> look at sources and decide ... which ones are legitimate

> I would say that’s something they’re pretty good at.

Lol. Lmao, even.

Seriously, LLMs are famously terrible at this. It's the entire problem behind prompt injection.

https://en.wikipedia.org/wiki/Prompt_injection

They're really good at... ingesting the trash. Yeah, that's pretty much their whole purpose. But understanding it as trash? Not even close. LLMs don't have taste. As another commenter wrote, it's just regurgitating it back.

what a day ago | parent | prev [-]

It’s just regurgitating the same trash to you though.

a day ago | parent | prev [-]
[deleted]