erentz 5 hours ago

Seems AI has made it cheap to produce information but now you have to spend more time parsing the information. And it’s now the less competent/useful people spending less time producing more information with the more useful people spending more of their valuable time parsing that information. This is why I’m skeptical of LLMs ever becoming a net benefit in most organizations.

anonymars an hour ago | parent | next [-]

Intellectual denial of service

scruple 3 hours ago | parent | prev | next [-]

LLMs are Brandolini's Law taken to an entirely different plane of existence.

an hour ago | parent [-]
[deleted]
jimbokun 2 hours ago | parent | prev | next [-]

Calling it “information” is generous.

trollbridge 4 hours ago | parent | prev [-]

Well, you can use LLMs to parse LLM-generated slop. They make nice summaries. I have taken this approach with people who send me obviously LLM-generated text: I simply run it through an LLM, paste the summary back, ask them "Is this an accurate summary?", and then ask them for their original prompt.

dodu_ an hour ago | parent | next [-]

Ah yes, take my single sentence, blow it up to 3 paragraphs with LLMs, and then the person reading it can have an LLM summarize it in a single sentence.

What the fuck are we even doing anymore?

ua709 a minute ago | parent [-]

I wonder if that even works. Kinda like when kids play telephone I think it’s unlikely the input and output sentences actually match.

stoorafa 2 hours ago | parent | prev | next [-]

LLMs are great at decompression [1]

[1] https://jabde.com/2026/02/02/utilizing-llms-as-a-data-decomp...

Sgt_Apone 3 hours ago | parent | prev | next [-]

Might as well donate money to the AI companies at this point.

erentz 3 hours ago | parent | prev [-]

But even this just produces more information, and requires more work both from you and from the original sender.