InsideOutSanta 5 hours ago

I'm scared that this type of thing is going to do to science journals what AI-generated bug reports is doing to bug bounties. We're truly living in a post-scarcity society now, except that the thing we have an abundance of is garbage, and it's drowning out everything of value.

willturman 4 hours ago | parent | next [-]

In a corollary to Sturgeon's Law, I'd propose Altman's Law: "In the Age of AI, 99.999...% of everything is crap"

SimianSci 4 hours ago | parent [-]

Altman's Law: 99% of all content is slop

I can get behind this. This assumes a tool will need to be made to help determine the 1% that isn't slop. At which point I assume we will have reinvented web search once more.

Has anyone looked at reviving PageRank?
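For anyone who hasn't seen it in a while, the core of PageRank is just a power iteration over the link graph. This is a minimal sketch, not a production implementation; the toy graph, the iteration count, and the damping factor d=0.85 are illustrative assumptions:

```python
def pagerank(links, d=0.85, iters=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iters):
        # every page gets a baseline (1 - d) / n from random jumps
        new = {p: (1.0 - d) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # a page splits d * its rank evenly among its outlinks
                share = d * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # dangling page: spread its rank evenly over everyone
                for q in pages:
                    new[q] += d * rank[p] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

Pages with more (and better-ranked) inbound links accumulate more rank; the ranks always sum to 1.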

Imustaskforhelp 3 hours ago | parent [-]

I mean Kagi is probably the PageRank revival we are talking about.

I have heard from people here that Kagi can help remove slop from searches so I guess yeah.

Although I guess I'm a DDG user, and I love using DDG because it's free, but I can see how for some people price is a non-issue and they might like Kagi more.

So Kagi / DDG (Duckduckgo) yeah.

jll29 3 hours ago | parent [-]

Has anyone kept an eye on who uses which back-end?

DDG used to be meta-search on top of Yahoo, which doesn't exist anymore. What do Gabriel and co-workers use now?

selectodude 3 hours ago | parent | next [-]

I think they all use Bing now.

direwolf20 37 minutes ago | parent | prev [-]

Kagi is mostly taking results from Google and disenshittifying them, but it mixes in other engines like Yandex, Mojeek, and Bing.

DDG is Bing.

techblueberry 5 hours ago | parent | prev | next [-]

There's this thing where all the thought leaders in software engineering ask "What will change about building a business when code is free?" and while there are some cool things, I've also thought it could have some pretty serious negative externalities. I think this question is going to become big everywhere - business, science, etc. - which is: OK, you have all this stuff, but is it valuable? Which of it actually takes away value?

SequoiaHope 3 hours ago | parent [-]

To be fair, the question “what will change” does not presume the changes will be positive. I think it’s the right question to ask, because change is coming whether we like it or not. While we do have agency, there are large forces at play which impact how certain things will play out.

jplusequalt 5 hours ago | parent | prev | next [-]

Digital pollution.

jcranmer 5 hours ago | parent | prev | next [-]

The first casualty of LLMs was the slush pile--the unsolicited submission pile for publishers. We've since seen bug bounty programs and open source repositories buckle under the load of AI-generated contributions. And all of these have the same underlying issue: the LLM makes it easy to produce things that don't immediately look like garbage, which makes the volume of submissions skyrocket while the time-to-reject also goes up, because each one passes the first (but only the first) absolute-garbage filter.

storystarling 3 hours ago | parent [-]

I run a small print-on-demand platform and this is exactly what we're seeing. The submissions used to be easy to filter with basic heuristics or cheap classifiers, but now the grammar and structure are technically perfect. The problem is that running a stronger model to detect the semantic drift or hallucinations costs more than the potential margin on the book. We're pretty much back to manual review which destroys the unit economics.

direwolf20 36 minutes ago | parent | next [-]

If it's print-on-demand, why does it matter? Why shouldn't you accept someone's money to print slop for them?

lupire an hour ago | parent | prev [-]

Why would detecting AI be more expensive than creating it?

jll29 3 hours ago | parent | prev [-]

Soon, poor people will talk to an LLM, while rich people will get human medical care.

Spivak 3 hours ago | parent [-]

I mean I'm currently getting "expensive" medical care and the doctors are still all using AI scribes. I wouldn't assume there would be a gap in anything other than perception. I imagine doctors that cater to the fuck you rich will just put more effort into hiding it.

No one, at all levels, wants to do notes.

golem14 33 minutes ago | parent [-]

My experience has been that the transcriptions are way more detailed and correct when doctors use these scribes.

You could argue that not writing down everything provides a greater signal-to-noise ratio. Fair enough, but if something seemingly inconsequential goes unnoted and something is missed, that could worsen medical care.

I'm not sure how this affects malpractice claims - it's now easier to prove (with notes) that the doc "knew" about some detail that would otherwise not have been noted down.