cactusplant7374 4 hours ago

Unless you've discovered the secret sauce, LLM comments are very obvious. Even Altman revealed that they focused on coding at the expense of writing.

kube-system 3 hours ago | parent | next [-]

With the current batch of SOTA models, it is not hard to prompt a model to pass the sniff test on social media forums. If you don't believe me, try it.

All you really need to do is give it some guidelines of a style to follow and styles to avoid. There's also a bunch of skills people have already written to accomplish this.

dgellow 4 hours ago | parent | prev | next [-]

The obvious ones are the ones you notice

cactusplant7374 3 hours ago | parent [-]

LLMs are not good at writing. If they were we would have entire libraries of new, amazing literature.

Tanoc 3 hours ago | parent | next [-]

Exactly, they aren't good at creating new material. But many discussions in comment sections are simply regurgitations of existing material, which they are good at rearranging. Genuinely novel discussions in places like this are actually very rare, as most comment sections are simply people who already know informing those who don't. I'm doing that right now, funnily enough.

romanhn 3 hours ago | parent | prev | next [-]

Neither are most humans

mrhottakes 3 hours ago | parent [-]

Agreed, some humans are good writers, and no LLMs are good writers.

dwringer 3 hours ago | parent | prev [-]

This is rather moving the goalposts from "plausibly human comment" to "meaningful literature", I think

cactusplant7374 3 hours ago | parent [-]

No. I'm drawing it out to its logical conclusion.

mr_toad an hour ago | parent [-]

It’s poor logic, a non sequitur. An absurd reduction. By your argument anyone who hasn’t written a great literary work is a poor writer, and would be bad at writing online comments.

LLMs aren’t lacking in the sort of writing skills that make for superficially good content. They know grammar, they know rhetoric, and they know their audience. You can’t tell them from a human on their writing skills. Where they tend to fall down is their logic and reasoning skills, and unfortunately it seems you can’t use that to distinguish them from the average online opinionator either.

cactusplant7374 an hour ago | parent [-]

No, that is a mischaracterization of what I wrote. They are great writers if you enjoy formulaic writing.

carlgreene 3 hours ago | parent | prev | next [-]

I have worked with LLMs for a couple of years at a very non-technical level, and it was not that difficult to give them proper prompting and reference material.

You are likely reading LLM content just about everywhere and have no idea. Obviously there are easy-to-spot things, but the stuff you don't spot is the stuff you don't spot.


potsandpans 4 hours ago | parent | prev [-]

People who fancy themselves good LLM content detectors just end up accusing everything they don't like of being LLM content.

The only thing worse than a slop comment is the people who bitch about it incessantly. I'm convinced it's become a new expression of a mental illness.

bee_rider 3 hours ago | parent | next [-]

The main thing I suspect of being LLM written is the sort of LinkedIn style: very short sentences, overly focused on sort of… making an impact on the user. But that’s also how a certain type of bad human writer writes. So in the end, I’m not sure I know if anything in particular was written by an LLM.

I guess… “that’s not just an AI red flag, it’s generally shit prose” would be how ChatGPT would describe most things nowadays.

transcriptase 3 hours ago | parent [-]

It’s the distilled mediocrity of the statements. Never venturing beyond a 10% margin of what you would get if you sampled the opinions of 1,000 people who underwent jury selection by west coast liberals.

cactusplant7374 3 hours ago | parent | prev [-]

A mere opinion is not mental illness.

potsandpans 3 hours ago | parent [-]

I wasn't suggesting you have a mental illness for having an opinion.

Rather, I'm saying that just as bad as generated content, if not worse, is every thread where the top comment is an accusation and the ensuing witch hunt.

So, no, having an opinion is not a mental illness. Feeling compelled to call it out and discuss it on everything one reads may just be.

fwip 3 hours ago | parent [-]

The threads where the top comment says "this is AI slop" are nearly always about an article that is obvious AI slop.

Threads that aren't - like this one - don't get that comment.

potsandpans 3 hours ago | parent [-]

If you need to tell yourself that in order to cope that's fine with me.

layer8 2 hours ago | parent [-]

I’m thinking that I may actually prefer undetectable AI slop to human comments like that. I do agree with your upthread comments.