anonymous908213 an hour ago

I don't believe there is any greater journalistic malpractice than fabrication. Sure, given the low importance of this topic, there are worse instances of such malpractice in the world, but journalists should report the truth on anything they deem important enough to write about. Cutting corners on the truth, of all things, is the greatest dereliction of their duty. It undermines trust in journalism altogether, which in turn undermines our collective society: we no longer work from a shared understanding of reality, because we can no longer trust the people who report on it. I've observed that journalists tend to have unbelievably inflated egos and tout themselves as the fourth estate that upholds all of free society, yet their behaviour does not actually comport with that role and is, in the modern era, actively detrimental.

I also do not believe this was a genuine result of incompetence. I entertained the possibility, but that would be the most charitable view available, and I don't think the benefit of the doubt is earned in this case. They routinely cover LLM stories, and the retracted article was about that very subject, so I have very little reason to believe they are ignorant of LLM hallucinations. If this were a political journalist or the like, I would be more inclined to credit the ignorance defense, but as it is we have every reason to believe they knew what LLMs are and still acted with intention, completely disregarding the duty they owe their readers to report facts.

maxbond an hour ago | parent [-]

> I don't believe there is any greater journalistic malpractice than fabrication. Sure, there are worse cases of such malpractice...

That's more or less what I mean. It was only a few notches above a listicle to begin with. I don't think they intended to fabricate quotes. I think they didn't take the necessary time because it's a low-stakes, low-quality article with a short shelf life, so it's only valuable if published quickly.

> I also do not believe this was a genuine result of incompetence.

So your hypothesis is that they intentionally made up quotes that were fairly obviously going to be immediately spotted and damage their career? I don't think you actually believe that, but I don't understand what alternative you're proposing.

I also feel compelled to point out that you've abandoned your claim that the article was generated. I get that you feel passionately about this, and you're right to be passionate about accuracy, but I think that may be leading you into ad hoc argumentation rather than a more rational appraisal of the facts. I think there's a stronger and more coherent argument for your position that you've not taken the time to flesh out. That isn't really a criticism and it isn't my business, but I do think you ought to be aware of it.

I really want to stress that I don't think you're wrong to feel as you do, and the author really did fuck up. I just feel that we, as a community in this thread, are imputing things beyond what is in evidence, and I'm trying to push back on that.

anonymous908213 an hour ago | parent [-]

What I'm saying is that I believe they do not care about the truth, and intentionally chose to offload their work to LLMs, knowing that LLMs do not produce truth, because it does not matter to them. Is there any indication that this has damaged their career in any way? It seems to me that it's likely they do not care about the truth because Ars Technica does not care about the truth, as long as the disregard isn't so blatant that it causes a PR issue.

> I also feel compelled to point out you've abandoned your claim that the article was generated.

As you've pointed out, neither of us has a crystal ball, and I can't definitively prove the extent of their usage. However, why would I have any reason to believe their LLM usage stops at fabricating quotes? I think you are again engaging in the most charitable position possible for things that I think are 98 or 99% likely to be the result of malicious intent. It seems overwhelmingly likely to me that someone who prompts an LLM to source their "facts" would also prompt an LLM to write for them; it doesn't really make sense to be opposed to using an LLM to write on your behalf but not opposed to having it source stories on your behalf. All the more so if your rationale as the author is that the story is unimportant, beneath you, and not worth the time to research.

maxbond an hour ago | parent [-]

> I think you are again engaging in the most charitable position possible, ...

Yeah, that's accurate. I will turn on a dime the moment I receive evidence that this was routine for this author or systemic for Ars. But yes, I'm assuming good faith (especially on Ars' part), and that's generally how I operate. I guess I'm an optimist, and I guess I can't ask you to be one.