anonymous908213 | 2 hours ago
What I'm saying is that I believe they do not care about the truth, and intentionally chose to offload their work to LLMs, knowing that LLMs do not produce truth, because it does not matter to them. Is there any indication that this has damaged their career in any way? It seems likely to me that they don't care about the truth because Ars Technica doesn't care about the truth, so long as the disregard isn't blatant enough to cause a PR issue.

> I also feel compelled to point out you've abandoned your claim that the article was generated.

As you've pointed out, neither of us has a crystal ball, and I can't definitively prove the extent of their usage. But why would I have any reason to believe their LLM usage stops at fabricating quotes? I think you are again taking the most charitable position possible on things that I'd put at 98 or 99% likely to be the result of malicious intent. It seems overwhelmingly likely to me that someone who prompts an LLM to source their "facts" would also prompt an LLM to write for them; it makes little sense to be opposed to an LLM writing on your behalf but not to it sourcing stories on your behalf. All the more so if your rationale, as the author, is that the story is unimportant, beneath you, and not worth the time to research.
maxbond | 2 hours ago
> I think you are again engaging in the most charitable position possible, ...

Yeah, that's accurate. I will turn on a dime the moment I receive evidence that this was routine for this author or systemic at Ars. But yes, I'm assuming good faith (especially on Ars' part), and that's generally how I operate. I guess I'm an optimist, and I guess I can't ask you to be one.