wrs 3 hours ago

A "mistake" would be a typo in a real citation. A hallucinated citation is evidence of just plain laziness and negligence, which taints the entire submission.

dataflow 2 hours ago | parent [-]

No it is not. Seriously. All you need for this to happen is for your lab partner to ask AI to add a missing citation that they are already familiar with at the last minute before a midnight submission deadline, and for the AI to hallucinate something else, and for them to honestly miss this. It does not even imply any involvement on your part, let alone that either of you were lazy or negligent on the actual research or substance of the paper. The lack of any sympathy or imagination here is astounding.

asdff 2 hours ago | parent | next [-]

There are no deadlines for journal submissions. Even if you felt you were running close to your revisions being due, an email to the editor would probably fix that for you. And what you described is still negligent: not verifying that the garbage-output bot did not, in fact, output garbage.

AnimalMuppet 18 minutes ago | parent [-]

Even more, there are no deadlines for arXiv submissions.

wrs 20 minutes ago | parent | prev | next [-]

You’re confusing the issue here by saying it’s not your fault but your lab partner’s. We’re talking about why your lab partner did something wrong. You can assign blame for that wrong separately.

The citation is part of the substance of the paper. If you YOLOed in a citation without checking it, seems justified to suspect that you may have YOLOed in some data, or some analysis, or maybe even the conclusion.

applfanboysbgon an hour ago | parent | prev | next [-]

Your constructed hypothetical makes it even worse. If there are 2+ people in this scenario who have good intentions, this should especially never happen. When you sign your name on a paper, you are vouching for everything written in it, including the things you didn't personally write. You should absolutely be checking every single reference your co-author included and verifying that it says what your co-author claims it says. This is something you should have been doing completely independent of LLMs existing. You're publishing this publicly, and it may be associated with you and your career for the rest of your life; it is insanely negligent to not even read and verify what your co-author is adding.

bigstrat2003 2 hours ago | parent | prev [-]

The lack of understanding that you are responsible for the content you create, no matter what tools you use, is what's astounding.