nairboon 5 hours ago
High citation counts no longer mean what they used to. I've seen too many highly cited papers with known issues that keep getting referenced, probably because people no longer actually read the sources and just copy-paste the citations.

On my side-project todo list I have an idea for a scientific service that overlays a "trust" network on the citation graph. Papers that uncritically cite work containing well-known issues would get tagged as "potentially tainted", and authors and institutions that accumulate too many such sketchy works would be labeled likewise. Over time this would provide a useful additional signal beyond raw citation counts. You could also look for citation rings and tag those. I think that could be quite useful, but it requires a bit of work.
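To make the idea concrete, here is a minimal sketch of what the "potentially tainted" tagging could look like: taint starts at a curated list of known-flawed papers and propagates through the citation graph to a fixed point. All names (`tainted_papers`, the toy graph) are hypothetical; a real system would also need to distinguish critical from uncritical citations, which this sketch deliberately ignores.

```python
# Hypothetical sketch of the proposed "trust" overlay: a paper that cites
# a known-flawed paper is marked potentially tainted, and taint propagates
# transitively through the citation graph until a fixed point is reached.

def tainted_papers(cites, known_flawed):
    """cites: dict mapping paper -> set of papers it cites.
    known_flawed: set of papers with well-known issues.
    Returns papers tainted purely by (transitively) citing flawed work."""
    tainted = set(known_flawed)
    changed = True
    while changed:  # fixed-point iteration over the citation graph
        changed = False
        for paper, refs in cites.items():
            if paper not in tainted and refs & tainted:
                tainted.add(paper)
                changed = True
    return tainted - known_flawed

# Toy graph: C cites B, B cites the flawed paper A; D cites only clean work.
graph = {"B": {"A"}, "C": {"B"}, "D": {"E"}}
print(sorted(tainted_papers(graph, {"A"})))  # ['B', 'C']
```

The same fixed-point pass could aggregate per-author or per-institution taint scores, which is where the "labeled equally" part of the idea would come in.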
mike_hearn a minute ago
I explored this question a bit a few years ago, when GPT-3 was brand new. It's tempting to look for technological solutions to social problems. This was during COVID, so public health papers were the focus. The idea failed a simple sanity check: just going to Google Scholar, doing a generic search, and reading randomly selected papers from the past 15 years or so. It turned out most of them were bogus in some obvious way. A lot of ideas for science reform take it as axiomatic that the bad stuff is rare and just needs to be filtered out. Once you engage with a field's literature in a systematic way, it becomes clear that it's more like searching for diamonds in the rough than filtering out occasional corruption. At that point you have to wonder: why bother? There is no alchemical algorithm that can convert intellectual lead into gold. If a field is 90% bogus, it just shouldn't be engaged with at all.
raddan an hour ago
Interesting idea. How do you distinguish between a critical and an uncritical citation? It's also a little thorny: if your related work section just describes published work (a common form of reviewer-proofing), is that a critical or an uncritical citation? It seems a little harsh to ding a paper for that.
elzbardico an hour ago
Those citation rings are becoming rampant in my country, along with author-count inflation.
boelboel 4 hours ago
Going to conferences and seeing researchers who have built careers on subpar (sometimes blatantly 'fake') work has made me increasingly wary of experts. Worst of all, lots of people just seem to go along with it. Still, I'm skeptical of any system that tries to quantify 'trust'. There's too much on the line for researchers/students/..., to the point where anything will eventually be gamed. There are just too many people trying to get into the system (and getting in is the most important part).