Like, what's the point?
You cite stuff because you literally talk about it in the paper. The expectation is that you have read it and that it has influenced your work.
As someone who's been a researcher in the past, with 3 papers published in high impact journals (in chemistry), I'm beyond appalled.
Let me explain how scientific publishing works to people out of the loop:
1. Science is an insanely huge domain. As soon as you drift into any topic, the number of reviewers capable of understanding what you're talking about drops quickly to near zero. Want to talk about the properties of helicoidal peptides in the context of electricity transmission? Small club. Want to talk about some advanced math involving Fourier transforms in the context of ML? Bigger, but still a small club. And when I say small, I mean less than a dozen people on the planet, likely fewer, with the expertise to properly judge it. It doesn't matter what the topic is: at the elite level required to really understand what's going on and catch errors or bs, it's a very small club.
2. The people in those small clubs are already stretched thin. Virtually all of them run labs, so they are already bogged down following their own research, fundraising, and coping with teaching duties (which they generally despise; very few good scientists are more than mediocre teachers, and they already have huge backlogs).
3. With AI this is a disaster. If having to review slop for your bs internal tool at your software job was already bad, imagine having to review slop in highly technical scientific papers.
4. The good news? Because these clubs are relatively small, people pushing slop will quickly find their academic opportunities even more limited. So the incentives for proper work are hopefully there. But if Asian researchers (yes, no offense) were already spamming half the world's papers with cheated slop (non-reproducible experiments) in a desperate bid to publish before, I can't imagine what it will be like now.
Hmm, I follow the argument, but it's inconsistent with your assertion that there is going to be incentive for 'proper work' over time. Anecdotally, I think the median quality of papers from middle- and top-tier Chinese universities is improving (your comment about 'asian researchers' ignores that Japan, South Korea, and Taiwan have established research programs at least in biology).
The urge to cheat in order to get a job, a promotion, approval. The urge to do stuff you are not even interested in, just to look good on a resume. And to some extent I feel sorry for these people. At the end of the day you have to pay your bills.
All those people can go work for private companies, though few will be hired as scientists rather than as technicians or QA.
I've noticed this already with Claude. Claude is so good at code and technical questions... but frankly it's unimpressive at nearly anything else I have asked it to do. Anthropic would probably be better off putting all of their eggs in the one basket they're actually good at.
All the more reason the quest for AGI is a pipe dream. The future is going to be very divergent AI/LLM applications, each marketed and developed around a specific target audience and priced according to the value it delivers.
This is all pageantry.
"I know nothing but had an idea and did some work. I have no clue whether this question has been explored or settled one way or another. But here's my new paper claiming to be an incremental improvement on... whatever the previous state of understanding was. I wouldn't know, I haven't read up on it yet. Too many papers to write."
We removed the authorship of a former co-author on a paper I'm on because his workflow was essentially this, with AI-generated text, plus a not-insignificant amount of straight-up plagiarism.
He didn't even open a single one of the papers to look at them! He just declared one irrelevant without even opening it.
At the end of the day, it's all about the incentives. Can we have a world where we incentivize finding the truth rather than just publishing and getting citations?
I thought this was introduced by the NSA some time ago.
Fuck A.I. and the collaborators creating it. They've sold out the human race.