
From NEJM:
Peer Review & Publication Process
NEJM AI editors will review the paper
[...]
Once a manuscript has been peer-reviewed, the NEJM AI editors make the decision to either reject the submission or move the manuscript forward towards publication
Just kidding. The journal is called "NEJM AI", so "NEJM AI editors" are (presumably) humans.
Anyway, publishers have adopted policies pertaining to AI use in peer review, such as this from Nature:
[...] For these reasons we ask that, while Springer Nature explores providing our peer reviewers with access to safe AI tools, peer reviewers do not upload manuscripts into generative AI tools.
But this is a policy against using AI, not a policy to use it. I don't see anything that explicitly calls for AI in peer review.
Creator is inactive, resolving NO.
Reviewers offloading work to LLMs is starting to become a problem:
https://academia.stackexchange.com/questions/204370/what-should-i-do-if-i-suspect-one-of-the-journal-reviews-i-got-is-ai-generated
I think there is a decent chance that some kind of AI-based detection of basic statistical errors will be in place within the next couple of years. I doubt this will supplant traditional review, but I think it should qualify as AI peer review. There are already automated systems at various psych journals for spotting basic errors (not AI-powered, afaik).
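For concreteness, here's a minimal sketch in Python of the kind of non-AI check those existing systems perform (tools along the lines of statcheck recompute a p-value from the reported test statistic and degrees of freedom and flag mismatches). This is my own illustration, not any journal's actual pipeline; the function name and tolerance are arbitrary choices.

```python
# Sketch of a statcheck-style consistency check (illustrative only):
# recompute a two-tailed p-value from a reported t statistic and df,
# then compare it against the p-value stated in the paper.
from scipy import stats

def check_t_test(t_value: float, df: int, reported_p: float, tol: float = 0.005) -> bool:
    """Return True if the reported two-tailed p-value matches the recomputed one within tol."""
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)  # two-tailed p from |t| and df
    return abs(recomputed_p - reported_p) <= tol

# Example: a result reported as "t(28) = 2.10, p = .045" recomputes to roughly .045,
# so it passes; a typo like "p = .005" would be flagged.
print(check_t_test(t_value=2.10, df=28, reported_p=0.045))
```

An "AI-powered" version of this would presumably differ mainly in how it extracts the statistics from the manuscript text, not in the arithmetic itself.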