If I publish a LessWrong post critiquing part of Ajeya Cotra's draft report on AI timelines, will it receive over 50 karma within 72 hours?
Resolved YES on Feb 26 · Ṁ1,695 traded
I intend to soon publish a critique of one part of Ajeya Cotra's draft report on AI timelines, which can be found here: https://www.lesswrong.com/posts/KrJfoZzpSDpnrv9va/draft-report-on-ai-timelines
**Content of the post**
My critique will focus on her assumption that future hardware progress will resemble past hardware progress; more precisely, that the price of GPU compute will halve (equivalently, that price-performance will double) roughly every 2.5 years over the next 50 years. I think price-performance trends are already fairly close to saturating; or at the least, we should expect medium-term progress to be slower than what we observed in the past. This adjustment implies correspondingly longer and more uncertain AI timelines than her implicit bottom line. It also makes her analysis less informative, since a soft assumption of her report is that, fundamentally, "progress in hardware drives progress in AI"; but it seems we're entering an era in which that's no longer as true as it was before.
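To see the scale of the assumption at stake, here is a minimal arithmetic sketch (my own illustration, not taken from the report): the cumulative cheapening of compute implied by a 2.5-year halving of GPU compute prices.

```python
def implied_price_drop(years: float, halving_period: float = 2.5) -> float:
    """Factor by which the price of compute falls over `years`,
    assuming it halves every `halving_period` years."""
    return 2 ** (years / halving_period)

# Over a 50-year horizon, the assumption implies compute becomes
# 2^20 (roughly a million) times cheaper.
print(implied_price_drop(50))
```

If the halving period instead stretches out as trends saturate, this factor shrinks exponentially, which is the crux of the critique.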
**Resolution criteria**
This question resolves to YES if AT ANY POINT within 72 hours of posting this critique, it receives more than 50 karma on LessWrong. This question resolves to NO otherwise.
Karma is not equivalent to vote count: different users' votes carry different amounts of karma, and strong upvotes convey more karma than weak upvotes. If I do not post this critique to LessWrong before March 1st, then this question resolves to N/A.
Note: This question will not resolve to NO simply because people coordinate to strong downvote me at the 72 hour mark. Recall: the post must simply exceed 50 karma at any point *prior* to 72 hours after posting for this to resolve YES. If the karma later falls below this threshold, it will still resolve to YES.
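The threshold rule above can be sketched as follows. This is my own illustrative sketch, with `karma_history` a hypothetical list of `(hours_since_posting, karma)` observations; the actual resolution is done manually by the market creator.

```python
def resolves_yes(karma_history, threshold=50, window_hours=72):
    """YES if karma exceeds `threshold` at ANY observed point within
    `window_hours` of posting; falling below the threshold later
    (e.g. via coordinated downvotes) does not flip the result."""
    return any(karma > threshold
               for hours, karma in karma_history
               if hours <= window_hours)

# Example: spikes to 55 karma at hour 10, falls back to 40 by hour 71.
# Still resolves YES, because the threshold was crossed inside the window.
print(resolves_yes([(1, 12), (10, 55), (71, 40)]))
```

Note the strict inequality: exactly 50 karma would not suffice, since the criteria require *more than* 50.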
Feb 22, 2:41pm: My LessWrong profile can be found here: https://www.lesswrong.com/users/matthew-barnett
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Name | Total profit |
|---|------|--------------|
| 1 | | Ṁ48 |
| 2 | | Ṁ25 |
| 3 | | Ṁ24 |
| 4 | | Ṁ19 |
| 5 | | Ṁ5 |
**Related questions**

- Will "Catching AIs red-handed" make the top fifty posts in LessWrong's 2024 Annual Review? (13% chance)
- Will "Contra papers claiming superhuman AI forecasting" make the top fifty posts in LessWrong's 2024 Annual Review? (12% chance)
- Will "Shallow review of technical AI safety, 2024" make the top fifty posts in LessWrong's 2024 Annual Review? (27% chance)
- Will "The Field of AI Alignment: A Postmortem, and ..." make the top fifty posts in LessWrong's 2024 Annual Review? (28% chance)
- Will "Introducing AI Lab Watch" make the top fifty posts in LessWrong's 2024 Annual Review? (9% chance)
- Will "A deep critique of AI 2027's bad timeline models" make the top fifty posts in LessWrong's 2025 Annual Review? (18% chance)
- Will "AIs Will Increasingly Attempt Shenanigans" make the top fifty posts in LessWrong's 2024 Annual Review? (12% chance)
- Will "Many arguments for AI x-risk are wrong" make the top fifty posts in LessWrong's 2024 Annual Review? (22% chance)
- Will "AI 2027: Responses" make the top fifty posts in LessWrong's 2025 Annual Review? (16% chance)
- Will "The Case Against AI Control Research" make the top fifty posts in LessWrong's 2025 Annual Review? (26% chance)