By 2024, a significant fraction of philosophers (>20%) take seriously the notion that language models with a size and architecture similar to GPT-3 are partially conscious
Ṁ520 · Ṁ1k · Resolved NO on Jan 2
Resolution will be judged very subjectively, based on where the industry discussion stands and informed by any relevant surveys. I will try to be as fair as possible.
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ52 |
| 2 | | Ṁ15 |
| 3 | | Ṁ14 |
| 4 | | Ṁ14 |
| 5 | | Ṁ10 |
Comments
@IsaacKing "Significant" here would be >20%, as stated in the question. Right now I would estimate that it's <1%.
Related questions
Will we fully interpret a GPT-2 level language model by 2028?
13% chance
Will certain contemporary publicly available GPT models be generally accepted as conscious at inference time by 2100?
31% chance
Will LLMs such as GPT-4 be seen as at most just a part of the solution to AGI? (Gary Marcus GPT-4 prediction #7)
86% chance
By 2030, will large language models still be at the peak of AI? [DRAFT]
25% chance
Will multiple credible sources claim GPT-5 is sentient?
11% chance
