Will @EliezerYudkowsky reverse his opinion on AI safety, before 2030?
11% chance (closes 2030)

For example, if he decides that actually we should try to build ASI even if it means a great risk to the human race. Or if he decides that the creation of ASI doesn't actually pose a great risk to the human race.

  • Update 2025-02-25 (PST) (AI summary of creator comment): Key Resolution Update:

    • The reversal must include an explicit admission that Yudkowsky was wrong about his previous stance on AI safety.

    • Merely adjusting his perspective (e.g., claiming he was only slightly off or that we just got lucky) will not meet the criteria.

    • The explicit admission is the central and decisive component for a valid resolution.




If he dies before we get AGI or ASI, that would mean he has no way of reversing his opinion, and then this would have to resolve NO, right?

@12c498e yep (probably, although there are some complications, e.g. if he gets cryopreserved)


Does it count if he ceases holding AGI doomerist views because, um, well... "paperclips"?


@jim How about if he's still pretty much entirely himself, but a slightly different version of himself that just looooves paperclips? That is to say:

He gazed up at the enormous face. Forty years it had taken him to learn what kind of smile was hidden beneath the dark moustache. O cruel, needless misunderstanding! O stubborn, self-willed exile from the loving breast! Two gin-scented tears trickled down the sides of his nose. But it was all right, everything was all right, the struggle was finished. He had won the victory over himself. He loved Big Brother.


@dph121 that would be a YES! Brainwashing is A-OK.

bought Ṁ10 YES

Does it count if we develop a strong AGI by 2030, it doesn’t lead to doom, and Yudkowsky admits that he was wrong?


@OlegEterevsky yep that would count


@jim What if he says we just got lucky / that he was still mostly right, just with too high a p(doom)?


@DavidHiggs the central part is "Yudkowsky admits that he was wrong".
