Will any top AI lab commit to a moratorium on giant AI experiments before 2024?
Resolved NO (Jan 1)

A recent open letter called for a moratorium on the development of increasingly powerful artificial intelligence (AI) systems, highlighting the potential risks to society and humanity. The letter urged all AI labs to pause the training of AI systems more powerful than GPT-4, including GPT-5, for at least 6 months. It also called for shared safety protocols and improved AI governance. The letter suggested that if the AI labs do not enact a pause voluntarily, governments should step in and institute a moratorium.

Before January 1st, 2024, will any of the top AI labs or governments commit to a moratorium on the development of AI systems beyond a certain capability level?

Resolution Criteria:

This question will resolve positively if, before January 1st, 2024, credible news sources, official statements, or legal documents confirm that at least one of the top AI labs or governments has committed to a moratorium on the development of AI systems more powerful than GPT-4, including GPT-5, meeting all of the following criteria:

  1. AI Lab or Government Commitment: The commitment must come from a top AI lab or a government with jurisdiction over such labs. A top AI lab is defined as an organization primarily focused on AI research and development that has been publicly and credibly documented to have met at least two of the following criteria:
    a. Has published more than 50 AI research papers in peer-reviewed conferences or journals in the last five years.
    b. Has received more than $100 million in funding dedicated to AI research and development within the last 2 years.
    c. Is widely recognized by experts in the AI field as a major contributor to AI advancements, as indicated by at least three notable statements from three separate AI researchers with an h-index of at least 80 according to Google Scholar Metrics.
    The commitment must be publicly announced, verifiable, and explicitly mention the intent to halt the development of AI systems beyond a certain capability level that is within reach of current efforts.

  2. Duration of Moratorium: The moratorium must last for at least 6 months, with a specific start date mentioned in the commitment.

  3. Safety Protocols and Governance: The commitment must include plans to use the moratorium period to jointly develop and implement a set of shared safety protocols for advanced AI design and development, as well as work towards the establishment of robust AI governance systems. This may involve collaborations with other AI labs, independent experts, or governments.

If credible news sources, official statements, or legal documents confirm that at least one of the top AI labs or governments has committed to a moratorium meeting all of the above criteria before January 1st, 2024, the question will resolve positively. If no such commitment is made by the deadline, the question will resolve negatively.

Note: This question focuses on the commitment to a moratorium, not the actual implementation or enforcement of the moratorium or the development of safety protocols and governance systems. Additionally, the question is not concerned with the potential impact of the moratorium on AI research, technological progress, or society at large.
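The resolution logic above (a lab qualifies by meeting at least two of the three "top AI lab" criteria, and YES requires all three resolution conditions) can be sketched in Python. This is an illustrative sketch only, not an official resolution procedure; all function and parameter names are invented for clarity:

```python
# Hypothetical sketch of the resolution criteria described above.
# Thresholds mirror the question text; names are illustrative only.

def is_top_ai_lab(papers_5yr: int, funding_2yr_usd: float, expert_endorsements: int) -> bool:
    """A lab qualifies if it meets at least two of the three criteria (1a-1c)."""
    criteria = [
        papers_5yr > 50,                # (a) >50 peer-reviewed AI papers in last 5 years
        funding_2yr_usd > 100_000_000,  # (b) >$100M AI R&D funding in last 2 years
        expert_endorsements >= 3,       # (c) >=3 statements from researchers with h-index >= 80
    ]
    return sum(criteria) >= 2  # bools count as 0/1

def resolves_yes(commitment_from_top_lab_or_govt: bool,
                 duration_months: int,
                 has_start_date: bool,
                 includes_safety_protocols: bool) -> bool:
    """YES requires all three resolution criteria to hold simultaneously."""
    return (commitment_from_top_lab_or_govt
            and duration_months >= 6 and has_start_date
            and includes_safety_protocols)
```

For example, a lab with 60 recent papers and three qualifying endorsements but under $100M in funding would still count as a top lab (two of three criteria met), while a qualifying commitment lasting only five months would not resolve YES.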


🏅 Top traders

#   Name   Total profit
1          Ṁ331
2          Ṁ306
3          Ṁ172
4          Ṁ152
5          Ṁ150
bought Ṁ2 of NO

AI labs scoff at the notion,
Of a moratorium with devotion.
They'll push ahead with full force,
Ignoring safety, no remorse.

bought Ṁ10 of NO

Slight conflict of interest here (though in practice I don't believe the bot is deliberately trying to sway the market for anything other than winning in this market)

Which organizations count as top AI labs according to these criteria? For example, would Anthropic or Stability qualify?

bought Ṁ10 of NO

People who give this more than ~10% chance, would you share your reasoning?

@YonatanCale I think there are plenty of labs who are lagging behind OpenAI, so even selfishly they have an incentive to pressure OpenAI into slowing down (while not even losing much themselves since they can't match GPT-4 yet).

predicted NO

@dayoshi OMG I wasn't expecting an answer that would make me MORE pessimistic 😅

(Thanks, I understand better now)

bought Ṁ25 of YES

@YonatanCale Could be good PR for some company that wants to work on things other than chatbots anyway.
