In what year will AI achieve a gold medal on the International Mathematical Olympiad?
2025-2026: 26%
2027-2028: 27%
2029-2030: 16%
2031-2032: 16%
2033-2034: 16%

Background

The International Mathematical Olympiad (IMO) is a global high-school mathematics competition consisting of six proof-based problems over two days, each worth 7 points for a total of 42. Gold medals go to roughly the top 8% of contestants, and the exact cutoff score varies from year to year; this market treats a score of at least 75%, i.e. 32/42 points (0.75 × 42 = 31.5, rounded up to the next whole point), as gold-level performance.

MathArena is an open evaluation platform that tests LLMs on brand-new competitions, including IMO 2025, immediately after the problems are released, so the problems cannot have appeared in any model's training data. Its IMO 2025 evaluation used best-of-32 sampling with anonymized grading by expert IMO-level judges, and the results are published on a public leaderboard. The best-performing LLM there (Gemini 2.5 Pro) scored only about 31% (roughly 13/42), well below the bronze cutoff and far from gold.
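
The "best-of-32" protocol is the generic best-of-n idea: sample many candidate solutions per problem and keep the one that grades highest. The Python sketch below illustrates only that idea; generate_solution and judge_score are hypothetical placeholders, since in MathArena's actual evaluation the grading is done by human judges rather than code.

```python
import random

# Minimal sketch of best-of-n sampling (n = 32 in MathArena's IMO evaluation).
# generate_solution and judge_score are hypothetical stand-ins, not MathArena's
# real pipeline; actual solutions are graded anonymously by expert human judges.

def generate_solution(problem: str) -> str:
    """Placeholder for drawing one LLM sample."""
    return f"candidate proof #{random.randint(0, 10**6)} for: {problem}"

def judge_score(solution: str) -> int:
    """Placeholder for grading on the 0-7 IMO scale (random stand-in)."""
    return random.randint(0, 7)

def best_of_n(problem: str, n: int = 32) -> tuple[str, int]:
    """Sample n candidate solutions and return the highest-scoring one."""
    candidates = [generate_solution(problem) for _ in range(n)]
    return max(((c, judge_score(c)) for c in candidates), key=lambda pair: pair[1])

if __name__ == "__main__":
    _, score = best_of_n("IMO 2025, Problem 1")
    print(f"best of 32 samples scored {score}/7 on this problem")
    print(f"reported total: {13/42:.0%} of 42 points, vs. the {32/42:.0%} gold proxy")
```

Because only the single best of 32 attempts per problem counts, best-of-n scores are an optimistic upper bound on what one unassisted attempt would achieve.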

Resolution Criteria

This question resolves once all of the following conditions are met (a schematic check of the conditions is sketched after the list):

  1. Performance Threshold
    The AI must score at least 32 of 42 points, i.e. at least 75%, which corresponds approximately to gold-medal performance on the IMO.

  2. Public Verification
    Confirmed by either of the following:

    • A public MathArena leaderboard entry (with certified score ≥ 75%), or

    • A peer-reviewed or otherwise well-documented report (e.g., on arXiv or in a reputable scientific journal).

  3. System Autonomy
    The AI must use only automated inference and sampling, with no hidden human guidance during problem solving and no human edits to the submitted solutions.

  4. Timing Constraint
    The evaluation must target an IMO edition held after the model's public release, and the result must be publicly reported within the same calendar year as that IMO.

  5. Expiration

    If no AI system has achieved a gold-medal score under the above conditions by January 1, 2035, the question resolves "Not Applicable."
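
As a compact summary of criteria 1-5, here is a rough Python sketch that encodes the checklist. The Evaluation fields and function names are illustrative assumptions, and actual resolution is a judgment call by the market creator, not the output of any code.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative encoding of the five resolution criteria above.
# All names here are hypothetical; the market is resolved by human judgment.

GOLD_PROXY_POINTS = 32         # criterion 1: >= 32/42, i.e. >= 75%
EXPIRATION = date(2035, 1, 1)  # criterion 5: N/A if not achieved before this date

@dataclass
class Evaluation:
    score: int                     # points out of 42 on one IMO edition
    publicly_verified: bool        # criterion 2: MathArena leaderboard or documented report
    fully_autonomous: bool         # criterion 3: no hidden human guidance or edits
    imo_after_model_release: bool  # criterion 4: IMO edition post-dates the model's release
    reported_same_year: bool       # criterion 4: reported in the same calendar year
    report_date: date

def qualifies(e: Evaluation) -> bool:
    """True if the evaluation would satisfy all five criteria."""
    return (
        e.score >= GOLD_PROXY_POINTS
        and e.publicly_verified
        and e.fully_autonomous
        and e.imo_after_model_release
        and e.reported_same_year
        and e.report_date < EXPIRATION
    )

# Illustration: the MathArena IMO 2025 result (~13/42) already fails criterion 1.
print(qualifies(Evaluation(13, True, True, True, True, date(2025, 7, 21))))  # False
```

A qualifying result would then resolve the question to whichever year range contains that IMO edition.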
