
an LLM as capable as GPT-4 runs on one 4090 by March 2025
35% chance
e.g. WinoGrande >= 87.5%
This question is managed and resolved by Manifold.
Does it count if I can run the LLM with CPU RAM offloading, as Ollama does automatically? (It would be very slow, but it would work.)
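The offloading question matters because the answer hinges on raw VRAM arithmetic: whether a model's quantized weights fit entirely in the 4090's 24 GB, or spill into CPU RAM. A minimal back-of-envelope sketch (assuming weight storage only, ignoring KV cache and activation overhead; the parameter counts below are illustrative, not tied to any specific GPT-4-class model):

```python
VRAM_GB = 24  # RTX 4090 memory capacity

def weights_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate weight storage in GB for params_b billion parameters."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

def fits_on_4090(params_b: float, bits_per_weight: float) -> bool:
    """True if the quantized weights alone fit in 24 GB of VRAM."""
    return weights_gb(params_b, bits_per_weight) <= VRAM_GB

print(fits_on_4090(70, 4))  # 70B at 4-bit: 35 GB -> would need CPU offload
print(fits_on_4090(34, 4))  # 34B at 4-bit: 17 GB -> fits entirely on-GPU
```

Under this rough accounting, a 70B model at 4-bit quantization cannot run fully on-GPU, which is exactly why the offloading interpretation changes what the market resolves on.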
Related questions
China will make an LLM approximately as good as or better than GPT-4 before 2025
89% chance
Will xAI develop a more capable LLM than GPT-5 before 2026
68% chance
Size of smallest open-source LLM matching GPT-3.5's performance in 2025? (GB)
1.83
When will an open-source LLM be released with a better performance than GPT-4?
Will an open-source fully functional Auto-GPT like LLM exist by the end of 2025?
90% chance
GPT-4 performance and compute efficiency from a simple architecture before 2026
19% chance