Will an LLM better than gpt3.5 run on my rtx 3090 before 2025?
Ṁ997 · Jan 1
92% chance
Inspired by these questions:
/sylv/an-llm-as-capable-as-gpt4-will-run-f290970e1a03
/sylv/an-llm-as-capable-as-gpt4-runs-on-o
Resolution criteria (provisional):
Same as /singer/will-an-llm-better-than-gpt4-run-on, but replace "gpt4" with "gpt3.5".
This question is managed and resolved by Manifold.
@ProjectVictory I noticed that tie, yeah. I'm not sure how to deal with the case of quantized models. EDIT: see below
@ProjectVictory
This is what I'm thinking of doing:
For a quantized model to be eligible, its score on the Winograd Schema Challenge must not differ by more than 2% from the original (unquantized) model's score.
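The proposed rule can be sketched as a simple check. This is a minimal illustration, assuming the 2% tolerance is relative to the original model's score (the comment doesn't specify relative vs. absolute); the function name and the example scores are hypothetical.

```python
def quantized_model_eligible(original_wsc: float,
                             quantized_wsc: float,
                             tolerance: float = 0.02) -> bool:
    """Return True if the quantized model's Winograd Schema Challenge
    score is within `tolerance` (relative) of the original's score."""
    return abs(quantized_wsc - original_wsc) <= tolerance * original_wsc

# Hypothetical scores, for illustration only:
print(quantized_model_eligible(0.85, 0.84))  # small drop -> True
print(quantized_model_eligible(0.85, 0.80))  # large drop -> False
```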
Related questions
Will an LLM better than gpt4 run on my rtx 3090 before 2025?
7% chance
an LLM as capable as GPT-4 runs on one 3090 by March 2025
30% chance
an LLM as capable as GPT-4 runs on one 4090 by March 2025
31% chance
Will there be an open source LLM as good as GPT4 by the end of 2024?
68% chance
Will there be an open source LLM as good as GPT4 by June 2024?
18% chance
Will a 15 billion parameter LLM match or outperform GPT4 in 2024?
24% chance
Will a Mamba-based LLM of GPT 3.5 quality or greater be open sourced in 2024?
79% chance
Which next-gen frontier LLMs will be released before GPT-5? (2025)
Will an open-source LLM beat or match GPT-4 by the end of 2024?
81% chance
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
53% chance