Will LLaMA be the best open LLM? (2023)
Resolved NO (May 5)


πŸ… Top traders

#NameTotal profit
1αΉ€32
2αΉ€27
3αΉ€12
4αΉ€6
5αΉ€2
predicted NO

β€œnearly all of the training budget was spent on the base MPT-7B model, which took ~9.5 days to train on 440xA100-40GB GPUs, and cost ~$200k”
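A quick sanity check of the quoted figures (the per-GPU-hour rate below is derived, not stated in the source):

```python
# Back-of-envelope check of the quoted MPT-7B training run:
# ~9.5 days on 440 A100-40GB GPUs for ~$200k.
days = 9.5
gpus = 440
cost_usd = 200_000

gpu_hours = gpus * days * 24                 # total GPU-hours consumed
usd_per_gpu_hour = cost_usd / gpu_hours      # implied cloud rate

print(f"{gpu_hours:,.0f} GPU-hours")         # 100,320 GPU-hours
print(f"${usd_per_gpu_hour:.2f}/GPU-hour")   # $2.00/GPU-hour
```

The implied rate of roughly $2 per A100-hour is consistent with the quoted numbers being a plausible cloud-pricing estimate rather than amortized hardware cost.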

predicted NO

@Gigacasting

β€œWe wanted our model to be capable of code generation”

This sounds amazing; the other open models are terrible at this. I'm glad I bet in your market so I get notified when you find stuff.

predicted YES

Presumably includes fine-tuned versions of LLaMA?

Wild year if something better leaks

predicted NO

PIQA/Winogrande, et al.

Define best?
