Will LLaMA be the best open LLM? (2023)
Resolved NO (May 5)


πŸ… Top traders

#  Total profit
1  αΉ€32
2  αΉ€27
3  αΉ€12
4  αΉ€6
5  αΉ€2
predicted NO

β€œnearly all of the training budget was spent on the base MPT-7B model, which took ~9.5 days to train on 440xA100-40GB GPUs, and cost ~$200k”
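A quick sanity check of those figures (the GPU count, duration, and cost are from the quote; the per-GPU-hour rate is derived, not a quoted price):

```python
# Back-of-envelope check of the quoted MPT-7B training cost.
gpus = 440                      # A100-40GB, from the quote
days = 9.5                      # training time, from the quote
cost_usd = 200_000              # from the quote

gpu_hours = gpus * days * 24    # ~100,320 GPU-hours
print(f"${cost_usd / gpu_hours:.2f} per A100-hour")  # ~$1.99, a plausible cloud rate
```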

predicted NO

@Gigacasting

β€œWe wanted our model to be capable of code generation”

This sounds amazing, the other open models are terrible at this. I'm glad I bet in your market so I get notified when you find stuff.

bought αΉ€65 of NO
predicted YES

Presumably includes fine-tuned versions of LLaMA?

sold αΉ€1 of NO

Wild year if something better leaks

predicted NO

PIQA/Winogrande et al

define best?
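For what it's worth, β€œbest” on benchmarks like PIQA and Winogrande usually means highest zero-shot accuracy, scored by ranking each answer choice by its log-likelihood under the model. A minimal sketch of that scoring, assuming a Hugging Face causal LM (the model name and the toy item below are placeholders, not from this thread):

```python
# Sketch: zero-shot multiple-choice scoring by log-likelihood,
# the standard way PIQA/Winogrande-style benchmarks are evaluated.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "huggyllama/llama-7b"  # placeholder model for illustration
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def choice_logprob(context: str, choice: str) -> float:
    """Sum of token log-probs of `choice` given `context`."""
    ctx_ids = tok(context, return_tensors="pt").input_ids
    full_ids = tok(context + choice, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits            # [1, T, vocab]
    n_ctx = ctx_ids.shape[1]
    # Shift by one: logits at position i predict token i+1.
    logprobs = torch.log_softmax(logits[0, :-1], dim=-1)
    targets = full_ids[0, 1:]
    rows = torch.arange(n_ctx - 1, targets.shape[0])
    return logprobs[rows, targets[n_ctx - 1:]].sum().item()

# A PIQA-style item (made up): pick the more likely completion.
goal = "To keep bread fresh longer,"
choices = [" store it in the freezer.", " store it in direct sunlight."]
pred = max(range(len(choices)), key=lambda i: choice_logprob(goal, choices[i]))
print(choices[pred])
```

In practice people run EleutherAI's lm-evaluation-harness rather than hand-rolling this, but the underlying comparison is the same.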