Will LLM hallucinations be down to human-expert rate within months?
Resolved NO on Feb 2

Reid Hoffman said:

And there’s a whole bunch of very good R&D on how to massively reduce hallucinations [AI-generated inaccuracies] and get more factuality. Microsoft has been working on that pretty assiduously from last summer, as has Google. It is a solvable problem. I would bet you any sum of money you can get the hallucinations right down into the line of human-expert rate within months. So I’m not really that worried about that problem overall.

Market resolves on 1/31/2024 (a bit over 3 months from now) to the result of a publicized bet made by Reid Hoffman, or at my discretion if no such bet is made.

Comments:

What will be used as the bar for the 'human-expert rate'?

Arb for all of 2024:

@Joshua and for longer timelines