How Will the LLM Hallucination Problem Be Solved?
4%
9%: Vector Embeddings (as with Pinecone https://www.pinecone.io/)
1.2%: Filtering (as with Deepmind AlphaCode https://www.deepmind.com/blog/competitive-programming-with-alphacode)
0.1%: Ensemble Combined with Fine Tuning
0.6%: Joint Embedding Predictive Architecture (https://arxiv.org/pdf/2301.08243.pdf)
0.3%: Feed Forward Algorithms (https://www.cs.toronto.edu/~hinton/FFA13.pdf)
19%: Bigger model trained on more data + RL
1.6%: Vigger models + prompt engineering
43%: It won't be
22%: Giving all LLMs access to the internet and databases of scientific papers
By the year 2028, how will the Hallucination Problem have been solved for the vast majority of applications out there?
This question is managed and resolved by Manifold.
@LukeFrymire this would imply that either the options in my market are overvalued or your market is overvalued.
@PatrickDelaney I think the resolution criteria are fairly different. Mine requires that a scale-based solution is possible; yours requires it to be the primary method in production.
@VictorLevoso Ugh, I hit "v" instead of "b" on the keyboard and didn't look at the question properly before clicking submit, and now I can't edit it or erase it.
Related questions
Will LLM hallucinations be a fixed problem by the end of 2028? (50% chance)
Will LLM hallucinations be a fixed problem by the end of 2025? (22% chance)
LLM Hallucination: Will an LLM score >90% on SimpleQA before 2026? (55% chance)
Will scaling current methods be enough to eliminate LLM hallucination? (15% chance)
Will an LLM be able to solve a Rubik's Cube by 2025? (69% chance)
Will LLM hallucinations be "largely eliminated" by 2025? (10% chance)
Will an LLM be able to solve confusing but elementary geometric reasoning problems in 2024? (strict LLM version) (14% chance)
Will hallucinations (made up facts) created by LLMs go below 1% on specific corpora before 2025? (41% chance)
Will LLMs mostly overcome the Reversal Curse by the end of 2025? (66% chance)
Will an agentized LLM cause some chaos? (54% chance)