Open Philanthropy has a nifty online tool that helps you calibrate your forecasts. As part of the tool, the site graphs your results with prediction accuracy on the Y axis and confidence interval on the X axis.
I would like to see a similar graph for my resolved Manifold predictions (preferably with the option of filtering the data set to exclude my own markets, markets with a low number of traders at the time of prediction, etc.). This would help me know whether I'm typically underconfident or overconfident at different confidence thresholds.
https://www.openphilanthropy.org/calibration
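The graph being requested is essentially a calibration curve: bucket your resolved YES/NO predictions by stated probability, then compare each bucket's average stated probability to the fraction of its markets that actually resolved YES. A minimal sketch of that computation (function name, input shape, and bin count are all illustrative assumptions, not Manifold's API):

```python
# Hypothetical sketch of a calibration curve over resolved predictions.
# Input: a list of (stated_probability, resolved_yes) pairs.
# Output: per-bin (mean stated probability, observed YES rate, count).
from collections import defaultdict

def calibration_curve(predictions, n_bins=10):
    bins = defaultdict(list)
    for prob, resolved_yes in predictions:
        # Clamp prob == 1.0 into the top bin instead of creating an 11th bin.
        idx = min(int(prob * n_bins), n_bins - 1)
        bins[idx].append((prob, resolved_yes))
    curve = []
    for idx in sorted(bins):
        entries = bins[idx]
        mean_conf = sum(p for p, _ in entries) / len(entries)
        hit_rate = sum(1 for _, y in entries if y) / len(entries)
        curve.append((mean_conf, hit_rate, len(entries)))
    return curve
```

Plotting `hit_rate` (Y) against `mean_conf` (X) gives the chart: points above the diagonal mean underconfidence at that threshold, points below it mean overconfidence. Filtering out your own markets or thin markets would just mean dropping those pairs before binning.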
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ70
2 | | Ṁ30
3 | | Ṁ14
4 | | Ṁ8
5 | | Ṁ4
We also generate calibration curves at https://www.quantifiedintuitions.org/pastcasting and https://www.quantifiedintuitions.org/calibration :)
Manifold in the wild: A Tweet by Carson Gale
I would love for @ManifoldMarkets to provide users with a 'calibration graph' similar to what OpenPhilanthropy provides in their calibration practice tool. https://manifold.markets/CarsonGale/will-manifold-provide-users-with-a