Alt text (short for alternative text) is a written description of an image, used by screen readers to make web content accessible to blind users. It’s a core part of digital accessibility — but often missing or low quality.
This market will resolve to YES if, by June 30, 2026, a publicly available AI system can generate alt text that is judged equal to or better than human-written alt text for at least 90% of a diverse set of test images.
✔️ Resolution Criteria:
The benchmark must include at least 100 varied images (e.g. news photos, screenshots, memes, artworks, product images).
Blind evaluators compare the AI-generated and human-written descriptions for each image, judging which gives more usable, accurate, and vivid context.
The evaluation must be replicable and published by a credible organization, researcher, or media outlet.
The AI system must be accessible to the public (via API, app, or website). Open-source or commercial tools both qualify.
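As a rough illustration of how the criteria above could be tallied (this is a hypothetical sketch, not the official resolution procedure; the function name and verdict labels are my own):

```python
# Hypothetical sketch: tally blind evaluators' per-image verdicts and
# check the market's 90% threshold. Not an official resolution procedure.

def ai_meets_threshold(judgments, threshold=0.9):
    """judgments: one verdict per image, each "ai_better", "tie",
    or "human_better". A "tie" counts for the AI, since the criterion
    is alt text judged *equal to or better than* human-written text."""
    if not judgments:
        return False
    wins = sum(1 for j in judgments if j in ("ai_better", "tie"))
    return wins / len(judgments) >= threshold

# Example with a 100-image benchmark: 85 AI wins, 7 ties, 8 human wins.
verdicts = ["ai_better"] * 85 + ["tie"] * 7 + ["human_better"] * 8
print(ai_meets_threshold(verdicts))  # 92/100 = 0.92, so True
```

In this sketch the threshold check is per-image rather than per-evaluator; a real benchmark would also need to specify how multiple evaluators' verdicts on the same image are aggregated.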
I’m launching this as part of the Sight as a Service Forecast Hub, a series of prediction markets tracking key milestones in AI-powered accessibility for blind people.
Thanks for your concern. I agree that having a clear plan for the benchmark is important for this market to resolve properly, and I'm interested in following the development of this project.

Great question! I don't currently have plans to personally run the benchmark, but I agree it's a key challenge for the resolution of this market. My hope is that a credible organization (such as AIRA, Be My Eyes, or a university lab working on accessibility) will take up the opportunity as interest in AI-powered accessibility increases.
That said, I'm also exploring ways to help catalyse such a benchmark by mid-2026. If it becomes clear no one else will do it, I'll consider helping organize a minimum viable benchmark myself to support transparency and replication.
This market exists partly to motivate exactly that kind of structured evaluation—so if you know any orgs or individuals working in this space, I’d love to connect!