Right now the API is quite slow to query, and it's faster to query Supabase directly or to scrape the web interface.
This market resolves to YES if there's no longer a significant (speed-based) reason to do any of those things, and we can just use the API.
Right now I can get e.g. 100 markets in one call from Supabase in about 1s, but they take ~40s total with the API, fetching them one at a time. Here's a script demonstrating this:
API: got 100 markets in 37.4s
supabase: got 100 markets in 0.8s
https://gist.github.com/chrisjbillington/fa4bb2ef169c05d5b625ab3ab899aafc
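For reference, here's a rough sketch of the kind of comparison the gist makes (not the gist's actual code). The Supabase project URL, anon key, table name and the market id list are placeholders, not verified values:

```python
# Rough sketch of the timing comparison; all Supabase details and the
# market id list below are placeholders, not verified values.
import time
import requests

API = "https://api.manifold.markets/v0"
SUPABASE_URL = "https://<project>.supabase.co/rest/v1"  # placeholder
SUPABASE_KEY = "<anon key>"                             # placeholder

market_ids: list[str] = []  # fill with ~100 market ids obtained elsewhere

# API: one GET /v0/market/<id> request per market, sequentially
start = time.monotonic()
markets = [requests.get(f"{API}/market/{mid}").json() for mid in market_ids]
print(f"API: got {len(markets)} markets in {time.monotonic() - start:.1f}s")

# Supabase: one batched query using PostgREST's in.(...) filter
start = time.monotonic()
resp = requests.get(
    f"{SUPABASE_URL}/contracts",  # table name is a placeholder
    params={"id": f"in.({','.join(market_ids)})", "select": "*"},
    headers={"apikey": SUPABASE_KEY},
)
print(f"supabase: got {len(resp.json())} markets in {time.monotonic() - start:.1f}s")
```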
You can get at most ~300 markets at once this way with Supabase before the URL with the id parameters gets too long and the server complains.
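If you need more than ~300, the obvious workaround is to chunk the id list across multiple requests. A minimal sketch of that, reusing the same placeholder URL, key and table name as above (the exact batch size that fits depends on id length):

```python
# Chunk the id list so each request's URL stays under the length limit.
import requests

SUPABASE_URL = "https://<project>.supabase.co/rest/v1"  # placeholder
SUPABASE_KEY = "<anon key>"                             # placeholder

def fetch_markets_batched(market_ids, batch_size=300):
    markets = []
    for i in range(0, len(market_ids), batch_size):
        batch = market_ids[i:i + batch_size]
        resp = requests.get(
            f"{SUPABASE_URL}/contracts",  # table name is a placeholder
            params={"id": f"in.({','.join(batch)})", "select": "*"},
            headers={"apikey": SUPABASE_KEY},
        )
        resp.raise_for_status()
        markets.extend(resp.json())
    return markets
```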
Of course I could have my code asynchronously issue a hundred or whatever API requests at once before waiting for the responses; that might work. Not sure when I'd run afoul of rate limits, and the code for that is a bit more annoying. But that's what I'd try if I couldn't use Supabase anymore.
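For what it's worth, here's roughly what that async approach could look like, using aiohttp with a semaphore to cap concurrency. The limit of 20 concurrent requests is a guess, not a documented Manifold rate limit:

```python
# Sketch: fetch markets concurrently from the API instead of one at a time.
import asyncio
import aiohttp

API = "https://api.manifold.markets/v0"

async def fetch_all(market_ids, max_concurrent=20):
    sem = asyncio.Semaphore(max_concurrent)  # guessed concurrency cap

    async def fetch_one(session, mid):
        async with sem:
            async with session.get(f"{API}/market/{mid}") as resp:
                resp.raise_for_status()
                return await resp.json()

    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(fetch_one(session, mid) for mid in market_ids))

# usage: markets = asyncio.run(fetch_all(market_ids))
```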
Edit: this is speed-related in the sense of how long it takes my code to run, but unrelated to the API data being out of date; that's a whole other thing.