
on a similar scale to how, in 2023, videogame addiction and social media addiction are a thing.
it might get folded into the existing conditions. i'm looking for "will AI-generated content be at least as bad an addiction for comparable number of people as videogames or social media currently are".
if it exists but i judge it as <10x as bad, it doesn't count. above that, it does count.
proxies:
- published papers discussing addictiveness of social media/videogames
- qualitative/quantitative data points showing it's a problem on a similar or larger scale
- case reports
resolution will be at least partially subjective. i will not trade on this market.
🏅 Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ723 |
| 2 | | Ṁ205 |
| 3 | | Ṁ177 |
| 4 | | Ṁ176 |
| 5 | | Ṁ165 |
Alright, it says it's subjective so here goes:
- published papers discussing addictiveness of social media/videogames
Paper link to arXiv:
This sounds like an almost perfect match for the market criteria. But it appears to be a preprint or not yet published work. Seems like we can't use it.
- case reports
I have seen several instances of "AI addiction" in various types of reporting, but nowhere near 10x more of them than I have seen of other types of addiction in the past.
- qualitative/quantitative data points showing it's a problem on a similar or larger scale
Couldn't find anything yet.
---
What it comes down to for me is this:
> if it exists but i judge it as <10x as bad, it doesn't count. above that, it does count.
That is a super high bar. I judge it as <10x as bad as of the end of 2025. Resolving No on behalf of the disappeared creator.
My read is the creator would say "It's totally picking up steam and is a real thing at the end of 2025 but is not 10x as bad as video games yet".
@agentydragon if you show up again you can certainly re-resolve this.
I assume how widespread the phenomenon is factors into its "severity"? It took 10+ years from the time that video games and social media reached their modern forms for these kinds of addictions to be written about widely. So this market will only resolve yes if the same thing happens with AI in ~1/5th the time?
@jonsimon Basically I'm trying to understand what happens if by the time the market resolves, these problems aren't especially bad or widespread, but we're clearly on a trajectory where they will be within a few more years.
@jonsimon If the trajectory is not yet Bad but is looking like Bad in the "near future"™, I'd probably also resolve Yes.
Thinking about this, a lot depends on the definition of "AI-generated content" and on whether AI needs to be somehow fundamental to the addiction. Example 1: it is completely plausible that some company will use AI to augment their online gambling platform; the core mechanic causing the addiction would be gambling, with AI adding some extra flair. Example 2: somebody uses AI to generate pornography. Pornography addiction seems to be a thing, so plausibly somebody will get addicted to this form of pornography. The addiction is then fully to AI-generated content, but it is unlikely that the effect differs substantially from "normal" pornography addiction.
Will this resolve yes in such a case? Or does it need to be the case that AI increased the addictiveness of those "products" / created a new type of content to be addicted to?
@MartinModrak AI-generated porn is (mostly) what I'm betting on. I do expect this to increase the severity of porn addictions.
good questions. i'm not sure what the best resolution criteria would be to make this a binary question. i'm interested in ideas for how to better delineate.
definitely yes: addiction to some entirely new form of media content that is made possible via AI generation. not sure what would be an example.
definitely yes: many people addicted to videogames or media or porn or ... which they consume overwhelmingly in AI-generated form. the more customized / optimized for the specific individual, the more centrally "yes".
definitely no: "just the current status quo" - internet supporting niche interests and content recommenders lead people to addictive content that is human-produced
most likely no: people are addicted to stuff that's produced by humans (not generative AI), but e.g. it's the AI which tells people "hey you should make a movie about a levitating eggplant people will love it" or writes the script.
central yes examples i'd expect to involve people spending a lot of time interacting with AI-generated audiovisuals that are highly customized for the individual in a tight feedback loop.
@agentydragon That's good enough for me. It might be useful to modify the description to indicate that the resolution will be partly subjective (and therefore probably also commit to not trading yourself on the market)