Buy/sell these outcomes until their probabilities match what you consider appropriate.
Mar 25, 6:05am: What are the probabilities of these AI outcomes? → What are the probabilities of these AI outcomes (X-risk, dystopias, utopias, in-between outcomes, status quo outcomes)?
@connorwilliams97 This is one I’d be really concerned about (perhaps instead of wiping out jobs outright, we might see wages fall behind GDP growth), but I think the one-decade timeline is too short. I’d put it at 30-100 years.
@connorwilliams97 This is a parimutuel market, a type which is now deprecated; I'd consider remaking it with the new and much-improved multiple-choice market type.
@IsaacKing Obviously if A happens, this won't resolve, since there will be no one around to resolve it. Otherwise, my first attempt at resolution criteria: the market would resolve after X years of relative steady state in one of the other outcomes. Most of the outcomes are pretty easy to differentiate. Differentiating F from G would be based on a thorough review of public opinion metrics combined with social-scientific evidence. Differentiating E from F would be based on whether very powerful people/AIs have seriously talked about it; it being a popular conspiracy theory wouldn't be enough.
@NLeseul If the UBI causes labor participation to drop, and the drop can clearly be traced to the UBI in particular rather than to AI, that's G. But if AI causes labor participation to drop and the UBI is introduced in response, that's very definitively F.
If AI-based nanoreplication puts most people out of work, that's also F.
The core tenet of G is that AI somehow doesn't reduce labor force participation significantly. That's the exact thing that differentiates it from F.
@MrMayhem Ugh. I definitely meant to type "some reason UNrelated to AI" there.
Yes, I agree that if AI is clearly the catalyst that leads to other social or technological changes, then that would indicate F.
@MarkIngraham X. AI servers use up the remaining economically viable oil; nerds are forced to debate alignment issues in person.