I will resolve this based on some combination of how much it gets talked about in elections, how much money goes to interest groups on both topics, and how much of the "political conversation" seems to be about either.
@benjaminIkuta This looks like misinformation to me? It cites a WSJ Feb 21-28 poll (which I believe is https://s.wsj.net/public/resources/documents/WSJ_Partial_Results_Feb_2024.pdf), but the numbers don’t match, and the poll doesn’t split by gender…
This seems really high; any novel issue becoming as politically large as abortion within 4 years would be surprising.
I just don't see how you can get both parties to furiously disagree on AI fast enough for that, especially among old politicians who probably won't have a deep enough understanding to be aggressively for or against it.
The major parties may not have distinct disagreements on AI policy yet, but they do have different economic views, views about the role of government, etc.
So if AI, say, leads to job losses, the parties will disagree on what needs to be done about that, because that's downstream of their economic views.
My current thinking is:
1. AI will lead to massive economic, cultural, and societal changes over the coming years.
2. We will need to create new laws & figure out how to organise society in light of these changes (much as we have for previous technological revolutions).
3. This is an inherently political process, and given its magnitude it will therefore be the biggest political issue.
The public’s priorities can change very quickly.
e.g., terrorism became the biggest issue practically overnight after 9/11.
9/11 is not the same as AI taking jobs. The attacks happened in a single day, but job loss will occur over many years, if at all. I think it's much more likely that AI leads to an increase in productivity in fields that are not currently saturated with productivity, so almost no jobs are lost.
Cultural changes will be very minimal for a large number of fields (anything involving physical labor).
I used the extreme example of 9/11 to counter your point that it's unlikely for AI to become a significant political issue in 4 years (i.e., if it can happen overnight, it can certainly happen over 4 years).
I suppose the question fundamentally comes down to AI timelines. If AGI is created by 2028, then given its generalizable intelligence it would follow that it would affect a broad range of jobs, since it would be effectively as capable as a human.
Going back to your first comment, the reason this market is priced so high is that many people believe AGI, or something close to it, is possible by 2028. If AI turns out to be just a tool/productivity enhancer, that's conditional on the next generation of models (GPT-5, GPT-6) plateauing in their rate of improvement.
The question as currently phrased offers too many outs for this to be a coin flip.
Even setting aside the mainline NO case where 2028 is simply too soon, contention could peak pre-2028 and then cool off, or AI could transform its own perceived issues by then, such that the AI origin of the issue is a foregone conclusion and we're arguing over a distant descendant ("the internet" is not the reason social media ends up under the magnifying glass, though it's a practical root cause).
IMO, AI's disruptive potential lies in incremental, widespread integration rather than sudden upheaval. This integration will likely occur below the threshold of political panic, particularly among the masses (who are easily swayed, and AI as a flashpoint doesn't serve big tech; they'll find a way to fine-tune the hype-to-alarm ratio).
Expectation: AI becomes too normalized to generate sustained controversy.
Massively underpriced. We've already seen strikes in entertainment, and now that capabilities have advanced, the same crowd plus more are starting to develop strong disdain and even hatred for anything created by AI. I saw the reaction to a gallery display of AI art on Twitter, plus audible boos at SXSW. That's just art.
When it becomes a misinformation and privacy concern, it will be a general public issue. It's just starting.
No need to intellectualize this. It's about the visceral emotional response AI is evoking from people; it will be an issue.
https://variety.com/2024/tv/news/sxsw-audiences-boo-videos-artificial-intelligence-ai-1235940454/
@SimranRahman I don't think artists on Twitter are a good source to extrapolate what normal people are going to think.
@nikki @ShakedKoplewitz I can use Twitter to provide evidence while also knowing that it's obviously not representative of the current, real population...an objective and obvious fact doesn't need to be stated. ☺️
However, I hail from VC and I'm always searching for early indicators and signals, which is exactly what Twitter can provide. This market resolves in 2028. So again: what makes something an issue as big as abortion? Two things, in my observation: 1) it evokes a strong emotional response, just like the issue of abortion does, and/or 2) it affects the day-to-day life of citizens (which I believe is the sentiment of Nikki's reply). Productivity gains from AI will affect the entire economy and job market (issue #3 in that graph): not today (the problem with that graph), but in the future and starting today (my argument and the spirit of this market). Here's a detailed analysis from the IMF on the impact of AI on the global economy, since we're nitpicking on sources here. (IMF 01/2024)
Now please counter the actual content of the argument. 😇
I also network heavily in my city of business, a major metropolitan city of the US, top 10 and the fastest growing by % population change in 2023. Every senior executive I speak with about AI has formed an opinion on it, as has every citizen I've spoken to. It's rarely neutral; it's some form of positive or negative. Try it and report back here. I'm genuinely curious. This market focuses on the political conversation, which requires temperature-testing the opinions of the public, either online or in person.
If you disagree with me, you are welcome to bet Mana and prove your conviction in your own opinions on what the conversation around AI will look like in 2028.
All the best 👍
@nikki I made a market based on Gallup's polling! It might be good for the people who are worried about the resolution of this market being subjective.
@SimranRahman To be explicit, I think it's a poor indicator here because this is the sort of issue that seems tailored to appeal to the Twitter artist type (people who spend a lot of time online making money off selling art and building a personal brand) in a way that doesn't extrapolate to normal people (see also how Twitter artists care a lot more about things like how much streaming services pay musicians, but that's never gone mainstream).
To the degree that normal people probably will worry more about AI over time, that worry is more likely to be about existential risk, gradual loss of control, or actual large-scale unemployment, which would start out being raised by different people with different attitudes.
@ShakedKoplewitz Thanks for clarifying your position. I can also clarify my full analogy and the reason I find this extreme emotional response compelling as evidence.
I was essentially drawing a parallel to the rhetoric of the extreme sides of the abortion debate (e.g., murder, lives/livelihoods; the list goes on). I am fully aware that most people in the AI discussion, and in parallel the abortion discussion, do not take extreme positions, though they may have an opinion. But the fact that there are people on both sides who do is what makes it a major issue. This strong polarization is what constitutes a political issue.
There are people in the abortion discussion who invoke God/religion as the pure and sole justification for their argument; there are people who invoke data from the ACLU to argue their position; there are people who are genuinely neutral on the topic. They are ALL part of the same political discussion when it comes to abortion.
Abortion is a major issue precisely because there are people who feel strongly about it, who are loud, and who take action to advocate for legislation for or against it. My position is that if this emotional-response criterion is met for any given issue, along with the real-life impact brought up above, you will have a major political issue. I believe AI will trigger enough of an emotional response and real-life impact by 2028 that we will see clear polarization on the topic.
Thanks
@nikki Yeah, these people live in an echo chamber where everyone talks about AI all the time. Meanwhile, the average person goes to church and hasn't even tried ChatGPT.
@Snarflak AI is a small political issue now. I hold YES not because I see people concerned with AI now, but because I believe that a lot will change over the next 4 years that will make people concerned about AI-related issues.
Church/religious center attendance vs ChatGPT usage. Echo chamber?
I don't particularly care to debate the state of religious attendance, but your point that this is the "real" state of affairs, and the attempt to claim AI is an internet-only bubble, is objectively false. Usage is only growing, and as adoption increases across academia, industry, and government, the issue will be at the forefront of the American mind. Cheers 😇
AI: [chart]
Religion: weekly attendance https://www.graphsaboutreligion.com/p/how-many-weekly-attenders-are-there
If you can stand in front of graphs, charts, linked sources, and my multiple replies addressing all the raised claims, versus your single statement with no background data... and still call my statement the opinion, there's not much else to say! 💁♀️