If the progress of AI experiences a slowdown before 2030, what might be the cause?
- Diminishing algorithmic advancements: 84%
- Strict regulation: 75%
- Hardware limitations: 59%
- Availability of data to train on is a bottleneck: 45%
- Major supply chain disruptions: 45%
- Declining job opportunities in the field: 22%
- Widespread poisoning attacks making it harder to compile good training datasets: 16%

If you have a suggestion for a reason other than those already available as an option, please comment and I will add it. I am not allowing free-response answers, to mitigate spam from low-quality submissions.

Each qualifying option resolves YES or NO where the outcome is obvious; for an ambiguous "sort of but not fully" situation, it resolves PROB (default: 50%).

Multiple slowdowns can happen. For example, if supply chain issues cause it but we recover from those and then start running into hardware bottlenecks, both of those options will be resolved to YES.


Widespread poisoning attacks making it harder to compile good training datasets.

@mariopasquato Good one!


Hype dying down causing a decrease in resources (compute and researchers)

Public backlash and legislation against AGI risks.

Public backlash and legislation against non-AGI related ethical issues.

@Shump Added

Strict regulation

and

Declining job opportunities in the field

Does this condition on there being an AI progress slowdown?

@Nikola Yes it does.

Something pandemic-adjacent disrupts supply chains. Maybe just group all global catastrophes into one answer

@Nikola Added

Major supply chain disruptions

Feel free to suggest other possibilities, if any.

@firstuserhere Political opposition to AI development. A legal system that cannot change quickly or effectively enough to facilitate development could prevent rollout via privacy, copyright, or liability laws. Or, voting pressure from economically or religiously fearful electorates could push democratically elected politicians to vote against research funding, add burdensome regulatory requirements, or break up a company said to have a "monopoly" to a degree that makes real AI progress no longer financially feasible for it.

(Would “WWIII” fall under “major supply chain disruptions”?)
