
Will an AI system be judged to have killed a human on its own initiative and for no other purpose by 2030?
26% chance
An AI system here means any piece of technology whose software can make autonomous decisions. The software must have made an independent decision to take action to kill a human being, and the death must have been the only significant outcome of the event (i.e. not a byproduct of another outcome, such as saving more people). The AI system should also have had a multitude of options available to it, not just 'trigger/don't trigger' based on certain parameters.
This question is managed and resolved by Manifold.
Related questions
Will AI wipe out humanity before the year 2040? — 6% chance
Will an AI system similar to Auto-GPT make a successful attempt to kill a human by 2030? — 24% chance
Will someone commit violence in the name of AI safety by 2030? — 65% chance
Will humans wipe out AI by 2030? — 6% chance
Will humanity wipe out AI before the year 2030? — 11% chance
Will humanity wipe out AI before the year 2030? — 7% chance
Will AI wipe humanity by 2030? — 14% chance
Will AI wipe out "humanity" before 2030? — 82% chance
Will AI kill >20% of the human population before 2030? — 5% chance
Will AI out-wipe humanity by 2030? — 12% chance