In the event of a full singularity/intelligence explosion occurring, which of these things will be true after the event?
72%
Life will be rated by me as overall significantly worse than before the singularity (if I am no longer alive, then mods may resolve as they believe I would have; the same applies to all such questions).
71%
Most human beings that exist at the time of the singularity will be killed.
67%
Biological human beings will no longer exist.
59%
Human beings will no longer age.
59%
Nothing, or almost nothing, of human value will survive.
56%
Capitalism will still exist.
55%
There will be significant populations living in simulated worlds unaware of the nature of the simulated worlds.
50%
Religion will still exist.
45%
Biological human beings will make up a small minority of the sapient population.
45%
Simulations of fictional worlds from before the singularity will be run.
45%
The beings in control after the singularity will pass judgement on or punish people for actions taken before the singularity.
41%
We will run large numbers of ancestor simulations.
35%
There will be primitivist populations who avoid significant change and deliberately live a non-technological life.
31%
Most human beings that exist at the time of the singularity will be uploaded.
28%
Cryonically frozen individuals will be revived.
27%
Most human beings that exist at the time of the singularity will go on living as biological human beings.
26%
Preexisting nations will survive.
25%
Life will be rated by me as about the same quality as before the singularity.
24%
Life will be rated by me as overall significantly better than before the singularity.
24%
War will still occur.
By singularity/intelligence explosion I mean the creation of minds massively superior to those of present human beings, capable of transforming life. I will play it by ear.
This question is managed and resolved by Manifold.
Related questions
If ASI is created and doesn't wipe out humanity, will it torture any human-level-intelligences within a year?
16% chance
There will be only one superintelligence for a sufficiently long period that it will become a singleton
64% chance
Will there be an intelligence explosion before 2035?
59% chance
If an AI wipes out humanity on Earth, what will be true of this AI?
If the singularity occurs, will humans be apportioned resources according to their pre-singularity wealth?
14% chance
If we survive artificial general intelligence, will Isaac King's success market resolve to "none of the above"?
59% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2100?
87% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2030?
44% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2050?
80% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2075?
87% chance