Is it delusional to put AI doom at 99% likelihood or above?
Never closes
Yes
No


It’s irrational

The two nearest existential threats facing humanity right now are global warming and AI. Global warming is very unlikely to actually wipe us out completely even in the worst case scenario, with the worst case scenario becoming increasingly unlikely with time. It's simply a problem that has given us plenty of time to notice it and then act upon it. An AI apocalypse would give us none of that. The moment a self-improving AGI is out there, if it's not properly aligned with our goals it's immediately over. The only way to act upon this problem is to act before it's even tangible.

That said, being a doomer about a problem that is so far away and out of your individual control is pointless.

Even unaligned AI can’t just kill off all humans. The worst it could do is limited to biological warfare, which humans can wage on each other as well

I think it's wrong, but "delusional" may be too strong a word. I can see some assumptions and reasoning that could lead to this estimate.

The most important thing Manifold has taught me is to be less confident in my predictions. And that is over less than a year!

Meditating on that.

@MatthewRitter Yes, the stock market teaches that too, though.

what's the timeline? And what's your understanding of doom?

@sarius Any timeline, doom = end of positive trajectory of human welfare

@OnurcanYasar So “no” if the timeline is the remaining lifespan of the universe.

Murphy’s Law, anyone?

@Ophiuchus However, it needs to be doom because of AI, not because robots that were our friends got into a war with an alien civilization or something like that. If we assume we will die with the universe no matter what we do, we are doomed anyway, but that's something else, obviously.

even more than that - it is deluuuulu