The two nearest existential threats facing humanity right now are global warming and AI. Global warming is very unlikely to wipe us out completely even in the worst-case scenario, and that worst case becomes less likely with time. It's a problem that gives us plenty of time to notice it and act on it. An AI apocalypse would give us none of that: the moment a self-improving AGI is out there, if it's not properly aligned with our goals, it's immediately over. The only way to act on this problem is to act before it's even tangible.
That said, being a doomer about a problem that is so far away and so far out of your individual control is pointless.
@OnurcanYasar So “no” if the timeline is the remaining lifespan of the universe.
Murphy’s Law, anyone?
@Ophiuchus However, it needs to be doom because of AI, not because robots that were our friends got into a war with an alien civilization or something like that. If we assume we'll die with the universe no matter what we do, then we're doomed anyway, but that's obviously something else.