ASI = Artificial Super Intelligence.
Most arguments suggesting that ASI would destroy humanity can be extended to any other entity potentially competing with the ASI for resources, including aliens. If such aliens exist, are they going to do something about it?
ASI is expected to be developed by 2040 at the latest.
In 2040 I will run a poll about what happened (based on the information available then) and resolve accordingly.
Update 2025-02-20 (PST) (AI summary of creator comment): Clarified Resolution Criteria:
Great Filter: Resolves to option 1, meaning that there are no technologically advanced aliens that can influence us.
Dark Forest: Depends on alien response:
Resolves to option 3 if aliens decide to sit by, viewing our ASI as a smaller threat compared to other aliens.
Resolves to option 4 if they choose to preemptively attack us.
Grabby Aliens: Resolves to option 1 if they are not advanced enough to stop us (with option 1 indicating the absence of technologically advanced aliens able to influence us).
Alien Ocean: Also resolves to option 1 unless intervention occurs without physical contact (for example, by sending a warning message).
Additional Note: If ASI is inevitable and becomes a paperclip maximizer, these scenarios might evolve; for instance, a civilization developing ASI may rapidly transform an Alien Ocean or Dark Forest scenario into one resembling Grabby Aliens, as long as resource constraints allow.
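For a compact view of the clarification above, here is a minimal sketch of the resolution logic as stated so far. It is not part of the official criteria: the scenario labels, parameters, and the resolution_option function are hypothetical names used only for illustration, and branches the creator has not specified return None.

```python
from typing import Optional

def resolution_option(scenario: str,
                      preemptive_attack: bool = False,
                      remote_intervention: bool = False) -> Optional[int]:
    """Option number implied by the clarified criteria; None where unspecified."""
    if scenario == "great_filter":
        return 1  # no technologically advanced aliens that can influence us
    if scenario == "dark_forest":
        # option 4 if they preemptively attack us, option 3 if they sit by
        return 4 if preemptive_attack else 3
    if scenario == "grabby_aliens":
        # option 1 when they are not advanced enough to stop us
        return 1
    if scenario == "alien_ocean":
        # option 1 unless they stop us without physical contact
        # (e.g. by sending a warning message); that outcome is not yet specified
        return None if remote_intervention else 1
    return None  # scenario not covered by the clarification
```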
I think any aliens capable of stopping us from developing ASI would have ASI/be an ASI themselves. So I think it's more likely they don't stop us for the same reasons that we choose not to step on an ant while taking a walk.
@spiderduckpig if their ASI is a paperclip maximizer, what would be the point of letting us live, though? We don't step on ants because neither we nor ants are paperclip maximizers.
@mariopasquato well, ASI doesn't necessarily have to be a paperclip maximizer. I don't think human civilization would fit most definitions of a paperclip maximizer (or maybe it does), but I think a sufficiently advanced alien civilization could develop an ASI with motivations as nuanced as humanity's. I mean, we already have LLMs that can explain ethical concepts better than the average person.
And I assume that there are no paperclip maximizers in our light cone just because of anthropic bias. If there were a paperclip maximizer in our light cone, we couldn't have developed in the first place.
How do the following scenarios resolve?
Great filter: all other aliens in our past light cone are either still stuck in the Precambrian or have already wiped themselves out via non-ASI means, e.g. nuclear war or a bioweapon pandemic.
Dark forest: all other aliens in our past light cone are hiding from each other to avoid getting wiped out by pre-emptive first-strike attacks with interstellar weapons of mass destruction.
Grabby aliens: all other aliens in our past light cone are at a relatively similar early stage of technological development to us and simply haven't reached us yet.
Alien ocean: all other aliens in our past light cone rarely try to expand beyond their star systems because the travel times are so long that they can't reliably keep control over any interstellar colonies.
@TheAllMemeingEye Great filter resolves to option 1. Dark forest depends on what they end up doing: either option 3 (if they sit by, thinking that our ASI is a smaller threat than other aliens attacking them) or option 4 (if they preemptively attack us). Grabby aliens resolves to option 1 if they are not advanced enough to stop us (option 1 should then be interpreted to mean "there are no technologically advanced aliens THAT CAN INFLUENCE US"). Similarly, with this change, Alien ocean would resolve to option 1, unless they manage to stop us without coming here, e.g. by sending a warning message. Note that if ASI is inevitable and it invariably becomes a paperclip maximizer, then some of these scenarios are unstable, as they are destined to evolve quickly into something else. For instance, if at least one civilization develops ASI, then it will turn Alien ocean and Dark forest into Grabby aliens by seeking out other civilizations and destroying them, as long as this is compatible with its resource constraints.
@asmith interestingly, if this is because of some sort of great filter, the filter cannot be an ASI explosion: otherwise their alien ASI would intervene to prevent us from creating a competitor. In fact, it should have already intervened.