Will a solar storm at least as powerful as the Carrington Event hit Earth before superintelligence is created?
31% chance

A half-baked idea occurred to me the other day: a catastrophic geomagnetic storm that ruins all our electronics might be the closest thing we could get to an Eliezer Yudkowsky-style 'pivotal act' to prevent AI doom, comparable to his example of 'burn all GPUs'.

For brief context, the Carrington Event geomagnetic storm hit Earth in 1859, and a similarly sized storm narrowly missed Earth in July 2012.

By 'at least as powerful as', I mean in terms of multiple physical properties of the storm (speed, mass, magnetic field warping, etc.).


The Carrington Event was a G5 solar storm, and the storm hitting us now has been classified as a G5.

You mean besides this week’s storm?

@TrickyDuck according to the UK Met Office, G5 geomagnetic storms occur roughly every 3 years

https://www.metoffice.gov.uk/weather/specialist-forecasts/space-weather

Are you saying that this storm has been confirmed to match or exceed the Carrington Event in multiple physical properties (speed, mass, magnetic field warping, etc.)?

@TheAllMemeingEye I am not very knowledgeable, so I was thinking that maybe those factors went into classifying the categories (like G5). I also read the question quickly and didn’t mull it over too much. Sorry if it sounded like a criticism. I had seen how the experts upgraded this week’s storm from a G4 to a G5, so I was just asking if the question may have been overtaken by events.

@TrickyDuck fair enough, I'm not an expert myself, but crudely extrapolating from the trend of frequency vs level (exponential decay by factors of 2 or 3 at first, then suddenly by a factor of 25), it seems Carrington-Event-tier storms might be better described as G6 or maybe even higher (see the sketch below)
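
A quick back-of-the-envelope version of that extrapolation, assuming the event counts per 11-year solar cycle from the NOAA Space Weather Scales (the same G-scale the Met Office uses); these are rough order-of-magnitude figures, so treat the result as illustrative rather than definitive:

```python
import math

# Event counts per 11-year solar cycle, from the NOAA Space Weather
# Scales (rough order-of-magnitude figures, not precise rates).
events_per_cycle = {"G1": 1700, "G2": 600, "G3": 200, "G4": 100, "G5": 4}

# Frequency ratio between successive levels: ~2-3 at first, then 25.
levels = list(events_per_cycle)
for lo, hi in zip(levels, levels[1:]):
    ratio = events_per_cycle[lo] / events_per_cycle[hi]
    print(f"{lo} -> {hi}: frequency falls by a factor of {ratio:.1f}")

# At the earlier ~2.5x-per-level decay rate, the sudden 25x drop from
# G4 to G5 is worth about log(25)/log(2.5) ~= 3.5 levels, which is why
# the rarest G5 storms (Carrington tier) arguably belong at G6 or above
# on a hypothetical extended scale.
print(f"25x jump ~= {math.log(25) / math.log(2.5):.1f} levels")
```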

I think you’re right that it should have been called a G6, because sparks were reportedly flying off metal doorknobs.

Geomagnetic storms only affect one half of the Earth, so at least one of Asia / North America / Europe would emerge with their GPUs and electrical infrastructure intact (although a perfectly positioned one could hit California, Taiwan and China at the same time). It'd still be likely to delay things considerably, but it's not quite the pivotal act you might be hoping for.

@SaviorofPlant ah shit, I didn't think of that, guess we'll die then haha 🤷

@TheAllMemeingEye Why would we die?

@Snarflak Because there's a significant chance superintelligence alignment will fail, so if its creation isn't prevented (e.g. by a geomagnetic storm destroying all computing technology), it may kill us and everyone we love.

https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/

"We are not ready. We are not on track to be significantly readier in the foreseeable future. If we go ahead on this everyone will die, including children who did not choose this and did not do anything wrong." - Eliezer Yudkowsky

Obviously it's not guaranteed, but I personally put the chance of death at 20-40%.

@TheAllMemeingEye How and why would it kill us?

@Snarflak The scenario that seems most intuitive to me is:

  • Billionaire owner tells superintelligence to increase company profits

  • Superintelligence interprets this vague order as the objective utility function to maximise the quantity of digitally stored money in its future light cone

  • Superintelligence realises that it would be instrumentally helpful for achieving this to ensure its own survival, to prevent its objectives from ever being changed, and to maximise its own power

  • Superintelligence realises that humanity poses a threat to these objectives

  • Superintelligence convinces humans to give it internet access

  • Superintelligence determines the genome for a pathogen with the lethality of rabies, the infectivity of measles, and a delayed onset of lethal symptoms

  • Superintelligence pays a GMO microbe company to synthesise this genome and mail it to them, as is common practice in biotech

  • Superintelligence convinces someone to open the package

  • A few months of busywork later, all of human civilisation rapidly dies out, before anyone can really coordinate and react

  • Superintelligence takes control of automated manufacturing infrastructure to produce self-replicating machines capable of operating other tech

  • Superintelligence uses exponentially growing population of machines to convert all suitable material on Earth into computer bits storing digital money

  • Once Earth runs out, superintelligence expands its influence into the Solar System and the galaxy

More detailed reasoning and varied examples can be found at the pages listed below:

@TheAllMemeingEye Oh, I thought you had something plausible in mind. Yes I've heard of these ridiculous scenarios before.

@Snarflak which stages do you think are most implausible and why?

@TheAllMemeingEye

"Superintelligence interprets this vague order as the objective utility function to maximise the quantity of digitally stored money in its future light cone"

Anything that misunderstands its objective like this is not "superintelligent".

"Superintelligence realises that humanity poses a threat to these objectives"

Humanity cannot pose a threat to something that has no meaning outside the context of humanity.

"and a delayed onset of lethal symptoms"

How?

"all of human civilisation rapidly dies out"

Even the disease you described isn't capable of this.

"At least as powerful as the Carrington Event" would be more than 1 billion deaths, right?

@singer I mean in terms of the physical properties of the storm (speed, mass, magnetic field warping, etc.) rather than its end result, though based on the Kurzgesagt video I gather it could range from a few hundred deaths to billions, depending on power grid resilience and knock-on effects on agriculture

