Resolution criteria clarifications:
If the moon is already being mined and the ASI takes over those operations, I'd count that, though I suspect that scenario is relatively low probability given how rapidly AI development is accelerating right now compared to the near-stagnant pace of space exploration.
I'm defining ASI as an AI system that can outperform 100% of the human population at any cognitive task. Since software can be easily copied and run in parallel, or have its clock speed ramped up, it's also not hard to see how the ASI could outperform the cumulative efforts of all of humanity combined.
For lunar mining, I'm defining it as industrial extraction of materials from the moon for the purpose of manufacturing. This includes use as fuel (e.g. helium-3 for fusion reactors, hydrogen for rockets) or for supporting life (e.g. water ice). The main thing is that the material is extracted, processed, and used for some kind of practical technology, rather than just left as is and observed, e.g. for science or artistic collection (an astronaut grabbing a single moon rock for science doesn't count).
By industrial extraction, in this case I mean building systems designed to efficiently extract large amounts of material over a long period of time (e.g. smelting a single gram of ore once as a test wouldn't count).
Arguments raised for:
The moon might be considered a much safer testing ground for ASI capabilities than Earth.
We may not know we have created ASI before running it, and we are currently using few guardrails.
Any guardrails put in place may be inadequate due to a lack of people taking the possibility of ASI seriously, and possible security flaws in our current systems.
Once the Earth runs out of certain resources, the moon is the nearest place to get more.
At the upper bound, aligned ASI may be able to expand its influence near light speed.
Arguments raised against:
ASI may be on very heavy guardrails for years after creation.
We still haven't started mining Antarctica a century after the first human settlements.
An aligned ASI may not expand to cosmic influence before we have actually given it the go-ahead, and the section of humanity that controls the ASI might be extremely hesitant and/or unambitious.
@RemNi fair enough. It has also occurred to me that we still haven't started mining Antarctica a century after the first human settlements, but on the other hand the moon might be considered a much safer testing ground for the capabilities of the ASI than Earth.
It's likely that whenever ASI emerges it will be on very heavy guardrails for the following 12 months at least.
I think we will not really know which program will become an ASI before running it, and we are mostly not using any guardrails currently.
Also, I expect the guardrails to be completely inadequate given that most people don't take it seriously, and our systems are full of security flaws.
@TheAllMemeingEye I believe the value in mining the moon is not unique resources but a unique orbital position (out of Earth's gravity well specifically)
@firstuserhere Haha yeah I didn't think anyone would break through @dionisos 's yes bet defences but you bet so much the other way that it might actually fall
@NivlacM That is definitely the upper bound, but it's also highly plausible that an aligned ASI wouldn't expand to cosmic influence before we had actually given it the go-ahead, and the section of humanity that controls the ASI might be extremely hesitant and/or unambitious
@robm To clarify, I mean the CEOs and boards of powerful AI companies, along with the governments of powerful nations, NOT conspiracy theory entities
@TheAllMemeingEye Yes ^^ I think that we will have more technology before the ASI, and an ASI would quickly take power and spread.
@Nat I'd count that, though I suspect that scenario is relatively low probability given how rapidly accelerating AI development is right now compared to super stagnant space exploration
@TheAllMemeingEye I'm not sure it's that low of a probability, but I guess it does depend on how you define ASI and lunar mining - so how are you defining them?
@Nat I'm defining ASI as an AI system that can outperform 100% of the human population at any cognitive task. Since software can be easily copied and run in parallel, or have its clock speed ramped up, it's also not hard to see how the ASI could outperform the cumulative efforts of all of humanity combined.
For lunar mining, I'm defining it as industrial extraction of minerals from the moon for the purpose of manufacturing, so an astronaut grabbing a single moon rock for science doesn't count.
@TheAllMemeingEye Does the ASI have to be relatively agentic? Or (while I doubt it'd be the case) if it turns out that we can just keep scaling LLMs until we have a chatbot that, when asked, can come up with the GUT of physics, play chess better than Magnus Carlsen, program a better version of itself, etc., would that count?
Also: for lunar mining, you said the mined material needs to be used in manufacturing, but what about using it for fuel or for supporting life? Some of the most likely things to be mined are water ice (if we can find good sources of it) to support life and operations on the moon; helium-3 for powering fusion reactors on Earth; and things like hydrogen for fuelling rockets. Would these count?
Also, could you clarify what you mean by 'industrial extraction'?
(Sorry for all the questions!)
@Nat To me, agency just means the ability to be given an objective and attempt to carry it out, and it seems to me LLMs already have this, right?
Those all also seem like valid examples of mining. Arguably, since processing is required before use, one could still count these as manufacturing in a way. I guess the main thing is that the material is extracted, processed, and used for some kind of practical technology, rather than just left as is and observed, e.g. for science or artistic collection.
By industrial, in this case I mean building systems designed to efficiently extract large amounts of material over a long period of time, so smelting a single gram of ore once as a test wouldn't intuitively seem to count to me.