
Artificial superintelligence (ASI) here means any artificial intelligence able to carry out any cognitive task better than 100% of the unenhanced biological human population.
Safe-ish ASI here means ASI that does not wipe out humanity.
Able here could mean actual agentic use, a mere test demonstration, or a theoretical scientific proof of the capability that is accepted as fact by the majority of the human scientific community.
Hacking a mind here means sensory inputs that, through mere exposure, would involuntarily (regardless of previous desires/beliefs/precommitments) enable total, predictable control of the mind (arbitrarily large changes of judgment/personality, lethal effects such as seizures, etc.). Examples in fiction include David Langford's BLITs/basilisks and the SCP Foundation's memetic kill agents.
It does not include persuasion with high-quality logical arguments, ordinary voluntary hypnosis requiring substantial cooperation and effort from the subject, generated fake content / illusions acting as false evidence for something, direct physical alteration of the mind itself, or coercion via threats/torture.
Human here means an unenhanced biological human, so it does not include humans with advanced brain-computer interfaces, uploaded human brain emulations, posthuman species, or enhanced non-human animals.
If, however unlikely, the ability to hack human minds is somehow discovered before ASI, this should probably resolve yes.
@TheAllMemeingEye I think it's a different problem, but way easier. I meant the question in the sense of: "is this the scenario you meant?" Or should "hacking a mind" be independent of the specific person, like a hack.mp4 that works most of the time?
If you have real-time feedback, then the AI could produce effects that are currently not possible, like visualizing what the person is thinking, or creating strange effects depending on where the person is looking and how they feel. I think this would not fall under the excluded "generated fake content / illusions acting as false evidence" or "ordinary hypnosis" categories.
Another fictional example would be the Sumerian "me" from Stephenson's Snow Crash.