Neurotechnology is most relevant for AI safety with respect to...
Never closes
BCIs extracting human knowledge
BCIs creating a reward signal
Neurotech enhancing humans
Understanding human value formation
Cyborgism
Whole brain emulation
Hat tip to Sumner Norman and Lisa Thiergart. More info on the neurotech definitions and their potential AI safety use cases: https://www.lesswrong.com/posts/KQSpRoQBz7f6FcXt3/distillation-of-neurotech-and-alignment-workshop-january-1
Disclaimers:
This question is part of Foresight’s 2023 Vision Weekends to help spark discussion amongst participants, so the phrasing and resolution criteria may be vaguer than I would normally like for this site. Apologies for that. We thought it would still be useful to make the market public to potentially inform other discussions.
If you would like to add alternative answers, please do so in the comments!
This question is managed and resolved by Manifold.
Related questions
Will neurotechnology enable AI to predict and classify human decisions, along with their influencing factors, by 2030?
59% chance
Will something AI-related be an actual infohazard?
69% chance
Will prioritizing corrigible AI produce safe results?
45% chance
AI safety activist causes AI catastrophe to slow development before 2040?
15% chance
Is the nature of AI risk completely misunderstood today with respect to the state of the art in 2030?
30% chance
I am an AI safety researcher with a background in machine learning engineering and neuroscience. Will I personally be able to program and train an AGI for less than $10k by 2030?
20% chance
What will be the main constraint to AI development in 2028?
Will @EliezerYudkowsky reverse his opinion on AI safety, before 2030?
10% chance
Is AI Safety a grift?
22% chance