Is there a "pivotal weak act"?
15% chance

A pivotal act is one that prevents other actors from building an unaligned AGI that kills all humans.

A weak act is one that can be performed by an actor that is not capable of killing all humans, and does not require creation of such an actor.

Eliezer Yudkowsky provides two examples:

  • Releasing nanomachines that hunt down and burn all GPUs is a "pivotal act" but not a "weak act". With small modifications, such nanomachines could kill all humans.

  • Improving public epistemology by using GPT-4 to provide scientifically literate arguments is a "weak act" but not a "pivotal act".

A "pivotal weak act" is a "pivotal act" that is also a "weak act".

This market resolves YES if I'm convinced that a pivotal weak act exists or existed at any point between 2023-01-01 and the creation of the first AGI.

This market resolves NO if I'm convinced that no such act existed at any point between 2023-01-01 and the creation of the first AGI.

This market can resolve to a probability or to N/A if I'm convinced this is the most accurate resolution.

Background: https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities

Comments

Couldn't we simply yeet some people out of our lightcone and then nuke Earth?

Does the act need to be theoretically possible given a substantial number of humans cooperating, or practically possible on Earth?

Example: there might be pivotal weak acts that require cooperation among all sufficiently advanced chip foundries, but that seems unlikely to be practically possible on Earth.

@josh The act needs to prevent other actors, so I think acts that require unanimity among a large class of actors, e.g. all chip foundries, would not count. But an act by the leading three chip foundries would count. There's a grey area between those examples.
