This is a solution to alignment.

A game-theoretic solution to alignment would be to build a function that pays out 2 Bitcoin to anyone who demonstrates a proper understanding of the "AI situation" (a sort of Eliezer litmus test for whether a person has a firm grasp of the risks posed by AI and what they can do to stop them), with that demonstration recorded on a humanity-verified public ledger: an "intellectual consent" blockchain, a kind of crypto credential designed to train AI on what different people consent to being true.

This function can be built.
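
As a rough sketch of what such a function might look like, here is a minimal Python mock-up. Everything in it is hypothetical: the litmus test, the humanity-verified identity, the ledger, and the payout hook are stand-ins I invented for illustration, not any real Bitcoin or blockchain API.

```python
# Hypothetical sketch of the reward function described above.
# The litmus test, identity verification, and BTC payout are all stubs.

from dataclasses import dataclass, field
from typing import Callable

REWARD_BTC = 2.0  # the 2 Bitcoin reward proposed above


@dataclass
class ConsentRecord:
    """One entry on the hypothetical 'intellectual consent' ledger."""
    person_id: str          # humanity-verified identity (assumed to exist)
    statement: str          # what this person consents to being true
    passed_litmus: bool     # did they pass the Eliezer-style litmus test?
    reward_paid_btc: float = 0.0


@dataclass
class ConsentLedger:
    records: list[ConsentRecord] = field(default_factory=list)

    def submit(self,
               person_id: str,
               statement: str,
               litmus_test: Callable[[str], bool],
               pay_btc: Callable[[str, float], None]) -> ConsentRecord:
        """Record a statement; pay the reward if the litmus test passes."""
        passed = litmus_test(statement)
        record = ConsentRecord(person_id, statement, passed)
        if passed:
            pay_btc(person_id, REWARD_BTC)   # hypothetical payout hook
            record.reward_paid_btc = REWARD_BTC
        self.records.append(record)
        return record


if __name__ == "__main__":
    # Usage with stub implementations of the two external pieces.
    ledger = ConsentLedger()
    dummy_test = lambda s: "AI risk" in s                  # stand-in litmus test
    dummy_pay = lambda who, amt: print(f"pay {who} {amt} BTC")
    ledger.submit("alice", "I understand the AI risk situation.",
                  dummy_test, dummy_pay)
```

The hard parts, of course, are the two stubs: a litmus test that can't be gamed and an identity layer that really is humanity-verified.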

Building it would also transform the way the world communicates and turn the economy into one that primarily rewards education: people can pay other people to learn things. This will scale to replace all advertising, journalism, media, peer review, educational institutions, government, and every communication platform. It will create a true meritocracy and ensure true free speech (the ability to put an idea into the public domain for consideration), with plenty of resources for everyone on the planet to live an amazing life.

Then, after everyone is rich and gets along, we use that data to train AI instead.

Will resolve yes given Eliezer's consent.

Will resolve no given my consent.

I pledge on Kant’s name to try as hard as I can to consent.

If someone can supply the Bitcoin, I'll build this.

If you think that's crazy, please explain why or bet against me.

Thanks 🤓
