Will Eliezer Yudkowsky get fully blaked before 2027?
2028 · 19% chance

The term "blaked" (named after Blake Lemoine) refers to a person gaining a sincere belief that an artificial intelligence system is a person in the moral or philosophical sense of the word. It is usually used only in contexts where the person is an expert in how the AI system works, in contrast to the ELIZA effect. See also Artificial consciousness.

Eliezer Yudkowsky is an AI existential safety researcher. He was partially blaked by Microsoft's Bing search engine on February 23rd, 2023.

This market will resolve to YES if Eliezer Yudkowsky claims before 2027 that a specific, deployed system has one of the following properties:

- The system is a person.

- The system has or deserves personhood.

- The system has or deserves human rights.

- The system deserves to be, or ought to be, treated as a human would be treated.

Other claims, such as consciousness or sentience, shall not count.

The system must be a piece of software implementing, a computer running, or a robot powered by artificial intelligence. Eliezer Yudkowsky must know this fact when making the claim.

(No offense intended towards Blake Lemoine, Eliezer Yudkowsky, Microsoft Bing, or ELIZA. This is all in good fun.)


"The term "blaked" (named after Blake Lemoine) is when a person gains a sincere belief that an artificial intelligence system is a person in the moral or philosophical sense of the word" and the belief happens to be obviously wrong.

You missed the important part.

If the belief is true, you're not blaked, you're right.

There is also the case where a significant proportion of experts come to think that the AI is more probably "a person" than not. Are they all blaked? That might happen soon.

For the purpose of a market, words are defined as the creator defined them, especially for neologisms of debated meaning.

Agreed that the market resolution criteria are clear enough.

I have to laugh at whoever defines these terms - "the belief happens to be obviously wrong."

Most of these AI researchers will claim in the same sentence that the current theories of consciousness are "pseudoscience" and also that LLMs aren't conscious.

@PierreLamotte if Blake is correct, then "blaked" has a similar connotation to the word "enlightened".

His name is "Eliezer", not "Elizier". You got it wrong four times.

@Metastable okay I think it is fixed!


@Metastable Nah it's clearly ELIZA

@DavidSpies

"who may or may not be people in their innards"

Too uncertain!

Does saying that we ought not to own things that fluently say they're conscious count as meeting property 4?

https://twitter.com/ESYudkowsky/status/1632162684727865344

(These still exist even if Bing no longer does this, plenty on e.g. character.ai)

@adele oh yeah, that is a bit ambiguous, isn't it? I don't think I'd count it, though; Yudkowsky needs to say something like "Bing should be treated like a human" or "We have a moral obligation to treat Bing like a person". Saying it deserves X, where X just so happens to be a right humans have, wouldn't count.

I'll try to ask him on Twitter though, since that tweet is almost saying that.

A similar market on Lex Fridman would probably reach 80% real fast
