If @Aella and I go on a "proper date", we will marry, have offspring, and control the world with intellectual force.
49% chance · closes 2030

If @Aella goes on a "proper date" with me (@Krantz), then she will marry me, have offspring with me, and pivot her career to helping me demonstrably take control of the world using intellectual force.

For this prediction, "proper date" means "spend at least 24 hours charitably questioning each other's philosophy inside a SCIF (sensitive compartmented information facility)".

For this proposition to resolve "yes", the following propositions must simultaneously be true (according to Aella) at some point within 5 years of the proper date occurring.

1. Aella and I are married.

2. Aella and I have produced biological offspring.

3. Aella designates her formal employer as "krantz".

If a proper date between Aella and me occurs and these propositions fail to be simultaneously true (according to her) within 5 years after said date, this prediction will resolve "no".

Aella will be the judge of whether a "proper date" has occurred according to the criteria defined above.

If the date occurs and Aella requests this prediction to resolve "no", it will resolve "no" immediately.

If a proper date does not occur, this prediction will resolve "NA" in 2030.

If Aella requires an additional deposit of value to compensate for the time and effort required to attend the date, she should reach out privately so I can provide payment details.


Btw @Krantz what is your unironic credence for this? It seems somewhat likely you might be betting non-epistemically so far

@TheAllMemeingEye If my epistemology seems unsound to you, maybe you should map it in the form of a proper argument in the demonstration mechanism I'm trying to pay you to use.

@Krantz personally, my current credences are:

  • P(Aella goes on a "proper date" with Krantz, i.e. "makes any half-way decent attempt to actually discuss the in-depth issue in private") = ~10%

  • P(Aella will broadly agree with Krantz's AI alignment strategy, given the discussion) = ~10%

  • P(Aella will marry Krantz, given the discussion) = ~0.1%

  • P(Aella will have offspring with Krantz, given the discussion) = ~0.1%

  • P(Aella will pivot her career to helping Krantz take control of the world, given the discussion) = ~1%

  • P(all of the above) = ~0.01%

What would you say is the correct value for each?
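For what it's worth, these credences can be sanity-checked against each other: if the conditional outcomes were mutually independent given the date, the joint probability would simply be the product of the factors above, which lands several orders of magnitude below the stated ~0.01%. A minimal sketch (the numbers are the commenter's stated credences; the independence assumption is mine, for illustration only):

```python
# Credences taken from the comment above.
p_date = 0.10        # P(proper date occurs)
p_agree = 0.10       # P(broadly agrees with strategy | date)
p_marry = 0.001      # P(marriage | date)
p_offspring = 0.001  # P(offspring | date)
p_pivot = 0.01       # P(career pivot | date)

# Naive joint probability, assuming the conditional outcomes are
# mutually independent given the date (a strong assumption):
p_all_independent = p_date * p_agree * p_marry * p_offspring * p_pivot
print(f"{p_all_independent:.2e}")  # ~1e-10, i.e. ~1e-8 %
```

The stated joint credence of ~0.01% (1e-4) sits about six orders of magnitude above this independent product, so the commenter is implicitly treating the outcomes as strongly correlated (e.g., marriage and offspring almost certainly co-occurring), which seems reasonable.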

opened a Ṁ500 NO order at 70%

@TheAllMemeingEye You forgot P(Successfully controlling the world with intellectual force)

@IsaacLinn arguably the title, start of description, and middle of description give 3 very different criteria.

Title

and control the world with intellectual force.

Start of description

and pivot her career to helping me demonstrably take control of the world using intellectual force.

Middle of description

Aella designates her formal employer as "krantz".

Your proposal applies to the first but not the other 2.

@TheAllMemeingEye These are great inquiries. I'm glad that you are trying to engage analytically to pin down my particular confidences on these particular propositions. That's something I wish more people would do.

I have to ask though, since the entire purpose of this prediction (along with nearly all of my predictions) is to teach this community how to use a particular mechanism to decentrally survey this specific form of information, why do you not simply use that mechanism to query my confidence as opposed to writing it out in the comments?

If you went to the krantz demonstration mechanism and added the following propositions:

1. Aella will go on a proper date with Krantz.

2. Aella will broadly agree with Krantz's alignment strategy.

3. Aella will marry Krantz.

4. Aella and Krantz will produce offspring.

5. Aella will use the Krantz mechanism to produce the majority of her income.

6. If 1, then 2.

7. If 1, then 3.

8. If 1, then 4.

9. If 1, then 5.

10. If 1, then (2, 3, 4 and 5).
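One subtlety with propositions 6–10: in credence terms, "If 1, then 2" is ambiguous between the material conditional P(not-1 or 2) and the conditional credence P(2 | 1), and the two come apart sharply when P(1) is small. A minimal sketch using the hypothetical credences from the earlier comment (the "2 never holds without 1" lower bound is my simplifying assumption):

```python
# Hypothetical credences: P(1) is low, P(2 | 1) is low.
p1 = 0.10          # P(proposition 1: the date occurs)
p2_given_1 = 0.10  # conditional credence P(2 | 1)
p1_and_2 = p1 * p2_given_1

# Material conditional P(not-1 or 2): true whenever 1 fails.
# Lower bound, assuming 2 never holds without 1:
p_material = (1 - p1) + p1_and_2
print(p_material)  # ~0.91, even though P(2 | 1) is only 0.10
```

So anyone assigning confidence to "If 1, then 2" as listed should be clear which reading is intended; the conditional-credence reading P(2 | 1) is presumably the useful one here.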

If you did this, you would demonstrate that you understand the form of communication I am trying to teach the world how to use. You would also be able to view and operate on everyone's confidence for each proposition.

In general, it feels like a waste of time responding to everyone's cynical remarks in the comments section. If someone genuinely wanted to intellectually force me to update my priors, the obvious way to charitably do so would be through writing a valid argument within the krantz mechanism that compels me to confront any possible contradictions publicly.

Nobody seems capable of doing that.

@Krantz Here's a bounty for anyone capable of intellectually forcing me to update my priors (about any topic you'd like) using the mechanism I keep advocating for.

https://manifold.markets/Krantz/will-anyone-write-an-argument-that

@Krantz

since the entire purpose of this prediction (along with nearly all of my predictions) is to teach this community how to use a particular mechanism to decentrally survey this specific form of information, why do you not simply use that mechanism to query my confidence as opposed to writing it out in the comments?

[...]

If someone genuinely wanted to intellectually force me to update my priors, the obvious way to charitably do so would be through writing a valid argument within the krantz mechanism that compels me to confront any possible contradictions publicly.

If you want more people to understand and try to use your mechanism, then my recommendation is the same as I commented in your other market:

Could you write a 1-3 paragraph layman language explanation of the Krantz mechanism?

i.e. don't use any phrases you wouldn't expect an average English-speaking person to already understand, certainly don't use any phrases you've invented or redefined, and do a Toki-Pona-ing / Yudkowskian-tabooing ( https://www.lesswrong.com/posts/WBdvyyHLdxZSAMmoz/taboo-your-words ) if need be. Existing explanations you've shared seem to be either extremely long or include a bunch of opaque invented/redefined language, e.g. krantz-x, constitution, ledger, collective intelligence etc.

For this prediction, "proper date" means "spend at least 24 hours charitably questioning each other's philosophy inside a SCIF (sensitive compartmented information facility)".

A couple questions:

  • Why call this a "proper date"? This seems to be highly at odds with the intuitive definition

  • Is this 24h continuous, with no breaks, not even for eating or sleeping? If there are breaks, what's the maximum length and frequency?

  • What even is a sensitive compartmented information facility? Does it mean a university building with security guards?

@LiamZ thanks for the explanation 👍

It raises the follow-up question of whether krantz would count informal improvised attempts at making one (e.g. a room with civilian soundproofing and checked for bugs) or would he only count one officially designated as such by a government or military?

@TheAllMemeingEye I am going to be more than happy to resolve this yes if any halfway-decent attempt is made by @Aella to actually discuss an in-depth issue in private.

@Krantz Or NO, if you don't satisfy the other three conditions. Right?

@LiamZ I'm glad we now know that no one has ever gone on a serious date ever

Free loan market though

@Gurkenglas on his Eliezer market he actually got the guy to engage but he still refuses to resolve it. It would likely be similar here: if she rejects him, it's not going to mean "NO" but rather that it "wasn't a proper date."

@LiamZ good point

@Krantz can you confirm that you are indeed willing to resolve no in such cases and won't shift the goalposts?

@TheAllMemeingEye ignoring this is not surprising, but pretty funny.

bought Ṁ20 NO

42%‽ That's an insanely high probabili- oh right

I love this community lol

@IsaacLinn Manifold, where silly people go to act smart and smart people go to act silly.

Damn, so no one in history has ever gone on a proper date before?

This seems like a clear NO, but I'm highly wary of betting in the creator's markets.

https://manifold.markets/Krantz/if-eliezer-charitably-reviewed-my-w#zctbw1fwis

See also Quroe's comment below.

@AriZerner You sound super confident. You should bet against me in the alien market.

https://manifold.markets/Joshua/will-eliezer-yudkowsky-win-his-1500

@Krantz sure, I put in Ṁ1,000. Don't want to sink more than that into a long-term bet at high odds

@LiamZ thanks, this isn't meant to be a generic "I think Krantz is wrong" though. I'm specifically wary of markets that Krantz has the power to resolve. I responded with a bet on the alien market as a matter of honor, the way one might meet at high noon after offering insult
