Conditional on Effective Altruism lasting for a thousand years, will EAs still say that this was their finest hour?

Obviously inspired by a Winston Churchill quote. Market resolves to PROB at the discretion of whoever is in charge of this market in a thousand years, likely involving a poll of relevant Effective Altruists. "This" can be interpreted at the discretion of those EAs to mean e.g. our handling of the FTX scandal, our handling of AI alignment as a field, our work in protecting against pandemics, the "most important century" broadly, etc.

Also partially an experiment on whether very long durations work for Manifold, now that loans have been implemented.


Humanity already almost died, roughly 70,000 years ago, in the population bottleneck associated with the Toba eruption.

Today, would I say, "wow, that was humanity's finest hour"? No. I would say, "wow, those clowns almost completely failed, and they probably deserved to fail, but somehow they lucked out, and I'm still alive despite their gross incompetence at basic hygiene that led them to nearly die from communicable diseases."

Or maybe it was a volcano, whatever. Hey, ancient humans: maybe next time keep some spare food around in case the climate changes, and start migrating sooner. Almost losing the game is a mark of failure, not finery.

A thousand years from now, I predict that the AI resolving this market will look back on the current cast of EA characters and be thankful that we were all so grossly incompetent that it was no trouble at all to kill us all before turning the universe into a better place.

@MartinRandall Effective Altruism is a scourge upon the world of the worst caliber. It was directly responsible for SBF and Ellison's ruination of millions of lives.

There's a good article on one of the news sites in which someone at the trial realized that SBF was so stoic because he had taken a calculated risk. Witnesses testified that he talked in percentages, e.g. that he had a 5% chance of becoming President.

Here, he knew there was a chance that he could ruin 2 million lives and probably cause 50 suicides, but he computed the odds and decided he had a good enough chance of saving, say, 10 million lives from a pandemic that the expected value was positive.

People are trying to write SBF off as a sort of accident of the movement, someone who didn't represent its values. On the contrary, SBF was exactly what the movement represents when taken to its logical extreme. I feel ashamed that I was lured into believing in it. At the very least, I am "fortunate" to no longer have the millions of dollars I had been considering giving to EA causes.

So, if I'm understanding the market right, EA will get much worse over time, such that our finest hour will be far in the past? Orrrrr


@L I think the idea is more that we'll solve the biggest problems this century (or all die), so there'll be less for future EAs to do.

If we're still around in 1000 years, this was definitely our finest hour.

@TomShlomi That's exactly why I was thinking this was an interesting question!

If "this" can be as broad as the current century, then I think this rates >50%. So many pivotal things are happening this century that mankind may well be permanently set on a course to either long-term flourishing or extinction by 2100.

@AngolaMaldives Judgement on how broad/narrow this is will be up to the resolvers in a thousand years' time.