Will I call short-termists "shorty"?
Resolved NO (Oct 10)

Longtermists believe we need to stop AI from killing us in the next 20 years.
Short-termists believe in eradicating malaria.

As someone who hates long terms, I propose "shorty" as a short term that means short-termist (or neartermist). I like it because it does not falsely imply that short-termists don't care about the long run future. While diminutive, I think it's actually a positive term - after all, there are several traditional ballads that wax poetic about how virtuous shorties are.


See also my previous market:


🏅 Top traders

#  Total profit
1  Ṁ58
2  Ṁ21
3  Ṁ20
4  Ṁ12
5  Ṁ11

Alright I'm leaning more towards yes now because consensus seems to be leaning that way, but I wanted a larger sample size so here's a poll

predicted YES
predicted NO

Why not ST? Because they STmate it will take longer before the world ends than the LTs do.

ST is pronounced shaw tay obviously

ah, of course you suggest this, STone

predicted YES

@stone I think that's backwards. If you think the world will end in the short term then there is no long term and being a long-termist would be weird.

aaa this market resolves tomorrow and I still haven't decided :<

predicted YES

@Sinclair short-termist is difficult to pronounce; at least use neartermist

predicted YES

@AntonKostserau Or Shawtay. Only two syllables, and it's about time for it to come back.

How do you resolve if you decide to pronounce it "shawtay"? Is that also covered under this answer?

calling them "shawtay" counts as YES

As someone who finds the AI talk tiresome and does think malaria nets are pretty awesome, I'm all for it.

bought Ṁ20 of YES

We still have 20 more years to buy cool nets.

bought Ṁ40 of YES

Consider this comment a commitment to start calling non-longtermists "shorty" if you do.

This is too high. I generally don't actually use words that other people don't use.

I'd bid it down to like 5% but I have a policy of not trading in my own markets

as someone who is skeptical of a lot of longtermist claims, I dislike "short-termist" as it implicitly accepts the longtermist frame; sort of like describing yourself as anti-life or anti-choice.

predicted YES

@Adam Now-ist

predicted YES

@BTE The best thing to improve is the moment when an AGI kills everyone. The second best thing to improve is today.

predicted YES

@MartinRandall or the moment humanity nukes ourselves into oblivion (which is far more likely than an AI killing everyone*).

*I would count an AI weapons system mistakenly starting nuclear war as an example of both of those - so what I'm saying here is mathematically equivalent to "the risk we nuke ourselves into oblivion without an AI's involvement is far more likely than the risk an AI kills everyone without using nukes".

Longtermists believe we need to stop AI from wiping out humanity in the next 20 years because it would prevent [very large number] of future humans from being born.
Short-termists believe we need to stop AI from wiping out humanity in the next 20 years because it would kill billions of humans who are currently alive.

FTFY :P

how does this resolve?

@Lily resolves to YES if at market close I decide to unironically call short-termists "shorty".
I generally like to use words that are understandable, common, short, and which won't make people mad

Idk, but listening to the first few bars of "Replay" while imagining that Iyaz is talking about short-termists dealt me one die of mental damage, so thanks for that

Perhaps the thought that 1/5 of an average lifespan into the future is "the long term" is the cause of some of the problems that both factions are trying to fix.

bought Ṁ50 of YES

@IsaacKing That framing assumes the "AGI is likely to kill us all in the next 30 years or so" statement is correct or close to it, though. And since those people seem to dominate the faction that calls themselves "longtermist", my not accepting that premise and thinking we should focus on helping real people who exist at this point in time makes me a shorty, in this context. It's not really a timeframe thing per se.

predicted NO

@MattP > That framing assumes the "AGI is likely to kill us all in the next 30 years or so" statement is correct or close to it

No it doesn't? My statement has nothing to do with what the specific catastrophe is; it's just the observation that 20 years into the future should not be considered "the long term" by any reasonable person. The same would apply if the threat were a predicted ecological collapse, nuclear war, or any other bad thing that isn't happening right now this very moment.

True longtermism is stuff like "Every second we delay colonizing the universe, we forever lose access to 60,000 stars worth of negentropy."