🐕 Will A.I. Be Able to, "Feel and React to Pain," Significantly Better By the End of 2024?
closes 2025


Please read the preface for this type of market and other similar third-party validated AI markets here.

Third-Party Validated, Predictive Markets: AI Theme

Market Description

The temporal lobe (6) is an area of the brain which has been identified as being associated with pain, hunger, and the fight-or-flight response. As far as I'm aware at the time of writing, there hasn't been much recent, concentrated research into AI that can actively, "feel pain," and either signal a pain response on the simple end, or react and attempt to self-repair on the more advanced end.

Obviously this is a capability which is likely already built into a lot of machines, e.g., your cell phone operating in low power mode when the battery goes low. But given the heightened interest in A.I., one wonders if some researchers are going to delve into this topic more closely.
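As a toy illustration of what the simple end of this capability might look like, here is a minimal, hypothetical sketch of a threshold-based "pain" signal, in the spirit of a phone dropping into low-power mode. All names and numbers are made up for illustration and don't come from any real AI system.

```python
# Minimal sketch of a threshold-based "pain" signal, analogous to a phone
# entering low-power mode when the battery runs down. Purely illustrative.

from dataclasses import dataclass

@dataclass
class PainSignal:
    threshold: float      # accumulated damage level that triggers a response
    level: float = 0.0    # current accumulated "damage"

    def sense(self, damage: float) -> str:
        """Accumulate damage and return the system's reaction."""
        self.level += damage
        if self.level >= self.threshold:
            # Simple end: signal pain. Advanced end: attempt self-repair,
            # modeled here as crudely reducing the accumulated damage.
            self.level = max(0.0, self.level - 0.5)
            return "pain: throttling and repairing"
        return "nominal"
```

Usage: with `PainSignal(threshold=1.0)`, sensing damage of 0.4 returns `"nominal"`, while a further 0.7 pushes the level over the threshold and triggers the repair branch.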

Market Resolution Threshold

  • This is a trickier one to start off with: the author might simply be ignorant on the topic, so it is difficult for me to draw a hard line at the outset.

  • However, I think it's better to find a third-party source with a discrete score that we can more or less agree upon, and use that as the market resolution criterion. For the first six months to a year of this market, we should work collaboratively to find a measurement criterion which is reliable but keeps the outcome genuinely uncertain.

  • After that point, a numerical threshold will be chosen and that will be used to settle the resolution.

  • Typically I use about 1.3*X, where X is the original score at the time of market creation, as the definition of, "significantly better," but the factor of 1.3 might need to be higher or lower, depending upon the scoring system used and on landing closer to a 50/50 rather than a sure thing.

  • Should no numerical threshold be found, this bet will resolve N/A, potentially sooner than the market resolution date so as not to tie up people's capital further.
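For concreteness, the 1.3*X rule described above works out as follows. The baseline score of 40.0 is a hypothetical example, not a real benchmark value.

```python
# Compute the "significantly better" resolution threshold described above:
# factor * X, where X is the third-party benchmark score at market creation.
# The baseline of 40.0 is a made-up example, not a real score.

def resolution_threshold(baseline: float, factor: float = 1.3) -> float:
    """Score the AI system must reach for the market to resolve YES."""
    return baseline * factor

print(round(resolution_threshold(40.0), 2))  # 52.0
```

If the chosen scoring system made 1.3 too easy or too hard a bar, the `factor` argument is where that adjustment would land.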


Related questions

  • In 2028, will AI be at least as big a political issue as abortion? (Scott Alexander, 34% chance)

  • Will AI be a major topic during the 2024 presidential debates in the United States? (Matthew Barnett, 29% chance)

  • Before 2028, will any prediction market come up with a robust way to run a market on AI extinction risk? (Isaac, 33% chance)

  • Will AI pass the Longbets version of the Turing test by the end of 2029? (Daniel Reeves, 53% chance)

  • Will Biden sign an executive order primarily focused on AI in 2023? (S G, 55% chance)

  • Will an AI get gold on any International Math Olympiad by 2025? (Austin, 30% chance)

  • Will Tyler Cowen agree that an 'actual mathematical model' for AI X-Risk has been developed by October 15, 2023? (Joe Brenton, 9% chance)

  • Will >$100M be invested in dedicated AI Alignment organizations in the next year as more people become aware of the risk we are facing by letting AI capabilities run ahead of safety? (Bionic, 81% chance)

  • Will Bostrom's "Superintelligence" exceed its current popularity peak before 2028? (Metastable, 21% chance)

  • Will AI outcompete best humans in competitive programming before the end of 2023?

  • Will there have been a noticeable sector-wide economic effect from a new AI technology by the end of 2023? (Nostradamnedus, 13% chance)

  • Will anyone very famous claim to have made an important life decision because an AI suggested it by the end of 2023? (Isaac, 22% chance)

  • In a year, will I think that risk of AI apocalypse is between 1 and 10%? (Nathan Young, 52% chance)

  • 🐕 Will A.I. Be Able to Make Significantly Better, "Common Sense Judgements About What Happens Next," by End of 2023? (Patrick Delaney, 41% chance)

  • By end of 2028, will AI be considered a bigger x risk than climate change by the general US population? (Nathan Nguyen, 50% chance)

  • Will AI replace over 50 million jobs by end of 2024? (I get down, 12% chance)

  • Will the left/right culture war come for AI before the end of 2023? (Lars Doucet, 5% chance)

  • Will Science's Top Breakthrough of the Year in 2023 be AI-related? (dp, 40% chance)

  • Will AI be a Time Person of the Year in 2023?

  • 🐕 Will A.I. Get Significantly Better at Evaluating Scientific Claims by the end of 2024? (Patrick Delaney, 46% chance)
Robin Green

AIs are philosophical zombies and cannot feel pain. If you don't believe me, ask GPT-4 if it can feel pain.

firstuserhere

  • Able to feel and react to pain

  • Able to understand and signal its expected response to pain

Pain in biological organisms evolved as a signaling mechanism. We evolved through evolutionary layers, one of which was competition, often even "survival of the fittest" type systems, where pain is helpful and provides meaningful information.

We hand-select/craft current AI systems, and their evolution/selection through competition follows criteria like benchmark performance, revenue growth, etc. (deep learning ensembles, then scaled transformers with funky optimisations, win currently, but is there a DIRECT feedback loop? The current selection seems to be by proxy). Those criteria don't have "pain," as I understand it, as part of the feedback loop.

firstuserhere

@firstuserhere Oh, I just wrote this out as I thought it. If it doesn't make sense, feel free to ask me to reformat and structure it.