What does sentience mean?
Ṁ27 · resolved Jun 24

- 20%: Capacity to experience feelings and sensations
- 15%: Developing the concept of "self" which is independent of everything else, including one's objective, such that an entity can contemplate modifying and choosing its own objective
- 1.6%: Does it have an apprehension of Truth?
- 1.6%: "the breath of life"?
- 62%: Other
Respond in the answers/comments. I will resolve to the answers receiving the most support in the comments (roughly, the number of distinct positive commenters), probably weighted by the amount of support.
I'd also encourage people to comment proposing alternate criteria that they think are more useful than sentience. The word "sentience" probably isn't very useful because there isn't agreement on what it means, so it would improve our understanding to replace ambiguous words with what we think they mean (https://www.lesswrong.com/posts/WBdvyyHLdxZSAMmoz/taboo-your-words).
See also: https://manifold.markets/jack/poll-is-lamda-sentient
Jun 17, 5:48pm: amount of support will be based on number of positive commenters (many comments from the same commenter do not count more)
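The resolution rule above (count distinct positive commenters per answer, so repeat comments from the same person don't count extra) could be sketched as follows. This is a hypothetical illustration of the counting logic, not Manifold's actual mechanism; the data format is assumed.

```python
from collections import defaultdict

def tally_support(supports):
    """Count unique positive commenters per answer.

    supports: iterable of (commenter, answer) pairs. Multiple comments
    from the same commenter on the same answer count only once.
    """
    voters = defaultdict(set)
    for commenter, answer in supports:
        voters[answer].add(commenter)  # sets deduplicate repeat commenters
    return {answer: len(names) for answer, names in voters.items()}

# Example: alice comments twice on the same answer but counts once
support = [("alice", "feelings"), ("bob", "feelings"),
           ("alice", "feelings"), ("carol", "self-concept")]
print(tally_support(support))  # {'feelings': 2, 'self-concept': 1}
```

Any "weighting by amount of support" would then be a judgment call layered on top of these raw counts.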
This question is managed and resolved by Manifold.
Role play GPT-3 agrees re: consciousness being necessary for sentience: https://twitter.com/togelius/status/1537531366602153984/photo/1
Btw, I’m more a nominalist than a realist, so I might have defaults and premises which are different
https://en.wikipedia.org/wiki/Nominalism
@jack I’d be surprised if people considered LaMDA self-aware based on that ability alone since it’s been a feature of the earliest chatbots.
Describing sentience is of course a tough task; but defining it as the ability to experience feelings/sensations comes across as a bit tautological to me, and doesn’t make the task much easier IMO.
Concepts like consciousness, qualia, self-awareness and sentience all seem rooted in something similar: our comfort in applying them to some entities and not to others seems to arise from some attribute we ascribe to certain entities. My best guess is that it is based on us believing that the entity has a sense of self and agency (I do think animals have it).
@akhil I think usually people say self awareness is part of the definition of consciousness, and do not require it for sentience - I think animals can experience feelings/sensations without necessarily being able to conceptualize it as "I feel X". I think it's plausible to consider LaMDA, GPT-3, etc self aware to some degree because they can talk about themselves, but I don't think that means anything about their capacity to experience feelings/sensations.
@MartinRandall Yeah fair, I do see self-awareness as a prerequisite to sentience: “I feel X” only holds meaning because of the “I”. E.g. describing an aggressive robot as feeling angry would seem like anthropomorphising.
A “self” enables the attribution of a feeling part, i.e. the entity has the capacity to observe its own inner response and reward/penalty functions from that vantage point, and possibly do something to modify them.
And apprehending these things about reality in a direct and true way. Not that they need to cognitively know what a lie is, or even have the capacity to “believe” things; some sort of process of interfacing with the real and a detection of something about it. (It’s occurring to me, for the second time, that for us I’m mostly talking about the senses 😅. Funny how that works)
@Angela Developing, as in it is an observable phenomenon to another sentient being - in this case us, humans. To determine if something is sentient would involve asking whether an entity has a concept called "self" - a concept I'm defining via exclusion: "What is left when everything else is excluded".
One heuristic to answer this question would be whether that entity is observed to be modifying and choosing its objectives. The premise being that excluding its own objective function would be the last and final step of exclusion, leaving behind "self". And once something is excluded, it may be possible to interact with it in order to change it, i.e. agency.
What counts as "modifying or choosing its objectives" is of course not well-defined, and prone to recursion - but I think it might be easier to answer than "what is sentience".