What does sentience mean?
resolved Jun 24
Capacity to experience feelings and sensations
Developing the concept of "self" which is independent of everything else, including one's objective, such that an entity can contemplate modifying and choosing its own objective
Does it have an apprehension of Truth?
“the breath of life”?
Respond in the answers/comments. I will resolve to the answers receiving the most support in the comments (number of positive comments, roughly speaking), probably weighted by the amount of support. I'd also encourage people to comment proposing alternate criteria that they think are more useful than sentience. The word "sentience" probably isn't very useful because there isn't agreement on what it means, so it would improve our understanding to replace ambiguous words with what we think they mean (https://www.lesswrong.com/posts/WBdvyyHLdxZSAMmoz/taboo-your-words). See also: https://manifold.markets/jack/poll-is-lamda-sentient

Jun 17, 5:48pm: Amount of support will be based on number of positive commenters (many comments from the same commenter do not count more)
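The resolution rule above (each answer weighted by its number of distinct positive commenters) could be sketched roughly as follows. This is a hypothetical illustration, not the market's actual resolution code; the function name and sample data are invented for the example.

```python
# Hypothetical sketch of the stated resolution rule: each answer's share
# is proportional to its number of distinct supporting commenters, and
# repeat comments by the same commenter count only once.
from collections import defaultdict

def resolution_weights(supports):
    """supports: (commenter, answer) pairs marking positive comments."""
    unique = set(supports)  # dedupe repeat comments from the same commenter
    counts = defaultdict(int)
    for commenter, answer in unique:
        counts[answer] += 1
    total = sum(counts.values())
    return {answer: n / total for answer, n in counts.items()}

# Illustrative data only:
weights = resolution_weights([
    ("alice", "feelings"), ("alice", "feelings"),  # counts once
    ("bob", "self-concept"),
    ("carol", "feelings"),
])
```

Under this sketch, "feelings" would resolve to 2/3 and "self-concept" to 1/3 of the payout.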
[lol. didn't expect this to get picked, but i guess I should've read the rules more closely. thanks!]
Well, each answer basically has 1 supporter in the comments (namely the person who submitted it), so resolving equally to all of them.
Role play GPT-3 agrees re: consciousness being necessary for sentience: https://twitter.com/togelius/status/1537531366602153984/photo/1
Btw, I’m more a nominalist than a realist, so I might have defaults and premises which are different https://en.wikipedia.org/wiki/Nominalism
@jack I’d be surprised if people considered LaMDA self-aware based on that ability alone, since it’s been a feature of even the earliest chatbots. Describing sentience is of course a tough task; though defining it as the ability to experience feelings/sensations comes across as a bit tautological to me, and doesn’t make the task much easier IMO. Concepts like consciousness, qualia, self-awareness and sentience all seem rooted in something similar - our comfort in applying them to some entities and not to others seems to arise from some attribute we ascribe to certain entities. My best guess is that it is based on us believing that the entity has a sense of self and agency (I do think animals have it).
@akhil I think usually people say self awareness is part of the definition of consciousness, and do not require it for sentience - I think animals can experience feelings/sensations without necessarily being able to conceptualize it as "I feel X". I think it's plausible to consider LaMDA, GPT-3, etc self aware to some degree because they can talk about themselves, but I don't think that means anything about their capacity to experience feelings/sensations.
And under that reasoning, while it would not be appropriate to say an aggressive robot is feeling angry, it would be appropriate to say an aggressive human is feeling angry - ONLY because we ascribe self awareness to people.
@MartinRandall Yeah fair, I do see self awareness as a prerequisite to sentience: “I feel X” only holding meaning because of the “I”. E.g. describing an aggressive robot as feeling angry would seem like anthropomorphising. A “self” enables the attribution of a feeling part, i.e. the entity has the capacity to observe its own inner response and reward/penalty functions from that vantage point; and possibly do something to modify it.
@akhil I think this line of approach is better for a definition of "self-aware"
“Having true facts” as data, and “logical truths”, aren’t the same as what I mean, btw. I think it involves some way of knowing that things are for certain true.
And apprehending these things about reality in a direct and true way. Not that they need to cognitively know what a lie is, or even have the capacity to “believe” things; some sort of process of interfacing with the real and a detection of something about it. (It’s occurring to me, for the second time, that for us I’m mostly talking about the senses 😅. Funny how that works)
(as far as human/animal ways of perceiving these qualities of reality, senses probably play the main part.)
Apprehension is tricky to define ofc. I think the trouble with questions like this is that something true is lost when we work too hard at making things definable.
Like, “oh! sun is warm”, not necessarily even knowing what it means for something to be “not warm”, or what the sun is, but a perception of something that is plainly a quality of reality itself. Babble-y
@Angela Developing, as in it is an observable phenomenon by another sentient being - in this case us, humans. To determine if something is sentient would involve asking whether an entity has a concept called "self" - a concept I'm defining via exclusion: "What is left when everything else is excluded". One heuristic to answer this question would be whether that entity is observed to be modifying and choosing its objectives. The premise being that excluding its own objective function would be the last and final step of exclusion, leaving behind "self". And once something is excluded, it may be possible to interact with it in order to change it, i.e. agency. What counts as "modifying or choosing its objectives" is of course not well-defined, and prone to recursion - but I think it might be easier to answer than "what is sentience".
Edge case: an egg has the ability to develop such a concept of self. Prob not sentient
Developing, or the ability to develop?
Wikipedia definition https://en.wikipedia.org/wiki/Sentience