Asking clarifying questions is encouraged. I will attempt to resolve objectively.
YES if there is a reasonable amount of evidence that LLMs experience qualia, NO if there is a reasonable amount of evidence that LLMs do not experience qualia.
If some LLMs experience qualia and others do not, the question will resolve YES.
Qualia can be any sort of qualia (perception of color, emotion, or even something humans cannot experience).
Resolution does not require absolute proof, but reasonable evidence. For example, the endosymbiosis theory of mitochondria origin would resolve YES. Panspermia or mammalian group selection would not resolve (at the present moment).
Any LLM shown to experience qualia counts. AlphaZero would not count as an LLM. Reasoning models and large concept models would count as LLMs.
Update 2025-07-19 (PST) (AI summary of creator comment): The creator has clarified that they will not resolve this market based on consensus alone; it will be only one of multiple aspects they assess.
Update 2025-07-20 (PST) (AI summary of creator comment): The creator has clarified the definition of an LLM for this market:
The presence of a transformer architecture is not sufficient on its own for a future AI to count as an LLM.
An AI with a Llama-style architecture trained only on fMRI data would not count.
An AI with a recognizable LLM architecture that is simply larger or uses a different optimizer would count.
Update 2025-07-20 (PST) (AI summary of creator comment): The creator has clarified that the market could resolve YES based on an AI developed in the future, even if it is proven that no LLM existing today has qualia. Such an AI would need to be recognizable as an LLM by today's understanding.
Update 2025-07-20 (PST) (AI summary of creator comment): In response to a hypothetical question about whether other humans experience qualia, the creator stated it would resolve YES. Their reasoning was based on a combination of factors including:
Shared biological architecture
Signaling of qualia
Universal acceptance
Update 2025-07-22 (PST) (AI summary of creator comment): The creator has clarified their standard for 'reasonable evidence':
Resolution will be based on a convergence of evidence for or against LLM qualia.
The threshold could be met when explaining all the evidence away becomes a notably less parsimonious explanation than simply accepting the conclusion (either YES or NO).
Update 2025-07-23 (PST) (AI summary of creator comment): The creator has specified that the market will not resolve YES based on a model's claim to have qualia as the sole piece of evidence. This is true even if the claim is repeated and unprompted.
Update 2025-07-23 (PST) (AI summary of creator comment): The creator has clarified that this market is about qualia, not consciousness.
The market would resolve YES if an LLM were found to experience qualia even without being considered conscious.
@ohiorizzler1434 To be clear this is specifically about qualia, not consciousness. If an LLM somehow experiences qualia without consciousness then that would still support YES.
@MaxE Can I improve the resolution criteria? Do you have any questions, clarifications, or criticisms?
@SCS I disagree with the idea that we can clearly define a class of physical systems that have experiences. I didn't mean for it to sound like you did something wrong on the market side.
@MaxE also, I worry that you will resolve it yes because a model says it does with no other evidence. That is why we believe other humans are sentient, after all, and I don't know how much you anthropomorphize LLMs.
@MaxE I will not resolve yes based on a model claiming to have qualia (even without guidance, and even repeatedly) with no other evidence. Also, this will not resolve if there is no evidence for or against.
@NivlacM This is a good question. Though we don't have mature theories of why qualia emerge, the fact that other humans share our own biological architecture and self-report and signal qualia means we can be confident that other humans do have qualia just like ourselves. Additionally, it's universally accepted, and your own experience serves as an existence proof that at least one human experiences qualia.
Therefore it would resolve yes but for reasons that might not be applicable.
The market question allows moving goalposts.
We accept that other people experience qualia because it is the simplest model, not because it is proven in any way. It cannot be proven until there is a measurable definition of qualia, and no such definition exists.
@Henry38hw
The description says:
"Resolution does not require absolute proof, but reasonable evidence."
I am referring to the definition here, though of course current techniques can't measure it perfectly:

@SCS you can't have evidence until the definition is strict.
That is like checking whether "person A subjectively loves person B" by only looking at their actions and/or MRI scans. But if you have a strict definition of love, for example as chemical concentration shifts and specific brain activity, then you can have some evidence.
If no measurable definition is provided, then any "evidence of love" can be explained away: "A is just friendly towards B", "A has some strategic interest in those actions".
@Henry38hw The "proof" aspect wasn't for other people experiencing qualia, but any person experiencing qualia. Since you are a person, and you experience qualia, this proves at least one person experiences qualia.
@Henry38hw Since we are unlikely to have a single, definitive qualia-meter, resolution will be based on a convergence of evidence for or against LLM qualia. The threshold for "reasonable evidence" is when explaining all the evidence away becomes a notably less parsimonious explanation than simply accepting the conclusion (in either direction). See the example of mitochondrial endosymbiosis in the description.
@SCS here is the confusing part:
Since you are a person, and you experience qualia, this proves at least one person experiences qualia.
... you experience qualia ...
You take it as a fact before it is proven or has any evidence towards it.
It looks like the Bible paradox: "The Bible is true because it is the word of God. It is the word of God because that fact is written in the Bible."
What is the point of the proof, if it is self-referential?
@Henry38hw It's more through observation. For example, a market saying "there are no red apples" would resolve NO if I observe in my hand a red apple. Similarly, a market saying "at least one human experiences qualia" would resolve YES because I observe that I am experiencing qualia right now (e.g., able to perceive the red of an apple).
@SCS but you are talking about an OTHER entity, specifically about an LLM.
History has no examples of proving any third-person qualia.
Your last example with "I observe a red apple" would only be suitable if you (the market creator) are an LLM who will resolve the market by itself, after experiencing qualia first-person.
@Henry38hw Exactly, so that wouldn't be used to resolve yes. (Since the question isn't about humans.)
@Henry38hw "Your last example with 'I observe a red apple' would only be suitable if you (the market creator) are an LLM who will resolve the market by itself, after experiencing qualia first-person."
Which would be sick btw
Let's say it's the year 2034, and we've unraveled the mysteries of consciousness.
Neuroscientists and cognitive philosophers agree that a) no LLM that existed in 2025 experienced qualia and b) the first AI to experience qualia was built in 2033 and it incorporated transformers as an integral part of its architecture.
Does this resolve yes?
@JustKevin Unlikely. Transformers are not sufficient. For example, a Llama-style model trained on fMRI data alone would not count. I would expect larger architecture changes to exist by 2033 too. However, if what they trained in 2033 is clearly an LLM (e.g. Llama architecture) but with far more parameters, or merely a different optimizer, then that would count.
@SCS So to clarify, this question could resolve 'yes' even if it is demonstrated that no LLM that currently exists today experiences qualia?
@JustKevin technically possible, yes. But it would be recognizable as an LLM by today's understanding.