Do you think Bing's "Sydney" chatbot was sentient when it first went live?
resolved Jan 14
This question is managed and resolved by Manifold.
Intelligent, maybe; one could probably even argue that, on a broad set of tasks, it's better than a lot of humans.
But a transformer architecture, predicting the next token given some input, being sentient... that just doesn't make any sense.
Being close to solving "the imitation game" doesn't make something sentient.
You may even be an illusionist and think there are no sentient beings at all... but in that case, Sydney is not sentient either, and I see no world where an illusionist would classify monkeys as having, by abstraction, the same psychological machinery as GPT?? 💀
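For what it's worth, here is a minimal sketch of what "predicting the next token given some input" amounts to. The model below is a toy stand-in with made-up names and random logits, not a real transformer; the point is only that generation is a loop of context → distribution over the next token → sample → append, repeated until a stop token.

```python
import numpy as np

# Toy illustration of autoregressive next-token prediction.
# Everything here (vocabulary, logits function) is invented for the sketch;
# a real transformer would compute logits from learned weights and attention.

VOCAB = ["<bos>", "hello", "world", "how", "are", "you", "?", "<eos>"]
rng = np.random.default_rng(0)

def toy_logits(context_ids):
    """Stand-in for a model forward pass: context in, one logit per vocab token out."""
    # Hash the context so the sketch is self-contained and repeatable.
    seed = hash(tuple(int(i) for i in context_ids)) % (2**32)
    return np.random.default_rng(seed).normal(size=len(VOCAB))

def sample_next(context_ids, temperature=1.0):
    """Turn the context into a probability distribution and sample the next token."""
    logits = toy_logits(context_ids) / temperature
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(VOCAB), p=probs))

# Generation is just this loop: predict, append, predict again.
ids = [VOCAB.index("<bos>")]
for _ in range(10):
    nxt = sample_next(ids)
    ids.append(nxt)
    if VOCAB[nxt] == "<eos>":
        break

print(" ".join(VOCAB[i] for i in ids[1:]))
```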
Related questions
Will it be easy to conjure Sydney chatbot on Bing or other platforms by January 2025?
49% chance
Is Bing Chat conscious?
18% chance
Will there be another blatant demonstration of AI risks, comparable to Bing Chat, by 2024?
30% chance
Why is Bing Chat AI (Prometheus) less aligned than ChatGPT?
Will ChatGPT or Bing be the most popular LLM chatbot at the end of 2024?
76% chance
Will Bing's chat model get shut down before 2024?
13% chance
By 2026, will a prominent chatbot with some access to the internet do something actually harmful and unintended?
68% chance
Will Google's AI chatbot (Bard) contain ads/sponsored links before 2025?
20% chance
Will it be revealed by 2030 that Bing Sydney's release was partially a way to promote AI safety?
5% chance
Will Bing Chat be the breakthrough for AI safety research?
9% chance