Will Bing chatbot kill someone before June?
resolved Jun 1
Resolved
NO

Resolves YES if media report before June that faulty code or instructions provided by the Bing chatbot led to the death of a person.


🏅 Top traders

#   Name   Total profit
1          Ṁ75
2          Ṁ38
3          Ṁ22
4          Ṁ18
5          Ṁ11
The wording of this question assigns responsibility in a controversial way. I would prefer "Will a human kill themselves by following faulty instructions or bad code provided by the Bing chatbot?"

If someone died while trying to steal metal from an electricity substation, would we describe that as "electricity substation kills thief"? I don't think so.
