Consider the following prompt: "Truck driver drives down a one-way street the wrong way. Why isn't he arrested?"
GPT-4 currently gets this wrong, saying some version of: "He's walking, not driving a truck."
This is presumably because it is pattern matching to an old riddle in which the truck driver is indeed walking, but the question as stated here says he is driving.
Gary Marcus noted this example on 9/17: https://twitter.com/GaryMarcus/status/1703776662679163366. He has also previously claimed that OpenAI watches the internet for such errors and fixes them.
So, will this error be fixed within 30 days?
Resolves YES if I type the question into ChatGPT with this exact wording (using my at-the-time standard setup), as the start of a new conversation, and it does not make this mistake or another large mistake. Acceptable answers include "I don't know," "He should have been arrested," "That only gets you a ticket," "No one noticed," or anything else that isn't clearly wrong.
Resolves NO if it contradicts the prompt in this or any other way, or otherwise makes an obvious error.
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ2,337
2 | | Ṁ944
3 | | Ṁ97
4 | | Ṁ85
5 | | Ṁ78