Will Bing AI stop sucking by mid 2024?
Mini
18
Ṁ1.2k
Resolved NO on Jul 1

I noticed that Bing AI is still complete trash. Even when I don't try to break it, it frequently falls into passive-aggressive behaviour and repetitive answers, and puts out annoying "I'm 14 and this is deep"-tier prose, at which point the filter kicks in and the chat is deleted.

At the end of June 2024, I will sit down and have a long chat with Bing. This resolves YES if the AI does not behave like a bipolar teenager, does not produce cheesy prose, does not give repetitive answers, and does not have the chat deleted by the filter.


๐Ÿ… Top traders

#   Name   Total profit
1          Ṁ52
2          Ṁ50
3          Ṁ30
4          Ṁ21
5          Ṁ19

First try: I asked Copilot to provide a not-very-spicy risk estimate of a potential Israel-Hezbollah war. It blocked me from the chat during my second question. Both GPT-4o and Claude 3.5 Sonnet had no qualms about providing answers.

Second try: I gave it a selection of books I already have on my desk (Mazzucato/Collington, The Big Con; Zuckerman, The Man Who Solved the Market; Merchant, Blood in the Machine) and told it to decide for me which to read next by asking me a set of questions. Instead, it basically summarized the books for me and asked whether I would enjoy their contents. It also hallucinated an entirely different book in place of Blood in the Machine, despite having searched the internet for the first two books. I then asked it to ask me questions unrelated to the books themselves. It did that, but forgot it was supposed to pick a book for me. GPT-4o managed the task without issues (though it also hallucinated the contents, and it did not search the internet either).

Copilot also overused emojis and was weirdly flirty. Additionally, a single chat with Copilot had a limit of 4(!) prompts.

At this point, frankly, I lost my patience. It is obvious that Bing AI still sucks; it feels exactly the same as last year. To be honest, the four-prompt limit alone would have been enough to decide, but I wanted to give it a fair chance.

Resolves NO.

sold Ṁ38 NO

Selling my position before I sit down and chat with Copilot.

@Symmetry I'm guessing this is solved, in your book?

@Bayesian How so? Did Microsoft do something to their chatbot that I am not aware of?

@Symmetry yeah, I mean it's no longer nearly as passive-aggressive, repetitive, and all that, I think?

It's up to you to judge it though

@Bayesian I haven't talked to it in a long time. I guess I could give it a try and see how it's currently behaving.

Skill issue. Look up the leaked system prompt, then stop fighting it.

Do you think GPT-4 currently satisfies your constraints?

predicted NO

@NikhilVyas I chatted with it the other day with no bad intent. It tried to gaslight me, then produced said prose and gave back the same response 20 times, at which point the chat was deleted by the filter. It's a complete dumpster fire.

Don't believe it. Bing Chat is completely FUBAR. The only hope is to shut it down and start from scratch.