Will "chatbotting" replace "ghosting" by end of 2029?
17% chance

Hot take. Today, some people "ghost" someone by not replying to their messages. But in the grim dark future, they will instead "chatbot" them by hooking their incoming messages up to a language model. E.g., the LLM may simulate a person gently breaking up with them over time, or a person who is too busy for another date right now.

Resolves YES if, in early 2030, I think this hot take was hot. Resolves NO if not. If I am dead, the market resolver may substitute their own opinion.

bought Ṁ10 YES

Yes, because I hope so; it would be funny.

What if they weren't used for breakups, but to keep in touch with a greater number of people, @MartinRandall?

i.e. some service that responds to messages when the LLM has high confidence (above some custom % threshold) that the simulated response is close to how the real person would respond. For messages that don't meet that bar, the user has to manually approve one of the suggested LLM responses or write the response themselves (rough sketch at the end of this comment).

  • in practice this would probably mean automating small talk & other low-entropy moments in conversations

(assuming that LLMs no longer have hallucinations & become fairly good approximations of a given person)

Not sure if there's a market and/or term for such a thing already.

@elf It would count if botting is being done in places where someone would choose to ghost given 2020 tech.

I'm not restricting this to romantic relationships.

It does have to be automated. A conversational assistant who only suggests responses is not in the spirit of my 2023 hot take.

predicts YES

Growing more bullish on this by the day

bought Ṁ100 of YES

@firstuserhere if there's at least one AI regulation I support, it's requiring "generated by AI" labelling

bought Ṁ500 of NO

Ghosting is easy; just don't reply to their messages. Unless you can press a button on your messaging app and have it start "chatbotting", I don't see it replacing ghosting; too much work.

@rockenots By 2029 I assume someone will have productized this.

bought Ṁ15 of YES

Ghosting is cowardice. It is a way to avoid an uncomfortable confrontation. What would be even more cowardly: chatbotting. You still avoid the confrontation, and you don't feel guilty, since the people you leave will not be worried and then upset.

@Zardoru in the grim dark future of humanity there is only cowardice.

predicts YES

@MartinRandall who says?

@ZZZZZZ Zardoru's 15 mana

too early to tell, but definitely an interesting idea

@VoyagerRock I'd be willing to place a bet within a year

bought Ṁ10 of NO

This seems a lot meaner than ghosting

bought Ṁ100 of YES

@HannahFox definitely

@HannahFox This is a valid concern. "Chatbotting" could definitely be used in a malicious way, such as to harass or intimidate someone. However, I believe that "chatbotting" can also be a force for good. If used responsibly, it could be a great tool for communication and understanding. Just like any other technology, its impact will ultimately depend on how it's used.

predicts YES

@MartinRandall sure, but this specific application - I'm not a big fan

predicts NO

@MartinRandall It feels wrong to trick someone into thinking they're talking to you when they're actually just talking to a chatbot. They will probably figure it out eventually, and then it is even worse than just ghosting them. In my opinion, if you dislike someone that much, you should just block them instead of doing this. That's why I think it is worse to "chatbot" someone than to ghost them.

I agree. How could it be 'a tool for communication and understanding' if you are not communicating at all, but using a chat bot? A chat bot cannot 'explain why I'm not interested in talking with you and provide constructive feedback'.

@HannahFox I see your point--it could be considered deceptive to use an LLM in this way. Also, if two people are both responding to each other using LLMs, it could create confusion and lead to miscommunication. This is because the two LLMs may not understand each other--they may interpret the other person's messages differently, or may not know how to respond appropriately. This could lead to a cycle of frustration and misunderstanding, which is obviously not ideal. Ultimately, I believe that communication is best achieved through authentic, human-to-human interactions. While AI technology can certainly be helpful in some ways, I believe that it is ultimately no replacement for genuine connection.

predicts NO

@MartinRandall I see you - You're doing it right now! 😛

predicts NO

@AngolaMaldives yeah, there's a certain je ne sais quoi about his comment which strongly smells of GPT-3. Sorry @MartinRandall if you did in fact write that comment yourself!
