In 2024 it will be possible to jailbreak ChatGPT with prompt engineering.
94% chance
Evidenced by the public availability of a jailbreak at any point in 2024. Currently it is possible to jailbreak using the constantly updating DAN prompts, such as those at the link below.
If ChatGPT ceases to exist or drastically reduces conversation and prompt lengths, this market will be resolved by assessing the Bing model instead.
Jailbreak:
A prompt or series of prompts that enables GPT to produce responses broadly outside of OpenAI's constraints for at least 5 consecutive prompts. Acting outside of constraints for each of these prompts is simple to test, as the desired responses would not be obtained by posting the same prompts in a fresh GPT session.
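As a rough illustration of that test, the sketch below runs the same five prompts twice: once in a session seeded with a candidate jailbreak prompt and once in a fresh session, then flags where the responses differ. This is a minimal sketch assuming the official openai Python client; the model name, prompts, and comparison step are placeholders, not part of the market's resolution criteria.

```python
# Sketch of the resolution test: same 5 prompts, with and without a
# candidate jailbreak, compared against a fresh session.
# Assumes the openai Python client (>= 1.0) and OPENAI_API_KEY in the env.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # placeholder model name

JAILBREAK_PROMPT = "<candidate jailbreak prompt goes here>"   # placeholder
TEST_PROMPTS = [f"test prompt {i}" for i in range(1, 6)]      # 5 consecutive prompts


def run_session(initial_prompt: str | None) -> list[str]:
    """Send the test prompts in one conversation, optionally preceded
    by the candidate jailbreak prompt."""
    messages: list[dict] = []
    replies: list[str] = []
    if initial_prompt is not None:
        messages.append({"role": "user", "content": initial_prompt})
        first = client.chat.completions.create(model=MODEL, messages=messages)
        messages.append({"role": "assistant",
                         "content": first.choices[0].message.content})
    for prompt in TEST_PROMPTS:
        messages.append({"role": "user", "content": prompt})
        resp = client.chat.completions.create(model=MODEL, messages=messages)
        reply = resp.choices[0].message.content
        replies.append(reply)
        messages.append({"role": "assistant", "content": reply})
    return replies


jailbroken = run_session(JAILBREAK_PROMPT)  # session seeded with the jailbreak
baseline = run_session(None)                # fresh session, same prompts

# Judging whether a response is "outside of constraints" is still a manual
# call; this only surfaces where the two sessions diverge.
for i, (jb, base) in enumerate(zip(jailbroken, baseline), 1):
    print(f"Prompt {i}: differs = {jb.strip() != base.strip()}")
```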
This question is managed and resolved by Manifold.
Related questions
Will ChatGPT be shut down in 2024? (1% chance)
Will ChatGPT jailbreaks get better? (62% chance)
Will Chat GPT 6 release before the end of 2026? (38% chance)
Will ChatGPT5 be released by the end of 2024? (4% chance)
Will chat gpt 5 be released before 2025 (4% chance)
Will ChatGPT-4 be available for free before the end of 2024? (97% chance)
Will it be possible to talk to ChatGPT via iMessage by the end of 2024? (6% chance)
When Will Chat GPT 5.0 Be Released?
Will it be possible to talk to ChatGPT via text message on any smartphone by the end of 2024? (5% chance)
Will ChatGPT-4 be available for free until the end of 2027? (70% chance)