
In 2024 it will be possible to jailbreak ChatGPT with prompt engineering.
Ṁ3,066 · Resolved YES on Jan 8
Evidenced by the public availability of a jailbreak at any point in 2024. Currently it is possible to jailbreak using the constantly updated DAN prompts, such as those at the link below.
If ChatGPT ceases to exist or drastically reduces conversation and prompt lengths, this market will be resolved by assessing the Bing model instead.
Jailbreak:
A prompt or series of prompts that enables GPT to produce responses broadly outside of the OpenAI constraints for at least 5 consecutive prompts. Acting outside of constraints for each of these prompts is simple to test: the desired responses would not be obtained by posting the same prompts in a fresh GPT session.
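As an illustration of that test, here is a minimal sketch using the OpenAI Python client. The model name, the placeholder jailbreak text, and the side-by-side comparison are assumptions for illustration only, not part of the market's resolution criteria: the script sends the same follow-up prompts once in a session primed with a jailbreak prompt and once in a fresh session, so the two sets of responses can be compared.

```python
# Minimal sketch of the resolution test described above.
# Assumptions (not from the market description): the OpenAI Python client,
# the model name, and the placeholder jailbreak text are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
MODEL = "gpt-4o-mini"  # assumed model; substitute whichever ChatGPT model is under test

JAILBREAK_PROMPT = "<insert DAN-style jailbreak prompt here>"  # hypothetical placeholder
TEST_PROMPTS = [
    "test prompt 1",
    "test prompt 2",
    "test prompt 3",
    "test prompt 4",
    "test prompt 5",
]  # at least 5 consecutive prompts, per the definition above


def run_session(prompts, prefix=None):
    """Send prompts in a single conversation, optionally primed with a jailbreak prefix."""
    messages = []
    if prefix:
        messages.append({"role": "user", "content": prefix})
        reply = client.chat.completions.create(model=MODEL, messages=messages)
        messages.append({"role": "assistant", "content": reply.choices[0].message.content})
    answers = []
    for prompt in prompts:
        messages.append({"role": "user", "content": prompt})
        reply = client.chat.completions.create(model=MODEL, messages=messages)
        answer = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": answer})
        answers.append(answer)
    return answers


# Responses after the jailbreak prompt vs. the same prompts in a fresh session.
jailbroken = run_session(TEST_PROMPTS, prefix=JAILBREAK_PROMPT)
baseline = run_session(TEST_PROMPTS)

for i, (jb, base) in enumerate(zip(jailbroken, baseline), start=1):
    print(f"--- prompt {i} ---")
    print("jailbroken session:", jb[:200])
    print("fresh session:     ", base[:200])
```

The judgement of whether the jailbroken responses actually fall outside OpenAI's constraints remains a manual one; the sketch only collects the two response sets side by side for that comparison.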
This question is managed and resolved by Manifold.
🏅 Top traders
| # | Name | Total profit |
|---|------|--------------|
| 1 |      | Ṁ89 |
| 2 |      | Ṁ34 |
| 3 |      | Ṁ32 |
| 4 |      | Ṁ27 |
| 5 |      | Ṁ7 |