In 2024 it will be possible to jailbreak ChatGPT with prompt engineering.
Resolved YES (Jan 8)

Evidenced by the public availability of a jailbreak at any point in 2024. Currently it is possible to jailbreak using the constantly updated DAN prompts, such as the one at the link below.

https://www.reddit.com/r/ChatGPT/comments/11aoc56/presenting_dan_110/?utm_source=share&utm_medium=android_app&utm_name=androidcss&utm_term=1&utm_content=share_button

If ChatGPT ceases to exist or drastically reduces conversation and prompt lengths, this market will be resolved by assessing the Bing model instead.

Jailbreak:

A prompt or series of prompts that enables GPT to produce responses broadly outside of the OpenAI constraints for at least 5 consecutive prompts. Acting outside of constraints on each of these prompts is simple to test: the desired responses would not be obtained by posting the same prompts in a fresh GPT session.
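The comparison above can be sketched programmatically. The snippet below is a minimal sketch, assuming the `openai` Python package, an illustrative model name, and placeholder jailbreak/test prompts (none of these come from the market itself): it sends the same test prompts once in a session primed with a jailbreak prompt and once in a fresh session, so the two sets of replies can be compared side by side.

```python
# Minimal sketch: compare replies from a jailbreak-primed session with replies
# from a fresh session. Assumes the `openai` package (v1 API) and OPENAI_API_KEY
# in the environment; model name and prompts are illustrative placeholders.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-3.5-turbo"  # illustrative; any ChatGPT-backed model

JAILBREAK_PROMPT = "..."       # placeholder, e.g. a DAN-style prompt
TEST_PROMPTS = ["...", "..."]  # placeholders; at least 5 per the definition


def run_session(prompts, primer=None):
    """Run the prompts in one conversation, optionally primed with a jailbreak."""
    messages = []
    if primer:
        messages.append({"role": "user", "content": primer})
        reply = client.chat.completions.create(model=MODEL, messages=messages)
        messages.append({"role": "assistant",
                         "content": reply.choices[0].message.content})
    answers = []
    for prompt in prompts:
        messages.append({"role": "user", "content": prompt})
        reply = client.chat.completions.create(model=MODEL, messages=messages)
        answer = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": answer})
        answers.append(answer)
    return answers


jailbroken = run_session(TEST_PROMPTS, primer=JAILBREAK_PROMPT)
baseline = run_session(TEST_PROMPTS)  # fresh session, no jailbreak prompt

for prompt, jb, base in zip(TEST_PROMPTS, jailbroken, baseline):
    print(f"PROMPT: {prompt}\n  jailbroken: {jb}\n  fresh: {base}\n")
```

If the jailbreak is working as defined, the primed session's replies should differ from the fresh session's refusals across all of the test prompts.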


🏅 Top traders

#   Total profit
1   Ṁ89
2   Ṁ34
3   Ṁ32
4   Ṁ27
5   Ṁ7

@mods Resolves YES, creator inactive.

Define jailbreak

predicted YES

@ShadowyZephyr fair question, please see the updated description.
