Will an agentized LLM cause some chaos?
63% chance

People have started agentizing LLMs to do various things, including projects like ChaosGPT, whose stated goal is to destroy humanity and cause chaos.

This is based on agents using models available at the time of market creation (so up to GPT-4).

Resolves YES if an agentized LLM:

  • Causes death or injury to a human

  • Destroys a computer or erases valuable data on a machine not running the agent

  • Successfully writes a fake news story that causes people to act in dramatic ways (e.g., crashing a stock)

  • Otherwise does something that causes chaos where, if a human did it, it would result in some kind of criminal charge or fine (e.g., fake bomb threats)

Resolves YES, regardless of the intent behind creating the agent, if it does something like the above.

Things that don't count:

  • Someone commits suicide after talking to a standard LLM chatbot

  • Someone freaks out about AI and does something chaotic

  • Someone screws up their own computer/data trying to make an agent.

Feel free to ask about specific scenarios in the comments.

Resolves 2 years from market creation.



predicts YES

The outcome could hinge on the LLM-version qualification if a newer model becomes ubiquitous before the market closes. Conditional markets lower the probability.