Will there be a disaster caused by open source developers doing unsafe things with AI by 2028?
Resolves 2028 · currently 61% chance

"Disaster" is to be interpreted as something that is framed as such by the media - unless the outcome is merely embarassment or something that causes offense.


Say that somebody uses an open-source AI to facilitate something that the media consensus defines as a disaster. Would that count as "developers doing unsafe things", given that the disaster wouldn't have been possible without the developer releasing it? Or does the developer need to be more directly involved?

Examples: Bob uses a jailbroken Llama to determine how to make and use a nerve agent.

Alice agentizes Llama and uses it to automatically scam a thousand elderly people.

AutoGPT already exists, so 80%+ yes.
