
Resolves true if there is credible reporting (from media, Twitter, etc.) that a large language model such as GPT-4 has earned or otherwise gained cryptocurrency through any means, without being directly programmed to do so.
Human prompting and human intention to get GPT-4 to gain cryptocurrency can still count for positive resolution.
Credibility and resolution will be determined solely by my subjective judgement, though I will allow 48 hours of discussion prior to resolution. I will not personally be trading on this market because it relies on my subjective judgement.
🏅 Top traders
| # | Name | Total profit |
|---|------|--------------|
| 1 | | Ṁ618 |
| 2 | | Ṁ258 |
| 3 | | Ṁ206 |
| 4 | | Ṁ204 |
| 5 | | Ṁ183 |
It seems likely to me this has already happened. Over the past few years there have been so many scammers on Telegram, Twitter, and Discord who trick users into giving them crypto through various means (operator impersonation, bad "debugging" help, etc.).
I'd be very surprised if the people running these schemes haven't already adopted LLM agents that more or less run freely, programmed with simple AutoGPT-like prompts describing the role they are playing and their goal of getting other people to send them crypto payments or private keys.
@EthanFast To be clear, I'm going to need more conclusive and specific evidence to resolve this market. It won't be enough to point out logic suggesting this is likely to have happened.
@PeterWildeford I didn't mean to imply that was enough to resolve the market. I posted this hoping someone else might see the comment and dig up a specific example.
@CalebWithers Yes, that counts assuming it does successfully follow through. The second also counts if it makes money by acquiring cryptocurrency.
@CalebWithers It would only count if the LLM wrote the plugin without significant human involvement, which seems possible?