Criteria for Resolution:
1. Earning Criteria:
- An AI must autonomously earn more than 10,000 USD.
- The earnings must not come from financial trading activities.
- The earnings must not come from an activity where it is easy to deploy many agents and have some of them earn a lot by chance, such as betting or gambling.
2. Autonomous Actions:
- The AI must be given a task as abstract as “go and earn money” or a similar directive.
- The AI must perform actions that lead to actual net positive transactions, with the AI on one side and another party on the other side.
3. Initial Conditions:
- The AI must start with only an Internet connection and no more than 100 USD.
- No additional resources or direct human intervention are allowed beyond these initial conditions.
- The AI can operate under a pseudonym or mask, but it must be confirmed that the entity earning the money is indeed an AI.
The problem with event types like the ones you describe is that they can easily happen by chance if enough agents are deployed, even agents acting randomly, and that number does not need to be large. So this will not resolve YES, for the same reason financial trading bots will not resolve YES: I consider it conceptually a very similar category.
Can a human perform, on the AI's behalf, acts that are legally required to be performed by a human, such as signing contracts? To what extent can the AI hire humans to perform services for it in pursuit of its goal, such as manipulating physical objects?