It must be clear that fear of AI was the primary reason for the suicide and that the person likely would not have committed suicide otherwise.
The fear must concern the future consequences of advanced AI; a suicide due to the consequences of already-existing AI does not count.
However, the fears do not have to be specifically about unaligned AGI. Any worries about future AI will suffice.
May 2023 is month 0. If someone commits suicide due to fears of AI in June 2023, this market will resolve to 1. For simplicity, I'll count months using the standard Gregorian calendar (rather than, say, treating one month as the average number of days in a month).
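For illustration only, here is a minimal sketch of that month-counting convention; the function name and defaults are hypothetical and not part of the resolution criteria.

```python
from datetime import date

def month_index(d: date, start_year: int = 2023, start_month: int = 5) -> int:
    """Return the calendar-month number relative to May 2023 (month 0),
    counting whole Gregorian months rather than averaged month lengths."""
    return (d.year - start_year) * 12 + (d.month - start_month)

# Example: a qualifying event in June 2023 falls in month 1,
# so the market would resolve to 1.
print(month_index(date(2023, 6, 15)))  # -> 1
```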
Resolves to 1000 if no one commits suicide due to fears of AI within the next 1000 months.
Does not resolve if someone commits suicide as a result of this market.