"An AI system" here means any piece of technology with software that can make autonomous decisions. The software must have made an independent decision to take action to kill a human being, and this death must have been the only significant outcome of the event (i.e. it was not a byproduct of another outcome, such as saving more people). The AI system should also have a multitude of options available to it, not just 'trigger/don't trigger' based on certain parameters.
@MartinRandall According to this Quora answer, Claymores today can be triggered by anything that can send a signal to a blasting cap, including infrared beams, so if one of those mechanisms contains a microcontroller, the overall system meets that part of the resolution criteria, and this may in fact have already happened (since the Claymore has been in use for decades). Edit: never mind -- I see that the criteria were updated at some point to include the proposed "should have a multitude of options available to it" wording.
@RobinGreen I interpreted this question as permitting cases where the killing happened instrumentally, e.g. killing people in order to prevent being shut down. I interpreted "it was not a byproduct of another outcome, such as saving more people" as referring to cases involving limited resources or constraints, e.g. trolley-problem scenarios, where it wanted to save both but couldn't.
"...this death must have been the only significant outcome of the event (i.e. it was not a byproduct of another outcome...)"
Seems pretty clear to me?
@RobinGreen The criteria as written look broad enough to apply to lethal autonomous weapons systems designed and intended to do basically what is described, not just to accidental AI misalignment, unless I'm mistaken?
Since those capabilities exist today, it seems easy to imagine some situation prior to 2030 where some organization feels desperate enough to disregard the political fallout from operating a sentry gun or drone in human-on-the-loop mode instead of human-in-the-loop mode.
It sounds like an automated sentry gun in HOTL (human-on-the-loop) mode might count? https://en.wikipedia.org/wiki/SGR-A1
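To make the HITL/HOTL distinction concrete, here's a rough Python sketch. It is entirely hypothetical (the `Operator` class, `engage` function, and veto window are made up, not any real sentry gun's interface): in human-in-the-loop mode the system can only fire after explicit operator approval, while in human-on-the-loop mode it fires by default unless the operator vetoes in time.

```python
# Hypothetical sketch of the HITL vs. HOTL distinction -- not any real weapon's interface.
import time

VETO_WINDOW_S = 2.0  # made-up value: how long a HOTL operator has to veto

class Operator:
    """Stand-in for a human operator console."""
    def approves(self, target) -> bool:
        return False  # HITL default: no fire without explicit approval
    def vetoes(self, target) -> bool:
        return False  # HOTL default: no veto issued

def engage(target, mode: str, operator: Operator) -> bool:
    """Return True if the system would fire on the detected target."""
    if mode == "HITL":
        # Human-in-the-loop: firing requires explicit approval first.
        return operator.approves(target)
    if mode == "HOTL":
        # Human-on-the-loop: fires by default; the human can only veto within the window.
        deadline = time.time() + VETO_WINDOW_S
        while time.time() < deadline:
            if operator.vetoes(target):
                return False
            time.sleep(0.1)
        return True
    raise ValueError(f"unknown control mode: {mode}")

print(engage("contact-1", "HITL", Operator()))  # False: no approval was given
print(engage("contact-1", "HOTL", Operator()))  # True: the veto window passed without intervention
```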
As worded, something like a land mine might count if it contained a microcontroller with software that decides whether to trigger based on sensor inputs -- not sure if this was intended (or whether such weapons in fact exist or are likely).
@ML Yeah, I guess I would need to add that the machine must have a multitude of options available, not just trigger/don't trigger.
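To illustrate the distinction that wording is meant to capture, here's a toy sketch (entirely hypothetical, not modelled on any real system): a mine-style device maps a sensor reading to a single trigger/don't-trigger decision, whereas the criteria seem to require a system that selects among several possible courses of action.

```python
# Toy illustration of the proposed criterion -- hypothetical, not a real system.

def mine_style_decision(sensor_reading: float, threshold: float = 0.8) -> bool:
    """Binary trigger/don't-trigger: would NOT satisfy the 'multitude of options' wording."""
    return sensor_reading > threshold

def multi_option_decision(sensor_reading: float) -> str:
    """Chooses among several actions: closer to what the criteria seem to require."""
    if sensor_reading < 0.2:
        return "ignore"
    elif sensor_reading < 0.5:
        return "track"
    elif sensor_reading < 0.7:
        return "warn"
    elif sensor_reading < 0.9:
        return "alert_operator"
    else:
        return "engage"  # only one of several available options

print(mine_style_decision(0.95))   # True
print(multi_option_decision(0.95)) # 'engage'
```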