While reading Project Lawful (fiction set in dath ilan), I came across the concept of algorithmic commitments (which look like they would work if deployed in real life) and can't find any other mentions of them.

They were introduced as bots for the prisoner's dilemma, defined as functions from (other bot) to (decision):

let CooperateBot(_) = Cooperate

let DefectBot(_) = Defect

let FairBot(X) = if Provable("X(FairBot) == Cooperate") then Cooperate else Defect

let Provable-1("X") = Provable("~Provable(0=1) -> X")

// Provable-1("X") holds iff X is provable under the assumption that the proof system is consistent
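Note that Provable cannot simply be replaced by running the opponent: a toy Python sketch (my own illustration, not from the source) shows that two FairBots that try to simulate each other directly never get an answer, which is exactly why the definition goes through provability (and Löb's theorem) instead of simulation:

```python
def cooperate_bot(opponent):
    return "Cooperate"

def fair_bot(opponent):
    # Naive FairBot: "check" the opponent's move by running it directly.
    return "Cooperate" if opponent(fair_bot) == "Cooperate" else "Defect"

print(fair_bot(cooperate_bot))  # Cooperate: simulating CooperateBot halts

try:
    # Two naive FairBots simulate each other in an infinite regress.
    fair_bot(fair_bot)
except RecursionError:
    print("no answer: FairBot(FairBot) recurses forever")
```

The proof-based FairBot sidesteps this regress: by Löb's theorem, two FairBots each find a proof that the other cooperates, so they mutually cooperate without either one ever terminating a simulation of the other.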

let PrudentBot(X) = if (Provable("X(PrudentBot) == Cooperate") && Provable-1("X(DefectBot) == Defect")) then Cooperate else Defect

// PrudentBot cooperates if and only if the opponent provably cooperates with it and provably (assuming consistency) defects against DefectBot
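These definitions can actually be checked mechanically. The same bots are studied in Barasz et al., "Robust Cooperation in the Prisoner's Dilemma", where Provable is evaluated by unrolling provability logic on a finite Kripke frame: "Provable(phi)" at stage k means phi held at every stage j &lt; k. Here is a minimal Python sketch of that evaluation (my own construction, not from the source; the DEPTH constant and the staged provable/provable_1 helpers are assumptions of the sketch):

```python
from functools import lru_cache

DEPTH = 8  # stages; these bots' actions stabilize after a few stages

def provable(pred, k):
    # Kripke semantics of provability on a linear frame:
    # pred held at every earlier stage.
    return all(pred(j) for j in range(k))

def provable_1(pred, k):
    # Provable("~Provable(0=1) -> X"): "Provable(0=1)" holds only at
    # stage 0 (vacuously), so the implication effectively skips stage 0.
    return all(pred(j) for j in range(1, k))

@lru_cache(maxsize=None)
def act(bot, opp, k):
    """Action of `bot` against `opp` at stage k of the unrolling."""
    if bot == "CooperateBot":
        return "C"
    if bot == "DefectBot":
        return "D"
    if bot == "FairBot":
        # Cooperate iff it is provable that the opponent cooperates back.
        return "C" if provable(lambda j: act(opp, "FairBot", j) == "C", k) else "D"
    if bot == "PrudentBot":
        opp_coop = provable(lambda j: act(opp, "PrudentBot", j) == "C", k)
        opp_punishes = provable_1(lambda j: act(opp, "DefectBot", j) == "D", k)
        return "C" if (opp_coop and opp_punishes) else "D"
    raise ValueError(bot)

def play(a, b):
    return act(a, b, DEPTH), act(b, a, DEPTH)

print(play("FairBot", "FairBot"))         # ('C', 'C')  Löbian cooperation
print(play("FairBot", "DefectBot"))       # ('D', 'D')
print(play("PrudentBot", "FairBot"))      # ('C', 'C')
print(play("PrudentBot", "CooperateBot")) # ('D', 'C')  exploits CooperateBot
print(play("PrudentBot", "PrudentBot"))   # ('C', 'C')
```

The last two lines show why PrudentBot improves on FairBot: FairBot wastefully cooperates with CooperateBot, while PrudentBot's extra DefectBot check lets it exploit unconditional cooperators yet still reach mutual cooperation with FairBot and with itself.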

I offer a bounty of 20-40 mana per example of an algorithmic commitment substantially different from this one (by my own judgement).