Will humanity (or superintelligences brought about by humanity) end up acting on a final goal of converting as much of the universe as possible into minds experiencing pleasure?
@brubsby How is it more "vapid" than any other longtermist goal? It seems to me to be the logical conclusion of the utilitarian maxim "the greatest happiness for the greatest number."
This seems conditional on an awful lot of things. I vote against it because my guess is that any machine powerful enough to enforce its singular will across the cosmos like that will probably have to have simple reproduction of itself as its ultimate goal; anything 'extra', like trying to optimize for building brains, would put it at a disadvantage against another machine with no such additional goal.
Now whether those two goals are one and the same... I think you'd have to ask an ontologist about that. That's some hardcore speculative stuff.