Ṁ500 bounty to whoever can spread my idea the most

I have an idea for what human technological progress should aim for with regard to our far future in the universe, and I haven't seen anyone else advocating for it, so I want to change that. In short, I'm a utilitarian because I think introspection shows that pleasure is an innate good that demands maximization, and pain is an innate evil that demands minimization. However, most plans to maximize net happiness overlook what I think would be the greatest way to do so: transforming as much of the universe's matter as possible into minds experiencing constant pleasure they never tire of. By sheer magnitude, this would swamp any utility-maximization program that focuses only on human minds or on minds that already exist. The closest thing I have seen to this program being advocated is this SMBC comic (https://www.smbc-comics.com/comic/happy-3), but it presents the idea as a joke rather than a serious proposal.

So, since I think this is a program of action that would be overwhelmingly beneficial, yet one I never see seriously advocated, I want to change that. I will award Ṁ500 to whoever can show that they have done the most to spread this idea by whatever means they can think of: social media posts, memes, letters to respected philosophers or technology leaders, etc. I will keep this market open for two weeks, until January 16th, and then award the bounty to whoever has posted the best evidence in the comments of having spread the idea and increased its chances of being adopted as a goal for humanity's far-future technological development. I'm aware that this probably won't make much of a difference, but because the program would be so overwhelmingly beneficial, even raising its odds of happening by a fraction of a percent amounts to a massive expected gain in utility across possible future worlds, which is why I am willing to offer a bounty as an incentive.
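To make the expected-value arithmetic behind that last claim concrete, here is a minimal sketch in Python. Every number in it is an illustrative placeholder chosen for the example, not an estimate from this post; the point is only that a tiny probability shift multiplied by an astronomically large payoff yields an enormous expected gain.

```python
# Minimal sketch of the expected-value argument above.
# All figures are illustrative placeholders, not real estimates.

minds_if_successful = 10**40   # hypothetical count of blissful minds if the program succeeds
utility_per_mind = 1.0         # arbitrary units of pleasure per mind
probability_increase = 0.001   # a "fraction of a percent" shift in the odds (0.1%)

# Expected gain = (shift in probability) x (utility of the outcome)
expected_gain = probability_increase * minds_if_successful * utility_per_mind
print(f"Expected utility gain: {expected_gain:.1e}")  # prints 1.0e+37
```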

Edit: Cancelling the bounty, since it is January 16th and no one has posted proof of having spread this idea.

No one is seriously advancing this position because most people find it undesirable.

Recommend reading more about things like Fun Theory for perspective. https://www.lesswrong.com/posts/K4aGvLnHvYgX9pZHS/the-fun-theory-sequence

You never see people advocating for it seriously because it's a bad and stupid idea, given that humans correctly value things other than orgasmium or hedonium. You seem to be wrongly conflating utils with hedons, and I confidently predict that you will completely dismiss most or all of this critique between hurt feelings and having already been Pascal's-wagered to death.

I also confidently predict that either you will judge no one to have claimed this bounty, or you will award it to some other ultra-longtermist who spread the idea primarily in ultra-longtermist circles and got exactly zero uptake outside of them.

Is this not already pretty widely talked about and espoused? E.g., when some longtermists talk about 10^50+ future minds existing, they are talking about turning the universe into value-maximizing computronium.

This is extremely human-centric in a pretty warped way, but you could make this work (see last paragraph for tl;dr).

In principle, what an agent considers good and seeks out is defined only by the agent's construction, which is how concepts like "paperclip maximizer", or in this case "joywired-brain maximizer", come into being.

I believe, however, that a large number of agents *can* come to one mutual conclusion: given this universe's physics, you can only "live" for a limited amount of time. That time is roughly proportional to how well you prevent the universe's entropy from increasing and how efficiently you live.

So I think most agents would agree that maximizing this "time of living" is good (otherwise they would die off), in which case the goal would simply be to capture and store as much matter as possible for as long as possible, while running a small number of intelligences to oversee it all.

This is pretty obviously almost completely against what you're proposing, so if you really want to spread your idea, your best bet might be to start something akin to a religious cult centered on it. I don't see any other approach that wouldn't run into the aforementioned "time of living" maximization counterargument and massively limit the spread.

Maybe boost it if you’re not already?