How Friendly is capitalism? Does capitalism optimize for a subset of human values?
Capitalism optimizes for all human values.
Capitalism optimizes for all human values, but not evenly.
Capitalism optimizes for the values of a subset of humans, and Goodharts the rest.
Capitalism optimizes for some other subset of human values, and Goodharts the rest.
Capitalism optimizes for a set of values that are alien to human values.
Capitalism is not an optimizer.

I kinda just want to see how polls work here.

Edited to add

For a very simplified model, imagine that humans value arbitrarily high amounts of wealth (W) and hugs (H). Human utility U = f(W,H) where f is monotonic in each of W and H.

  1. "Capitalism optimizes for all human values" means that capitalism maximizes U.

  2. "Capitalism optimizes for all human values, but not evenly" means that capitalism maximizes W and H, but perhaps not in the proportions that maximize U. It might be possible to get more U by making less W and more H.¹

  3. "Capitalism optimizes for the values of a subset of humans" means that capitalism might maximize your component of U, and maybe even the U of everyone you know, but not everyone's component of U.

  4. "Capitalism optimizes for some other subset of human values" probably means that capitalism maximizes W, but doesn't maximize H.

  5. "Capitalism optimizes for a set of values that are alien to human values" means that capitalism maximizes some function of X, Y, Z, etc., where W and H are not functions of X, Y, and Z. (Here live the paperclippers.)

  6. "Capitalism is not an optimizer" means there's no usefully describable A that capitalism maximizes, other than maybe entropy.

¹ And as the old song says, "I don't care too much for W, W can't buy me H."
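The difference between maximizing U and maximizing only W (options 2 and 4 in the list above) can be sketched numerically. This is a minimal illustration, not from the original post: I assume a concrete monotonic f, namely U = log(1+W) + log(1+H), and a fixed effort budget split between producing W and H. Both choices are hypothetical.

```python
import math

def utility(w, h):
    # One concrete monotonic f(W, H); diminishing returns in each good.
    return math.log1p(w) + math.log1p(h)

# Split a fixed effort budget between wealth (W) and hugs (H).
BUDGET = 10.0
splits = [(x / 10.0, BUDGET - x / 10.0) for x in range(0, 101)]

# An optimizer that maximizes W alone puts the whole budget into wealth...
w_only = max(splits, key=lambda s: s[0])

# ...while an optimizer that maximizes U prefers a mixed allocation.
u_best = max(splits, key=lambda s: utility(*s))

print(w_only)  # all effort into W, none into H
print(u_best)  # a mixed allocation with strictly higher U
```

With this (arbitrary) symmetric f, the W-maximizer ends at (10, 0) while the U-maximizer ends at (5, 5), so "more W" is Goodharted relative to U: U is monotonic in W, yet maximizing W alone leaves U on the table.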


Totally fucked up my vote

Anything based on voluntary exchange is capitalist by definition. The alternative is involuntary exchange, i.e., backed by some threat of violence. Maybe money can't buy you love, but guns can't buy you love either. Wealth redistribution can be nice in some circumstances, helping genuinely needy people move up Maslow's hierarchy, and the society still counts as mostly capitalist for reasonably small levels of redistribution. As AI improves productivity, the fraction of output that will need to be redistributed to meet people's basic needs will get smaller exponentially.

Because preferences and meta-preferences are themselves subject to evolutionary pressure, the eventual re-emergence of Malthusian dynamics is inevitable at any level of economic output. An ASI-backed welfare utopia that respects individual reproductive autonomy therefore automatically produces the repugnant conclusion, except that the sign of U might be negative due to the re-emergence of scarcity.

You write: "Anything based on voluntary exchange is capitalist by definition."

In the sense of "capitalism" I'm using here, voluntary exchange is many millennia older than capitalism. Capitalism came about with the invention of institutions such as joint-stock corporations, capital markets, and so on. See for instance Wikipedia's history of capitalism.

@JonathanRay As AI improves productivity, it will capture a larger share of all output; therefore the amount that needs to be redistributed will increase, not decrease.

Markets, and maybe property, are the only optimizing forces in capitalism; most of its other distinctive forces just short-circuit these under specific circumstances. The short-circuiting processes are themselves recursive, based on hawk-dove and predator-prey dynamics. They reward aggression and ultimately create lopsided dynamics in the way most predator-prey population models do, which in this case destroy markets and property, leading to conditions under which those institutions have to be redeveloped before capitalism starts happening again, repeating the cycle. Because short-circuiting reduces complexity and increases legibility and homogeneity, and because it happens most acutely at the apex of a given civilization, most histories attribute prosperity to the conditions that cause collapse, and collapse to factors that actually arise post-collapse. If you want to know what not to do to preserve civilization, don't ask the people who held power, or were adjacent to power, when previous civilizations collapsed. Almost by definition it is their fault; they have the opposite of insight.

Capitalism optimizes for anything that anyone is willing to pay for, and ability to pay is roughly proportional to utility provided to others. So it aligns with a sort of inegalitarian utilitarianism in which a person's utility is weighted by how much utility they provide to others. This probably produces a much more dynamic, creative, progressing, and interesting society than egalitarian utilitarianism, which implies a repugnant conclusion of a googolplex barely-conscious useless parasites watching TV and eating potatoes.

AI will produce Baumol effects on steroids. For the few things humans still have a comparative advantage at doing, real wages will be much higher than before AI, because AI will have drastically increased productivity everywhere else.

I don't think there are any truly universal "human values", only values that various subsets of humans have.

Capitalism optimizes for the profit returns of capital investors, which is a value of some humans.

AI-alignment people talk about whether a future superintelligence can be aligned with human values in general — with ideas like "friendliness", "coherent extrapolated volition", etc. I'm applying that notion to a different optimization process that we live with already.

@mongo None of the options mention "truly universal" values. I don't understand your comment.

Relatedly, why didn't you choose "Capitalism optimizes for some other subset of human values, and Goodharts the rest."?
