To which organization should I donate in 2023?
resolved Dec 20
  • Rethink Priorities: 18%
  • Longtermism Fund (Longview/GWWC): 27%
  • Center on Long-Term Risk: 1.3%
  • EA Animal Welfare Fund: 3%
  • CLR Fund: 1.5%
  • EA Long-Term Future Fund: 3%
  • Patient Philanthropy Fund: 1.7%
  • Center for Reducing Suffering: 1.8%
  • Center for the Governance of AI: 3%
  • Sentience Institute: 2%
  • The Good Food Institute: 2%
  • Wild Animal Initiative: 1.1%
  • Against Malaria Foundation: 3%
  • GiveWell: 8%
  • Happier Lives Institute: 1.3%
  • Humane League: 18%

Inspired by this market by Aaron Bergman.

My moral views are:

  • Basically classical utilitarianism (balance of happiness over suffering is the ultimate good)

    • Sympathetic to tranquilism (absence of craving is the only intrinsic good, which puts contentment and pleasure on equal footing)

    • Sympathetic to a view on which happiness and suffering both have value/disvalue, but the amount of happiness needed to outweigh suffering rapidly approaches infinity as suffering intensifies (e.g. Bergman 2022); a toy formalization is sketched after this list

    • I know that logarithmic scales of pleasure and pain exist, but I'm not sure what to do with that information

    • Ideally, I'd like to give some weight to each of these views.

  • Future beings matter, but I'm not sure how effective it is to try to improve the future

    • I accept that reducing x-risk is one way to do that, provided that the future is good on balance, but I'm not sure the future will be net positive absent explicit intervention

    • I'm sympathetic to the thesis of "Existential risk pessimism and the time of perils", as well as David Thorstad's other criticisms of common longtermist views (e.g. here)

    • I'm genuinely unsure what the levels of x-risk and s-risk are

  • Non-human beings matter (to the extent that they're sentient).

    • I'm confident that most vertebrates and many invertebrates are sentient but will defer to experts like Rethink Priorities on this.

    • I think digital minds could be sentient too, but I'm unsure whether they will actually be created and how numerous they might be in a realistic future.

      • This is a crucial consideration for me, so I think research into digital sentience is an important cause, though I could be convinced that another cause would be a better use of additional dollars.
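
As a toy formalization of the "rapidly approaches infinity" view above (the functional form is my own illustrative assumption, not taken from Bergman 2022), total welfare could be

$$
U \;=\; \sum_i h_i \;-\; \sum_j w(s_j),
\qquad
w(s) \;=\; \frac{s}{1 - s/s^{*}}, \quad 0 \le s < s^{*},
$$

where $h_i$ are happiness intensities, $s_j$ suffering intensities, and $s^{*}$ a threshold. Mild suffering ($s \ll s^{*}$) trades off roughly one-for-one against happiness, but $w(s) \to \infty$ as $s \to s^{*}$, so no finite amount of happiness outweighs suffering at or beyond the threshold.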

Some additional beliefs I hold:

  • I think diversity and inclusion in the EA community is important, so I am more inclined to support organizations and projects that make the EA community more inclusive, all other things being equal.

  • I think global priorities research is very important, as it helps resolve crucial considerations like the plausibility of artificial sentience, but I could be convinced that other projects are a better use of additional money.

  • I believe liberal democracy and respect for human rights are very important for instrumental reasons, as they lead to a happier world. I am open to them mattering intrinsically as well, but I am confident that utilitarianism provides a sufficient reason to value them.

When nominating an organization, please explain why you think it would be good for me to support it according to my moral views. For example, if you recommend Rethink Priorities, you might want to explain why global priorities research is high priority. You can also argue against a view that I hold or for a particular view on an uncertainty I have (e.g. "the future will be net positive").

Ideally, the organization should be enrolled on Benevity so I can request an employer donation match. If you want to nominate an organization that is not a 501(c)(3), such as a political campaign, that's fine too.

This market will close about a week before Giving Tuesday (November 28), but I might extend it if that works better.



Given current funding realities, your suffering focus plus your moderate stance on longtermism vs. neartermism suggests animal welfare to me.

bought Ṁ10 of GiveWell

Added anti-poverty charities. They seem pretty worthwhile considering your preference for respecting human rights, diversity and inclusion, and your scepticism about improving the future.

@B Thanks for the suggestions! While I care about human flourishing, I think charities focused on global health and development (GHD) for humans are unlikely to beat out animal-focused charities, because animal suffering is larger in scale and seems more neglected:

  • According to the stats here, farmed land animals outnumber humans on Earth 3-4 to 1, and farmed fish outnumber humans 10-13 to 1. Wild animals are even more numerous: there are between 10^3 and 10^5 wild vertebrates for every human on Earth.

  • Animal welfare charities consistently receive about one-third as much EA funding as GHD charities (from eyeballing the chart here).

I might be convinced that GHD charities are more cost-effective than animal welfare ones if they have much stronger theories of change or if the area turns out to be ~30x as tractable. A rough sketch of the scale-vs-funding arithmetic is below.
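
As a minimal sketch of the comparison above (the population ratios are the ones cited; the ~8 billion human baseline, the geometric midpoints, and the 1/3 funding split are my own rough assumptions):

```python
# Back-of-envelope scale and neglectedness comparison.
# Ratios are the ones cited above; the human baseline, midpoints,
# and funding split are rough illustrative assumptions.

human_population = 8e9  # assumed ~8 billion humans

# Animals per human (low, high), as cited above
ratios = {
    "farmed land animals": (3, 4),
    "farmed fish": (10, 13),
    "wild vertebrates": (1e3, 1e5),
}

for group, (low, high) in ratios.items():
    mid = (low * high) ** 0.5  # geometric midpoint of the cited range
    print(f"{group}: ~{mid:,.0f}x humans, ~{mid * human_population:.1e} individuals")

# Neglectedness: animal welfare draws ~1/3 the EA funding of GHD
# (eyeballed above) while covering vastly more individuals.
funding_ratio = 1 / 3   # animal welfare funding / GHD funding (assumed)
scale_ratio = 3.5       # farmed land animals alone per human (midpoint)
per_individual = funding_ratio / scale_ratio
print(f"Funding per farmed land animal vs per human (GHD): ~{per_individual:.2f}x")
print(f"i.e. roughly {1 / per_individual:.0f}x more neglected per individual")
```

Even counting farmed land animals alone and ignoring fish and wild animals, funding per individual comes out around an order of magnitude lower than for GHD under these assumptions, which is what drives my prior here.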