
What is longtermism? Longtermism is an ideology that emerged from the so-called Effective Altruism movement over the past decade. It claims that influencing the future—hundreds, thousands, millions, and even billions of years from now—is a key moral priority of our time, if not the key priority. The reason, as William MacAskill and Hilary Greaves argue, is that there could be vast numbers of people—perhaps 10^45—living in giant computer simulations running on planet-sized computers powered by Dyson spheres spread throughout the Milky Way galaxy, or beyond. Hence, if one wants to "do the most good," one should focus on these possible future people—e.g., by making sure that they exist in the first place—rather than on, say, helping the ~1.3 billion people currently living in multidimensional poverty.


Problematic implications. Longtermism has many problematic implications, and it could be extremely dangerous if those in power were to take it seriously (see Peter Singer's article here). Consider MacAskill's argument that, from a longtermist perspective, it is important to stop burning fossil fuels as soon as possible. Why? The primary reason is not climate justice, the suffering of those in the Global South, or the harms inflicted on ecosystems and other living beings. Rather, it is that we should save some coal and oil to burn later, in order to rebuild Western industrial civilization should it collapse, since Western industrial civilization looks to be a necessary stepping stone to colonizing space, converting planets into computers, and simulating huge numbers of "happy" digital people. Or consider Bostrom's claim that minuscule reductions in "existential risk"—that is, the risk of any event that would prevent us from becoming a superior species of posthumans or from colonizing space to simulate people—are morally equivalent to saving billions and billions of actual human lives. Or consider Nick Beckstead's assertion that, since what matters more than anything is shaping the very far future (up to billions of years from now), we should prioritize saving the lives of people in rich countries over saving the lives of people in poor countries. These are deeply troubling assertions, and as readers will discover by exploring the articles on the Critiques page, they barely scratch the surface of how this ideology could cause serious harm in the world.
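
To see why this framing can license such conclusions, it helps to spell out the expected-value arithmetic behind it. The sketch below is ours rather than Bostrom's or MacAskill's: it reuses the 10^45 figure cited above, takes the current world population to be roughly 8 × 10^9, and picks a deliberately tiny, purely illustrative risk reduction of 10^-15.

\[
\underbrace{10^{45}}_{\text{possible future people}} \times \underbrace{10^{-15}}_{\text{illustrative risk reduction}} = 10^{30} \ \text{expected future lives}
\]

\[
\frac{10^{30} \ \text{expected future lives}}{8 \times 10^{9} \ \text{people alive today}} \approx 10^{20}
\]

On this arithmetic, shaving an almost imperceptibly small sliver off "existential risk" counts for roughly twenty orders of magnitude more than the welfare of everyone now alive, which is exactly the conclusion critics find so alarming.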


The enormous influence of longtermism. Yet longtermism is already hugely influential. It is shaping our world in profound ways right now, and it will likely do so even more in the future, as longtermists run for public office, court tech billionaires, consult on or author major government reports, and possibly even promote their ideology through Hollywood-style action movies. Consider, for example, that the richest man in the world, Elon Musk, calls longtermism "a close match for my philosophy." The longtermist Toby Ord has "advised the World Health Organization, the World Bank, the World Economic Forum, the US National Intelligence Council, the UK Prime Minister's Office, Cabinet Office, and Government Office for Science." A recent report from the Secretary-General of the United Nations, to which Ord contributed, discusses "existential risks" and specifically references "long-termism." The crypto-billionaire Sam Bankman-Fried, a committed longtermist, funded the 2022 congressional campaign of fellow longtermist Carrick Flynn and says he could donate as much as $1 billion to influence the outcome of the 2024 US presidential election. And the Effective Altruism movement itself has around $46.1 billion in committed funding. To quote Émile P. Torres, "longtermism might be one of the most influential ideologies that few people outside of elite universities and Silicon Valley have ever heard about."


Understanding longtermism. We believe it is imperative for people to understand what longtermism is and why it could be dangerous. It offers a deeply impoverished picture of the future. It holds that more technology will solve the problems created by technology, and it sees Western industrial civilization as the pinnacle of human development. It ignores alternative views of the future—e.g., Indigenous views, and those built around queerness and disability—and it has roots in the quasi-religious ideology of transhumanism, which itself grew out of the Anglo-American eugenics movement. (Indeed, Bostrom himself identifies "dysgenic pressures" as an existential risk no less serious than thermonuclear war and runaway climate change.) When MacAskill implicitly asks in his new book on longtermism, "What do we owe the future?", one should ask in response: whose future is he talking about? The future of Indigenous peoples? Of non-Western cultures? Of Islam? Of the environment, ecosystems, and other living beings on this planet? There is a reason that powerful tech billionaires, such as Musk, and long-time defenders of "Western civilization" and race science, such as Sam Harris, are so keen on longtermism. As Torres writes:


The popularity of this religion [i.e., longtermism] among wealthy people in the West—especially the socioeconomic elite—makes sense because it tells them exactly what they want to hear: not only are you ethically excused from worrying too much about sub-existential threats like non-runaway climate change and global poverty, but you are actually a morally better person for focusing instead on more important things—risks that could permanently destroy "our potential" as a species of Earth-originating intelligent life.


Caring about the long term. It is precisely because we care about the long-term future of humanity, the environment, and our fellow creatures on Earth that we are so worried about "longtermism," which goes far beyond long-term thinking, a practice we believe is very important. We hope that this website provides a useful resource for those trying to better understand the growing spectre that haunts humanity—the spectre of longtermism.

Who we are. We are a growing community of researchers who are worried about the influence of longtermism. More to come.
