Within the framework of effective altruism, choosing one’s career is as important as choosing where to donate. EA defines professional “fit” by whether a candidate has a comparative advantage, such as superior intelligence or entrepreneurial drive. If an effective altruist qualifies for a high-paying path, “earning to give” — building wealth in order to redirect it to EA causes — is a celebrated career choice. Bankman-Fried has said that he is earning to give, and that he founded the cryptocurrency platform FTX with the express purpose of building wealth so he could redirect 99% of it. Now one of the richest crypto executives in the world, Bankman-Fried says he plans to give away up to $1 billion by the end of 2022.
“The beauty of effective altruism is that it’s a ready-made way to become a highly sophisticated, impact-focused, data-driven funder,” says David Callahan, founder and editor of Inside Philanthropy and author of The Givers, a 2017 book on philanthropy trends. EA not only offers a clear and decisive framework; the community also provides a set of resources for would-be EA funders — including GiveWell, a nonprofit that uses EA-driven evaluation criteria to recommend charities; EA Funds, which allows individuals to donate to curated pools of charities; 80,000 Hours, a career-coaching organization; and a vibrant discussion forum at effectivealtruism.org, where leaders such as MacAskill and Ord participate regularly.
Effective altruism’s initial focus on measurement has brought rigor to a field that has historically lacked accountability for big donors with surnames like Rockefeller and Sackler. “It’s a belated, much-needed counterbalance to the typical practice of elite philanthropy, which has been very inefficient,” Callahan says.
But where exactly do effective altruists direct their money, and who benefits? As with all giving, EA or otherwise, there are no established rules for what counts as “charity.” Charitable organizations benefit from a tax code that incentivizes the ultra-wealthy to establish and control their own charitable ventures, at the expense of public tax revenue, local governance, and public accountability. EA organizations are able to leverage the practices of traditional philanthropy while basking in the glow of an effective, disruptive approach to giving. The movement has formalized its community’s commitment through the Giving What We Can pledge — an echo of another old-fashioned philanthropic practice — yet there is no public accountability for whether pledgers follow through. Tracking the full impact of EA’s ideas is tricky, but 80,000 Hours estimates that $46 billion was committed to EA causes between 2015 and 2021, with donations growing about 20% annually. GiveWell calculates that in 2021 alone it directed more than $187 million to malaria nets and medicines; by the organization’s own estimates, this will save more than 36,000 lives.
Accountability is much harder to establish for long-term causes such as biosecurity or “AI alignment” — a set of efforts aimed at ensuring that the power of AI is harnessed toward goals humans consider “good.” For a growing number of effective altruists, these causes now take precedence over bed nets and vitamin A supplements. “The things that matter most are the things that have a long-term impact on what the world will look like,” Bankman-Fried said in an interview earlier this year. “There are trillions of people who have not yet been born.” Bankman-Fried’s view is shaped by longtermism’s utilitarian calculations, which flatten lives into single units of value. By this math, the trillions of humans yet to be born represent a greater moral obligation than the billions alive today. Any threats that could prevent future generations from reaching their full potential — whether through extinction or technological stagnation, which MacAskill argues in his new book, What We Owe the Future, is equally dire — take first priority.
In the book, MacAskill recounts his own journey from longtermism skeptic to true believer and urges others to follow the same path. The existential risks he describes are concrete: “The future could be terrible, falling to authoritarians who use surveillance and AI to lock in their ideology, or even to AI systems that seek to gain power rather than promote a thriving society. Or there could be no future at all: we could kill ourselves off with biological weapons, or wage an all-out nuclear war that causes civilization to collapse and never recover.”
It was to help guard against exactly these possibilities that Bankman-Fried created the FTX Future Fund this year as a project within his charitable foundation. Its focus areas include “space governance,” “artificial intelligence,” and “empowering exceptional people.” The fund’s website acknowledges that many of its bets “will fail.” (Its main goal for 2022 is to test new funding models, but the site does not define what “success” would look like.) As of June 2022, the FTX Future Fund had made 262 grants and investments, with recipients including a Brown University scholar researching long-term economic growth, a Cornell University scholar researching AI alignment, and an organization working on legal research around AI and biosecurity that grew out of Harvard Law School’s EA group.
Bankman-Fried isn’t the only tech billionaire backing longtermist causes. Open Philanthropy, an EA charitable organization funded primarily by Moskovitz and Tuna, has directed $260 million since its founding toward addressing “potential risks from advanced artificial intelligence.” Together, the FTX Future Fund and Open Philanthropy have provided more than $15 million this year to Longview Philanthropy ahead of that group’s announcement of a new longtermist fund. Vitalik Buterin, one of the founders of the blockchain platform Ethereum, is the second-largest recent donor to MIRI, whose mission is to “ensure [that] smarter-than-human artificial intelligence has a positive impact.” MIRI’s donor list also includes the Thiel Foundation; Ben Delo, co-founder of the cryptocurrency exchange BitMEX; and Jaan Tallinn, one of Skype’s founding engineers, who also co-founded the Centre for the Study of Existential Risk (CSER) at Cambridge. Elon Musk is another tech tycoon dedicated to fighting longtermist existential risks. He has even claimed that his for-profit enterprises — including SpaceX’s mission to Mars — are philanthropic endeavors supporting humanity’s progress and survival. (MacAskill recently expressed concern that his philosophy is being conflated with Musk’s “worldview.” But it is EA’s own goal to expand its audience, and it seems unreasonable to expect strict adherence to the exact belief system of the movement’s founders.)