Addicted to potential
Because I like to spend my leisure time making myself mad, I've been reading More Everything Forever by Adam Becker (affiliate link) and getting all het up about effective altruism and long termism.
The logic of effective altruism follows this path: you should do what is in your power to help others. Therefore you are morally obligated to find a high-paying job so you have a lot of money with which to help others. However, it is incumbent upon you to ensure that maximum help is being achieved with your money; you want to help in the most effective way possible.
This is obviously fucked up from the outset. Some people need expensive help. Some people need cheap help. It's not a waste to help the first group.
But effective altruism is worse than that. The effective altruists have done a whole lot of maths to figure out how best to spend their money and come to the conclusion that they should be concentrating on the hypothetical, space colonising people of the future.
There are more people living in this mythical future by several orders of magnitude, you see, so focussing your attention on helping them is much more effective than throwing your money away on helping the people who are alive and suffering right now.
Crucial to this idea, which very rich and powerful people are genuinely putting their money behind, is the belief that a) it will one day be possible to colonise space, b) it is morally correct to do so no matter how much strip mining for resources it requires, and c) there is no sacrifice humans living now can make that is not worth this future.
If you want more detail on how and why this future is scientifically improbable in the extreme, More Everything Forever is a good, if infuriating, read. But the short version is that it requires tech we don't know how to build, resources we do not have even if we plundered the whole planet, and the assumption that steamrolling our way through the universe is our right as humans.
But in exchange for all this you get to believe in a utopian future, in which aging is optional, peace is assured, and poverty is eradicated. Pick a problem you have on earth, and the long termists and effective altruists will be able to tell you how it is solved in this future.
Part of this is because we will, naturally, develop an artificial general intelligence. This is expected to grow out of the current glut of large language models. It makes no sense, not least because no one can define what artificial general intelligence means, and large language models bear no resemblance at all to real human thought. But what really gets to me about this particular belief is the conviction that an AGI will solve all our problems.
This is literally the argument: it's ok for current AI data centres to be reversing our progress on climate change because when we have an AGI it will tell us how to actually solve it. An AGI will know how to end war, poverty, sickness – all of it!
This is obviously naïve. The reality is that we already know how to deal with so many of these problems; we may not have total solutions but we know how to profoundly mitigate their harms. The reason the problems persist is not that we have failed to think of the right solution, it's that there is no political will to put those solutions in place.
Those with wealth and power don't want to make sacrifices for the betterment of humanity – that won't change just because it's an AGI telling them what to do instead of scientists. Which, to me, exposes the real rot at the centre of this viewpoint: these are rich and powerful people who want to have their cake and eat it too. They want to be the heroes who save the human race but they don't want to have to give anything up to do it.
They are pouring money into the fantasy of an AGI led, space faring future because they want to convince themselves that that future is possible, that they can have everything they want without giving anything up, that if only they are able to build more data centres they'll find a way to live forever among the stars.
Literally, many of them believe that in this future they solve the problem of death.
It's at that point that it really comes home to me that this is a cult. A religion where the source of your faith is the ceaseless march of technological progress.
I've been thinking a lot about religion recently – or rather, over the whole course of my life. Not in any way related to my own personal faith, which is ever changing and yet constant, like the tides, but as a sociological phenomenon. Why do we create gods for ourselves? Why do we feel this need to believe in something beyond our own understanding?
The modern era is a fascinating time for this question. So much of religious practice throughout history has been about confronting a lack of control. Prayer and sacrifice to a being of infinite power, to ensure a good harvest, to protect from illness.
But we're more sceptical now, and more scientific. We have more control over our lives than we've ever had before – although not as much as we like to think – and what we can't control we no longer attribute to divine intervention.
But we still look for something.
I think part of this is that we struggle to emotionally deal with the inevitability of our own squandered potential.
We are large, we contain multitudes and we feel all of them and yet our lives are so small. So short, so economically constrained, so limited by our fragile bodies and their tendency to break down over time.
There is so much we might do, so much we might feel, if we were not so limited in scope.
The difficulty we face in coping with the limitations of one human life – so much of religion is an answer to that. We crave a greater world beyond this one where we can fully realise all those parts of us that feel core to our identities but we will never explore or express.
You endure your feeble life on earth and your reward is the infinite in heaven.
It's not an uncommon theme in science fiction either, and so much of the effective altruist, long termist viewpoint is centred around science fiction.
Space colonisation, light speed travel, uploaded consciousness, sentient AIs, transhumanism, the singularity, yes, but also the concept of unlocking the infinite.
It's intoxicating.
When I watch, for the severalth time, Everything Everywhere All At Once, the idea of not just knowing, but channelling all the possible versions of me that don't exist in this one and precious life I'm living is viscerally compelling. It feels like they're all right there, on the edge of my consciousness – the one where I'm physically powerful, the one where I'm a rock, the one where I'm married to Jamie Lee Curtis.
This is what I think it all adds up to. The whole thing is a failure to grasp that, whatever choices you make in life you are ruling others out. You carry the infinite within you but you are heartbreakingly finite.
So you invent a future in which you don't have to be.
Within a religious ethos, this is the result of personal sacrifice. You live with humility so in death you might be elevated to glory. It's also the result of your personal journey. You follow your faith and it's no one's business but yours.
That isn't true for the long termists. They're making this everyone's problem.
They're sacrificing the planet we have to push us towards ones we cannot survive on. They're exploiting us in the present so they can live forever in the future.
And because so many of them are billionaires (Elon Musk, Sam Bankman-Fried, Peter Thiel, Bill Gates, etc, etc) they are being taken seriously. They are pulling billions in investment money from the venture capitalists they've duped, and they're persuading our governments to prioritise their data centres over our environment.
We should all be so mad about this.
They are delusional. In a just world they'd be laughed out of every room they walked into.
They're pathetic, maladjusted losers who lack the emotional maturity to accept and relish in the imperfect wonder that we are.
What they don't grasp, what they're afraid to confront, is that death is a feature, not a bug. Limitations are crucial to humanity.
Not to view every problem through the lens of my own special interests, but anyone engaged in creative work knows this.
You can actually see it in the raft of directors who made a couple of fantastic low budget indie films, then got handed a billion dollar franchise film and whiffed it completely. Pushing against constraints is often what helps you achieve something great. It sparks real innovative thinking.
Endless possibility is stifling. It's overwhelming, and when it's in front of us we tend to reject it. Because what really matters isn't our infinite unrealised potential – it's the myriad tiny joys of our short lives.
The point, for example, in Everything Everywhere All At Once is that, able to channel every possible reality, Evelyn really just wants to hang out with her daughter.
Real wisdom and real joy come from recognising this. It's agonising that so many powerful people are wilfully holding themselves back from this awareness.
You could almost feel sorry for them.
If they weren't fucking everything up for the rest of us.