A False Account of Transhumanism


Nov 23 2024
"Ontological Bridge" by Lincoln Cannon

Physicist and neuroscientist Àlex Gómez-Marín has a beef with “the false religion of transhumanism.” Interesting title. It seems to suggest that he thinks there’s a true religion. I wonder what he thinks that is.

His subtitle claims that Transhumanism is an “AI death cult.” Heard that before, about every religion that’s ever been large enough to gain a critic. And of course this criticism of Transhumanism is far from new.

There’s some deep irony in the recurring claim that Transhumanism is a death cult. It usually comes from people who aspire to immaterial heavens or reject conceptions of heaven altogether. In contrast to their escapism and nihilism, Transhumanists aspire to persisting in a better world that’s as real as the light you’re using to read these words. Whether we call it “heaven” or not, it functions as a substantial affirmation of life.

Defining Transhumanism

Alex begins his criticism with an appeal to the risk of artificial intelligence. No disagreement there. Like all intelligence, including biological intelligence, AI is simply the goal-oriented application of power. That application of power, the goal of AI, can be for good or evil. And the goals of an intelligence have little to do with its degree of intelligence.

In that context, Alex proposes a strawman definition of Transhumanism:

“Let us start with transhumanism, the movement that advocates for the ideological possibility (we wish), technical feasibility (we can), and moral imperative (we must) to tinker with the human condition in order to ‘enhance,’ so they say, our species, biologically and cognitively.”

All ideologies, including whatever motivates Alex, are “we wish.” None matters, including whatever Alex tries, unless “we can.” And only moral nihilists avoid “we must.” So the opening to his definition is just noise.

The second half is worse than noise. Transhumanists are merely tinkerers, he says, falsely implying that we’re indifferent to outcomes. And he puts “enhance” in scare quotes, either falsely suggesting that our interest in enhancement is only nominal, or arrogantly implying that he knows better than we do how we should enhance ourselves.

Alex does briefly explore the meaning of “enhancement,” suggesting that Transhumanists actually and paradoxically pursue a diminishment or eradication of our capabilities. That aligns well with his article’s subtitle. But unfortunately for the article, it’s nothing more than a strawman of Transhumanism. And his silly insinuations are a lot like my saying that Alex only claims to be a physicist while actually and paradoxically functioning as an advocate of consuming humanity in a black hole.

Soul Copies

Alex says Transhumanists want to “copy life, edit humanity, and delete death.” This is actually better. We do essentially wish to pursue all of those goals. And in that, we’re not so different from the most powerful human ideologies presently and historically, including Christianity.

But Alex doesn’t like it. It’s an “ontological sleight of hand” in simulation and mimicry – a counterfeit. His tastes aside, we and our biological children are essentially modified copies of DNA that have repeatedly edited humanity. Would he characterize us as mere simulation and mimicry?

I doubt it. My guess is that Alex harbors the notion that humans have antinatural immaterial souls that are altogether different in kind from anything else in our world. Of course, there’s no evidence for that. And there’s abundant practical reason to suppose humans, both our bodies and minds, operate much like the world around us.

Singularitarian Obsolescence

Alex says that Transhumanists are pursuing the Technological Singularity. That’s careless of him. Some of us are Singularitarians. Some of us aren’t.

I’m not a Singularitarian because I consider the concept to be a failure scenario. If there’s ever a moment or period of time when humanity loses all ability to predict or control technological change, we’re almost certainly doomed. Most Transhumanists aren’t interested in human extinction. We aspire to remaining in control of our future, even while it’s enhanced with the powers of technological change.

Do we want to become more than human? Sure. We and most other humans want to become more than human. The most influential ideologies on Earth, presently and historically, all teach that humans have greater potential than we’ve yet realized.

Do we want to make humans obsolete? That depends on what you mean. If the better me makes the worse me obsolete then I’m in favor of that. But I certainly don’t wish to give up anything good about who I am already.

Alex is particularly concerned that we may wish to “extinguish our animal species into the machine.” Does he feel that way about our prehuman ancestors? Does he feel that way about our human ancestors from long ago who would no longer recognize what Earth has become? Humans have been cyborgs from the moment we harnessed fire, intentionally extending our natural machinery into artificial machinery.

“Is language an autocomplete process? Is thought simply problem-solving? What is intelligence, after all? Is creativity automatable? Is life mechanizable? Is consciousness digitizable? Is reality a simulation? Really!?”

Nope. Not really. That’s really just an extension of Alex’s strawman. Transhumanism is not remotely so shallow as he’d like his readers to believe.

Religious Inheritance

Alex says that historical proto-Transhumanists such as Dante Alighieri and Pierre Teilhard de Chardin weren’t Transhumanists in today’s sense. Of course, acknowledging them as such would be far too inconvenient for his criticism. He wants today’s Transhumanists, in comparison, to be mere imbeciles. I wonder if Alex has been taking heat from anti-scientific religious zealots who characterize physicists like him as mere imbeciles for denying young-Earth creationism.

“Transhumanism offers a set of goods that are typically the province of religions,” continues Alex. He and I agree on that. Transhumanism does indeed often function as misrecognized religion. But of course he thinks that’s a bad thing.

I’m not sure exactly why. He seems careful to avoid making clear his own religious persuasion. But he’s certainly critical of Transhumanism’s audacity. With something akin, if not functionally identical, to religious zeal, we do indeed aspire to make our bodies and minds, our relations, and our world much better.

In that, we hardly differ from most of the rest of humanity. Almost everyone aspires to a better world. Almost all human ideologies, particularly religions, encourage this aspiration. Transhumanists just tend to be much more practical and less escapist than most.

But Alex can’t stand it. He’s sure we don’t care if “all hell breaks loose.” I wish he’d pause and think about how incoherent that would be. Why would we put so much concern and effort into enhancement aspirations if we don’t care that all hell breaks loose?

Technological Theology

Alex does some theology. Transhumanists are building a new God, he says. And it will replace “the Gods themselves” with a “new cult of digital totalitarianism.” He’s only partly right.

Although many Transhumanists shy away from characterizing our aspirations with such words, I and other Mormon Transhumanists certainly don’t. We embrace the ancient and enduring doctrine of theosis, long taught by Christians and even earlier religions. As Jesus teaches in the New Testament, we should become one with God, joint-heirs in that glory, partaking of the divine nature. We should, indeed, become new Gods.

What he gets wrong, however, are the notions of replacement and totalitarianism. We don’t become God by replacing God, at least not any God worthy of worship. That indeed would be a totalitarian aspiration like that which the New Testament attributes to the antichrist. In stark contrast, we become God by consoling, healing, and raising each other together in Godhood, as exemplified and invited by Jesus.

Now Alex may wish to marginalize my theological response, suggesting that it characterizes only Mormon and Christian Transhumanists. But he’d be making a mistake. Although most secular Transhumanists would reject the language I’m using here (and suppose themselves to be rejecting the ideas), they nonetheless generally show themselves to be embracing such ideas in function.

Despite exceptions, most secular Transhumanists oppose totalitarianism. And many if not most even have strong concerns with excessive authoritarianism. On the whole, we’re just as opposed to such potentialities as Alex feigns to be. If his pretentious indignation would permit, we’re mostly on the same side of this issue.

Ontological Bridges

But Alex seems too angry about the Transhumanist strawman that’s haunting his vision. Too angry, and too confused, he cannot make heads or tails of whatever he supposes Transhumanist metaphysics to be. Despite the vast diversity of perspectives among Transhumanists, he just wants us to know that “the gulf between body and soul, matter and mind, or brain and consciousness remains unbridgeable from the outset.” In other words, how dare we attempt to bridge that which his metaphysics has solemnly declared to be unbridgeable?!?

I have a simple proposal for Alex. Maybe it’s not Transhumanists who are responsible for befuddled metaphysics at the intersection of body and mind. Maybe it’s just a really challenging issue. And maybe even the most solemn neuroscientists have something to learn.

If my simple proposal is true then the feasibility of some Transhumanist aspirations is uncertain. For example, we don’t know for sure whether brain emulation will work. And that’s actually true. We don’t know.

Guess who else doesn’t know. Alex doesn’t. Probably no neuroscientist on Earth knows. We just don’t have an answer to this question yet.

But we do know some related matters. We know that biological brains work. That is, unless we’re solipsists, we’re confident that the dynamic growth and structure and function of brains correlate with conscious experience. So we do have a natural object, the brain, that strongly suggests proof of concept for dynamic mind-correlate machinery.

We also know that we’re more likely to discover or create that which we try to discover or create. If we decide, in advance, that something is impossible then we don’t try. And when we don’t try, we’re far less likely to see whether we’re right. So we have good practical reason to try, if we care about the possibility.

Technological Anthropology

Moving past metaphysics, Alex returns to theology. Or is it anthropology? He ends up mixing the two, which is ironic because he criticizes Transhumanists for doing just that. “Rather than aspiring to be one with their creator (if there is one), they dream of merging with their own creations (Promethean incest?),” says Alex.

Actually, however, we don’t have to choose. Of course atheists of all stripes, Transhumanist or not, will disregard the possibility of becoming one with our creator. But as the New God Argument implies, we’re probably not going to end up merging with our intelligent creations unless we began with an intelligent creator.

Spiritual Procreation

Predictably, Alex brings up sex. Critics of Transhumanism frequently bring up sex, using sensationalism to prop up their vacuous rhetoric. He observes that some Transhumanists aspire to “literally having sex with their mechanically/digitally resurrected deceased loved ones.” How monstrous!

Or is it? What if humans, all of us, are biological machines living in a computed world? I’m not suggesting that we’re mere robots in a simulation. I’m suggesting that our best biological and physical theories may be consistent with the idea that human bodies are biological machines and our universe is a physics computer.

If that’s right then we’re already literally having sex with machined and computed loved ones. That’s how we make children, who inherit from us biological machine bodies that have evolved to function effectively in this physics computer. And if we trust in any real resurrection of the dead, as so many humans do and have done for millennia, then we’re essentially trusting in an eventual renewal of our machined and computed loved ones.

Now Alex was rather huffy about what he considered to be Transhumanist reductionism. And I’ll warn him and others, here and now, not to engage in precisely what he was being huffy about. Don’t assume that machined and computed loved ones are merely simulated robots. Don’t be the very reductionists you claim to oppose.

To the best of my ability to discern, my loved ones are spiritual computers. I am a spiritual computer. We’re not computers like the ones on our desks. But we’re still computers, in the broadest sense of the word.

Now maybe we have antinatural immaterial souls. I don’t think we do. And we have literally no evidence for such “things” – nor could we. But just for the sake of argument, let’s suppose we do.

How does your antinatural immaterial soul get into your natural material body at birth? And why wouldn’t your answer apply to the placement of antinatural immaterial souls into natural material bodies at resurrection? Despite humans being intimately involved in the conception and birthing processes, and even despite extensive sexual promiscuity (which I oppose), it seems like humans keep getting souls at birth. Why should we expect anything different at resurrection?

Anti-Death Cult

Alex returns to the “death cult” charge:

“On the other hand, the movement is a kind of death-cult, articulating and promoting a self-immolating future for our species, for the supposed benefit of a post-human race that shall be better equipped, happier, and live forever here on planet earth and soon depart beyond the stars.”

I wonder if he’s as confused by this as I am. He says we’re a death cult because we want to be happier and live forever. And he says we want to live forever on Earth and soon depart to the stars. Which is it?

If, despite the incoherent articulation, he just means that we embrace evolutionary change then we’re guilty as charged. We embrace such change, and more. We aspire to injecting ourselves intentionally into that change. We hope to make it better than it would be if we were to remain disinterested.

This isn’t such a novel aspiration. Christianity calls this repentance. Repentance is change. And in Christianity the anticipated change of repentance goes far beyond just being nicer.

The apostle Paul describes with excitement an anticipated day of transfiguration. The dead will rise and the living will change from mortality to immortality. And despite some escapist and nihilistic interpretations among some Christians, the Bible characterizes that immortality as one of embodiment, of “flesh and bone” as the resurrected Jesus describes himself.

While most Transhumanists are not Christians, most still aspire to an approximation of that same hope. We wish to live, not just metaphorically and not just meagerly. We wish to live literally, and beyond present notions of aging and death. And we plan to do something about that.

That plan also isn’t particularly novel. Jesus commands his disciples to the same goal. “Raise the dead,” he says. And Transhumanists, of the Christian variety and many others, take that seriously.

Greedy Utilitarianism

Alex goes on to blame Transhumanism for pursuit of excessive wealth. Of course no one should take him seriously. Interest in and pursuit of wealth occurs among most humans, regardless of our various perspectives on whether and how to apply technology to human enhancement. Someone somewhere probably thinks that people like Alex have excessive interest in wealth.

Is wealth evil? No. It’s just another form of power, a technology. And, like all technology, that power can be used for good or evil.

Alex also blames Transhumanism for the idea that a possible large number of future persons should have greater weight in moral calculations than the actual number of present persons. If he doesn’t like the idea, it must be Transhumanism! But I have news for him. Plenty of, maybe even most, Transhumanists have concerns with such calculations.

Count me among them. Actual persons should always matter more than possible persons. Actual risks should always matter more than possible risks. Concrete challenge should always be weighted more heavily than abstract challenge.

Long Term Altruism

Alex jumps the shark with his summary of this idea, which he attributes to Transhumanism. And his summary exemplifies just how bad, how poorly considered, his characterization of and charges against Transhumanism really are. Here it is:

“Thus, their [extreme longtermism] concludes that the best thing to do is to use [effective altruism] to fund AI on steroids to push [Transhumanism], even if that entails killing our own species in the process (or ‘simply’ neglecting real problems like hunger, health, war, polarization and so on). Dystopia to achieve utopia is not just fine, it is the rational and right thing to do. That’s a deadly step for humanity, a giant leap for post-mankind.”

That would indeed be a death cult, as Alex claims. But he’s also describing a vanishingly small portion of Transhumanists. Most of us, in stark contrast, care a great deal about the lives that we and our family and friends are living right now. Most of us wish for these lives to be happier and healthier and longer, so that we may enjoy them together even more.

There are, indeed, many Transhumanists who advocate for longtermism. But there are multiple approaches to this idea. A few are anti-humanistic, meriting criticism like that offered by Alex. Most are compatible with a genuine humanism that would respond to the problems that we have already caused in the world, in the here and now, by being chronically short-sighted in the past.

There are also many Transhumanists who advocate for effective altruism. But, again, there are multiple approaches to this idea. A few use bizarrely-weighted calculations to arrive at disconcerting conclusions that merit criticism. Most are compatible with a genuine humanism that would respond to real problems in the way that we have traditionally allocated our time and resources.

Fortunately, albeit unfortunately for Alex’s credibility, most Transhumanists just aren’t what he wants his readers to believe us to be. Of course there are extremists and fools among us, as is the case among all ideologies with a significant number of adherents. But claiming, as Alex does, that we’re all extremist fools for the reasons he proposes is just naive or dishonest.

Redirection

Alex begins to conclude by suggesting that “the dilemma between utopia and dystopia is a trick of misdirection.” Indeed. He knows something about this, clearly, as he’s been engaging in misdirection throughout his entire article. Maybe his readers will believe that Transhumanists actually fit into the “one-dimensional thinking” or “two-alternative forced choice” characterization that he proposes.

I think, however, that most of his readers, perhaps with some assistance, will recognize that the aims of Transhumanism are consistent with holistic human thriving. His readers will realize that poetry and love and the soul would be more at risk if we were to give up the ancient and enduring quest for human enhancement. His readers will realize that Transhumanism is, at least in intention if not always perfectly in practice, synonymous with the betterment of humanity – coherent betterment entails change.

Then Alex concludes with a question:

“What will you tell your granddaughters when they ask you what you did when there was still something to be done about this mess we are in?”

I will tell my granddaughters (whom I’m very much looking forward to meeting) what I’ve already been telling their parents for decades. With your whole soul, mind and body, trust in and change toward our shared potential for superhuman courage, compassion, and creation. This is true religion. More importantly, this is eternal life.