Stephen: [00:00:00] Most of the inventions of civilization are [00:00:02] really life extension technologies. You know, [00:00:04] hunting and agriculture.
Dave: I've done [00:00:06] enormous amounts of technology assisted [00:00:08] meditation, and I've had near [00:00:10] death experiences a couple times, and I [00:00:12] actually don't have any fear I can detect [00:00:14] of dying.
The ancient quest for [00:00:16] human immortality has collided [00:00:18] with ultra modern developments in [00:00:20] biology and AI, but [00:00:22] philosopher and writer Stephen Cave, [00:00:24] whose research interests at [00:00:26] Cambridge University in Britain include [00:00:28] artificial intelligence, says life [00:00:30] extension technology is fraught with [00:00:32] ethical problems,
Stephen: but I think [00:00:34] accepting mortality is important [00:00:36] for living well. When you are [00:00:38] 179 and things are going [00:00:40] well, you might regret having signed up to [00:00:42] 180. But don't die? It's not gonna happen. [00:00:44] Even if we could cure all aging [00:00:46] and disease, we would still be [00:00:48] susceptible to accidents.
Dave: You talk [00:00:50] about four immortality [00:00:52] narratives, the big stories we [00:00:54] tell ourselves to escape [00:00:56] death. What are they and how do they shape [00:00:58] our society? [00:01:00] You are listening to the Human [00:01:02] Upgrade with Dave Asprey.[00:01:04] [00:01:06]
Stephen, you've [00:01:08] studied life and [00:01:10] death. Would you [00:01:12] choose to live to 200 [00:01:14] years or more if you could? [00:01:16]
Stephen: Right now? I would [00:01:18] take a gamble on [00:01:20] 200. I think more [00:01:22] than that might be pushing it, but [00:01:24] uh, you know, I'm a quarter of the [00:01:26] way there and so far [00:01:28] so good.
Dave: Is that because it's all you think [00:01:30] is possible, or is this something about your [00:01:32] relationship with time and meaning [00:01:34] and sense of self or some deeper thing?[00:01:36]
Stephen: I think there are some profound [00:01:38] challenges that would come from [00:01:40] living a lot longer. [00:01:42] What we don't know is when they really [00:01:44] kick in, but there are challenges like [00:01:46] boredom and meaninglessness and [00:01:48] procrastination, and those are just the individual [00:01:50] challenges, not even to mention the challenges for [00:01:52] society.
Um, but [00:01:54] at the same time, life is precious. I'm [00:01:56] enjoying mine. That makes me very lucky. [00:01:58] And so I'll be [00:02:00] willing to gamble on 200. [00:02:02]
Dave: What would make you bored?
Stephen: [00:02:04] Yesterday I was standing in a [00:02:06] queue for most of the day to get a [00:02:08] visa. Um, I'm [00:02:10] lucky I have the kind of interests, you know, [00:02:12] intellectual interests, reading and writing and [00:02:14] so on, that you can [00:02:16] fill, you know, many, many, many [00:02:18] centuries with.
And, [00:02:20] you know, I like being with people [00:02:22] and playing games and doing sport and so on. [00:02:24] So those are all things [00:02:26] that, um, I think I could enjoy [00:02:28] for a very long time if I have [00:02:30] access to them. Of course, uh, [00:02:32] you know, if you lock me up in a cell by [00:02:34] myself, then I would go mad very, very [00:02:36] quickly. So I [00:02:38] do have happy visions of living for a long [00:02:40] time, but they do require [00:02:42] access to the right kind of resources and people.[00:02:44]
Dave: Are you afraid of going mad? [00:02:46]
Stephen: Yes, I think I probably [00:02:48] am a bit. [00:02:50] I mean, [00:02:52] I don't feel like it's imminent. [00:02:54] Um, but under certain [00:02:56] circumstances, um, I [00:02:58] think, you know, we as [00:03:00] humans, our, our mental states are pretty fragile. [00:03:02] Um,
Dave: Mine too. [00:03:04] If you were to go mad, would you [00:03:06] know?
Stephen: I think so. [00:03:08] Um, yeah. [00:03:10] I mean, mad, you know, it's a very [00:03:12] vague term.
Yeah. You know, most people, [00:03:14] if they go mad, it doesn't mean they suddenly think [00:03:16] they're Napoleon or, you know, a [00:03:18] cow or something. And [00:03:20] mental health, uh, you know, [00:03:22] being mentally well, is about [00:03:24] not being depressed and having a sense of [00:03:26] meaning and purpose. And I think, you know, [00:03:28] lacking those things is something we feel [00:03:30] as a real pain or, uh, [00:03:32] a real lack.
Dave: [00:03:34] I have this beautiful book from the [00:03:36] 1950s called The Three Christs of [00:03:38] Ypsilanti. And it's about [00:03:40] a psychiatrist who took three [00:03:42] people who believed they were Jesus [00:03:44] and put them in a room together and [00:03:46] had them hang out for a few weeks [00:03:48] and watched what they did. What [00:03:50] would you predict happened?[00:03:52] [00:03:54]
Stephen: Um, [00:03:56] I would predict [00:03:58] that they each found, [00:04:00] individually, that their views were not [00:04:02] challenged by the presence of the other [00:04:04] Jesuses, but they kept [00:04:06] believing individually that they were Jesus in the [00:04:08] face of evidence to the contrary.
Dave: It's [00:04:10] true. They actually all worked out a [00:04:12] story where they could all be Jesus, so that [00:04:14] none of them would have to pierce the other [00:04:16] ones' belief [00:04:18] system, which is remarkable. And it [00:04:20] shows a lot about [00:04:22] human beliefs. They're a little bit more [00:04:24] malleable than we might think. Right. [00:04:26] Is the quest for immortality [00:04:28] about enlightenment, or is it [00:04:30] just hidden ego? [00:04:32]
Stephen: I, I'm inclined to think [00:04:34] more the latter. Um, [00:04:36] of course there are versions of the story that [00:04:38] focus on enlightenment, [00:04:40] um, but is enlightenment the precondition for [00:04:42] immortality, a certain kind of [00:04:44] reward?
Uh, is it [00:04:46] necessary for immortality? Do you need to [00:04:48] be enlightened in order to [00:04:50] avoid boredom and meaninglessness and the [00:04:52] problems of procrastination? You know, [00:04:54] there's a story to tell [00:04:56] around that, about how only if we give [00:04:58] up worldly desires and [00:05:00] their transience could we actually have a [00:05:02] happier immortality.
But I think [00:05:04] most people who want to live forever, either they [00:05:06] want more of the good times that they've [00:05:08] got now, which you know, and why not, [00:05:10] or they're afraid of death. [00:05:12]
Dave: The Buddhist teachers I've studied [00:05:14] with universally teach that [00:05:16] boredom and procrastination are simply [00:05:18] manifestations of the ego. Do you agree?[00:05:20]
Stephen: Well, certainly there [00:05:22] are practices that you can find in a [00:05:24] number of wisdom traditions, if you [00:05:26] like, like Buddhism and [00:05:28] Stoicism, which I think is very similar, you know, coming from a [00:05:30] different part of the world, that [00:05:32] teach you both to [00:05:34] be less focused on [00:05:36] yourself and your own ego, [00:05:38] and thereby to cope better [00:05:40] with challenges [00:05:42] like boredom and frustration and [00:05:44] anger and all of the other kind of [00:05:46] tribulations of, um, [00:05:48] everyday life.
So certainly [00:05:50] I think if we, if we desired [00:05:52] less, if we cared less [00:05:54] about, you know, what we have [00:05:56] and our reputation and suchlike, [00:05:58] and focused more on inner [00:06:00] peace, then we would be able [00:06:02] to withstand, um, even [00:06:04] queuing for a visa for a lot [00:06:06] longer.
Dave: Sometimes I hope that with [00:06:08] the power of compound interest, [00:06:10] uh, that at a certain point you can hire [00:06:12] someone to wait in line for you.[00:06:14]
Stephen: Yeah, I know. [00:06:16] One day. Yeah.
Dave: What [00:06:18] if death is [00:06:20] actually the thing that makes [00:06:22] civilization improve?
Stephen: Yeah. [00:06:24] Well, death [00:06:26] plays an important role. I mean, [00:06:28] you know, we focus on life, [00:06:30] what the living do, [00:06:32] their, their deeds. Okay. We [00:06:34] might see that sometimes people are [00:06:36] motivated by, [00:06:38] um, their legacy [00:06:40] or by devising [00:06:42] ways of staying alive longer.
I mean, I [00:06:44] think, you know, most of the inventions of [00:06:46] civilization are really life extension [00:06:48] technologies, you know, hunting and [00:06:50] agriculture, like they give us food, [00:06:52] um, and food storage systems, [00:06:54] walls to keep out attackers and [00:06:56] uh, and so on. Not [00:06:58] to mention medicine and science and technology.[00:07:00]
So, um, the [00:07:02] desire to postpone death as much as [00:07:04] possible is certainly motivating [00:07:06] civilization. Um, but [00:07:08] at the same time, an awareness of [00:07:10] death and the fact that time is limited [00:07:12] also drives people [00:07:14] to make the most of life and do great [00:07:16] things.
Dave: Your take [00:07:18] that pretty much all [00:07:20] technology is life extension
[00:07:22] makes me just profoundly [00:07:24] happy. Uh, I agree with you, by the way. [00:07:26] The question though [00:07:28] is why do people have no [00:07:30] problem with taking a car instead of [00:07:32] walking. But if you [00:07:34] talk about extending life, a lot [00:07:36] of people get really triggered by that. So you [00:07:38] can't do that. And even some people who [00:07:40] call themselves longevity doctors say it's [00:07:42] impossible to extend life, but I'm a longevity doctor. And, [00:07:44] like, what is the, [00:07:46] what is the human resistance to [00:07:48] life extension?[00:07:50]
Stephen: I think we have a strong sense of what [00:07:52] is natural. Now, [00:07:54] ironically, our sense of what is [00:07:56] natural is very socially conditioned. [00:07:58] But, uh, all [00:08:00] societies have a strong conception of [00:08:02] life stages, [00:08:04] and a lot of [00:08:06] meaning and [00:08:08] rituals, um, place and [00:08:10] purpose and life connections, are [00:08:12] based on those life stages. So, you [00:08:14] know, um, you go to school, you get [00:08:16] your education.
Uh, you, [00:08:18] you, you start work, you [00:08:20] become a family man or woman. [00:08:22] Um, at some point you retire and [00:08:24] so on. And you know, there's a path [00:08:26] life is expected to take, [00:08:28] and we build meaning [00:08:30] around that. Um, [00:08:32] meaning that makes the difficult bits of that [00:08:34] okay, like the kind of degeneration that [00:08:36] comes with age. And, you know, lots [00:08:38] of traditions value the wisdom that comes with [00:08:40] age or see it as a spiritual [00:08:42] time, a physical decline but spiritual [00:08:44] growth, and so on.
So all of these stories help [00:08:46] us to cope with the [00:08:48] reality of, uh, life, [00:08:50] uh, uh, the shape of life as it is [00:08:52] now. And so it becomes very [00:08:54] challenging, I think, for people to [00:08:56] imagine throwing all that out and, [00:08:58] you know, certain people [00:09:00] doing something completely different and [00:09:02] instead maybe living much longer.
Dave: The [00:09:04] Eriksonian stages of [00:09:06] adult development are, are pretty well [00:09:08] defined with the stages of life, and [00:09:10] they appear to be relatively constant [00:09:12] across cultures. So there's something that [00:09:14] happens with normal aging. [00:09:16] But even in the last [00:09:18] 30 years, we've extended the average [00:09:20] lifespan in the US by six and a half [00:09:22] years, which is kind of funny when people say you [00:09:24] can't do it, like, we already did.
Do [00:09:26] you think that each of the stages [00:09:28] gets lengthened? Or [00:09:30] do we go through the stages and just add [00:09:32] more to the last one, when we're our [00:09:34] wise elders?
Stephen: We [00:09:36] definitely have seen lengthening of, uh, the [00:09:38] first stage of education, if you like, of [00:09:40] childhood. Older people complain [00:09:42] that, you know, at the age of [00:09:44] 30 kids these days still seem to be [00:09:46] children.
Um, but of [00:09:48] course education, I mean, you know, formal [00:09:50] education as, as in the school system, [00:09:52] is a fairly recent [00:09:54] invention, and the age at which people [00:09:56] are expected to stay in school has increased [00:09:58] from, like, you know, [00:10:00] single digits to, to teens, [00:10:02] and now many people go on to college and [00:10:04] uh, and, and so on. So we are extending the [00:10:06] period of education.
Um, [00:10:08] and, uh, the period, I mean, [00:10:10] work is more complicated because we invented [00:10:12] this thing of retirement and having [00:10:14] pensions and so on that allows people to think [00:10:16] about the latter stages of life [00:10:18] very differently. So [00:10:20] working life hasn't increased as [00:10:22] much as it might have proportionally, [00:10:24] but, um, but fortunately a lot of people [00:10:26] are staying fitter.
[00:10:28] For longer. So it is not [00:10:30] just that the period of [00:10:32] decrepitude, if you like, is, is getting [00:10:34] longer, right?
Dave: You talk about [00:10:36] four immortality [00:10:38] narratives. The big stories we [00:10:40] tell ourselves to escape [00:10:42] death. What are they and how do [00:10:44] they shape our society?
Stephen: [00:10:46] So yes, I think across cultures, [00:10:48] across history, you can see that every [00:10:50] society has told itself [00:10:52] immortality stories, like ways in [00:10:54] which we can deal with death.
And there are [00:10:56] some very clear themes, I think, [00:10:58] and, and [00:11:00] basically four kinds of, [00:11:02] um, immortality story. [00:11:04] The first one, the most straightforward, [00:11:06] is staying alive in this body, [00:11:08] on this world, you know, [00:11:10] life as we know it, but [00:11:12] extended. And, you know, it might [00:11:14] sound implausible in [00:11:16] the face of inevitable [00:11:18] decline and aging and and [00:11:20] so on. And yet,[00:11:22]
almost every culture has some [00:11:24] story of an elixir of life or [00:11:26] something like that that can keep [00:11:28] people going indefinitely. You know, [00:11:30] the oldest work of literature that we have, the Epic [00:11:32] of Gilgamesh, which is many, many thousands [00:11:34] of years old, um, was already [00:11:36] thousands of years old when the Egyptians were, [00:11:38] you know, building their pyramids, is about [00:11:40] an elixir of life and, and so on.
And [00:11:42] today here we are, you know, hoping that science and [00:11:44] technology will provide one for us. So [00:11:46] this is a, a very widespread story. [00:11:48] But of course everyone knows it's [00:11:50] very unreliable. You know, lots of people have [00:11:52] pursued an elixir of life. [00:11:54] What they all have in common is they're all six feet [00:11:56] under, pushing up daisies right now.
You know, no [00:11:58] one's found it. So, [00:12:00] um, we need backup plans. [00:12:02] And the most [00:12:04] obvious, the second [00:12:06] main kind of immortality story, is [00:12:08] resurrection. So accepting you have to [00:12:10] physically die, but hoping you [00:12:12] can physically come back to life. [00:12:14] And again, that might sound implausible. We don't [00:12:16] tend to see humans just, like, springing [00:12:18] up from the earth, but we do [00:12:20] see natural cycles of death[00:12:22]
and rebirth and life. Of [00:12:24] course, you know, if you live in the [00:12:26] northern hemisphere like I do, you see very [00:12:28] strong seasonal changes, [00:12:30] and in, in winter everything dies back, and in [00:12:32] spring, right now it's Easter this weekend [00:12:34] for me here, um, as we're recording, [00:12:36] um, things burst back into [00:12:38] life. And Easter is of course a [00:12:40] literal, uh, celebration, if you're a [00:12:42] Christian, of coming back to [00:12:44] life, of physical resurrection.
And actually [00:12:46] it's part of, uh, [00:12:48] Orthodox Judaism, [00:12:50] Christianity, and Islam that we are [00:12:52] physically resurrected. So it's still a [00:12:54] common story. At the same [00:12:56] time, it has profound, um, [00:12:58] philosophical challenges. Um, what [00:13:00] makes this sort of recreated person that [00:13:02] reemerges the same one as the [00:13:04] one who died? So, you know, in [00:13:06] early Christianity, when it was, um,[00:13:08]
before it became the official [00:13:10] religion of the Roman Empire, the Romans would [00:13:12] torment the Christians, and they would [00:13:14] torment them especially around this belief around [00:13:16] resurrection. And they would, like, throw [00:13:18] Christians to be eaten by lions and then [00:13:20] burn the remains, even the [00:13:22] lions, and, uh, [00:13:24] scatter anything that was left on the river [00:13:26] and say, resurrect that.[00:13:28]
Right. And, and [00:13:30] it's dark. You know, you can say, oh, well, God can do [00:13:32] anything, but, but what makes that [00:13:34] recreated entity [00:13:36] um, uh, the [00:13:38] same, yeah, the same one? Um, [00:13:40] so, uh, a lot of people [00:13:42] think, well, okay, bodies are just too unreliable, this isn't [00:13:44] gonna work. What we need is something [00:13:46] immaterial, something non-, [00:13:48] uh, bodily, that isn't [00:13:50] subject to aging and disease, and, [00:13:52] and that's the soul.
And [00:13:54] it's also a very ancient belief. So this is the [00:13:56] third kind of immortality story. It's a very ancient [00:13:58] belief, and it's also [00:14:00] intuitive, but in a different way. You know, [00:14:02] we find it easy to imagine that [00:14:04] we can, you know, close our eyes and [00:14:06] drift away from our body. We seem to do it [00:14:08] in dreams and mystical [00:14:10] experiences.
So it's a very common view [00:14:12] around the world, and, and this [00:14:14] view as an immortality story says the [00:14:16] soul is indestructible. It's [00:14:18] not material, and so it's not [00:14:20] subject to the kind of decay and [00:14:22] dissolution of material things. [00:14:24] Um, so it can, it can live on indefinitely. Of course, [00:14:26] that view is incorporated into [00:14:28] Christianity and Islam, [00:14:30] complementing the resurrection story, but can also be [00:14:32] found in Hinduism and, you know, [00:14:34] to an extent in Buddhism.
And it's a bit more [00:14:36] mm-hmm. Complicated. Um, [00:14:38] anyway, but you know, the soul [00:14:40] also has its problems, right? The, the [00:14:42] evidence of science is that [00:14:44] what's important to us in terms of our mind, our [00:14:46] personality, our consciousness isn't [00:14:48] immaterial, but actually is [00:14:50] dependent upon something very material in our [00:14:52] brain.
And so, uh, [00:14:54] again, people seek an alternative [00:14:56] kind of immortality story. And the [00:14:58] fourth kind of immortality story is [00:15:00] surviving through some kind of legacy. [00:15:02] So that might be a cultural [00:15:04] legacy, like, you know, a great work, [00:15:06] or, or, you know, Achilles, uh, [00:15:08] in, uh, the Iliad. He stands at the beach [00:15:10] of Troy and has to decide [00:15:12] whether to go home and live a long life as[00:15:14]
a minor king of a nice [00:15:16] little kingdom, or to [00:15:18] die at Troy. And it's prophesied by, by his [00:15:20] mom, who happens to be a goddess, that [00:15:22] if he stays at Troy, he'll die, but he'll [00:15:24] become the most famous warrior ever to [00:15:26] have lived. And, and he chooses that. So, [00:15:28] or we might, of course there are other kinds of legacy, biological [00:15:30] legacy of course, of living on through our children [00:15:32] and, uh, and
Dave: so the Genghis Khan [00:15:34] method, or maybe the Elon method these days.
[00:15:36] Yeah.
Stephen: Okay. Yeah, exactly. So those are [00:15:38] the four fundamental stories that you [00:15:40] can find all around the world.
Dave: Which [00:15:42] one do you subscribe to? [00:15:44]
Stephen: Well, [00:15:46] uh, I guess as a philosopher, I'm skeptical [00:15:48] about resurrection. I'm skeptical about [00:15:50] the soul. I want [00:15:52] to stick around in this [00:15:54] body. Well, as we mentioned, a couple hundred [00:15:56] years would be good.
At the same time, I don't [00:15:58] think I'm gonna do so forever. You know, [00:16:00] even if all of our medical [00:16:02] interventions work, at some point the earth's just gonna get [00:16:04] eaten up by the sun. [00:16:06] So, um, stick around for [00:16:08] as long as we can. But also, I [00:16:10] guess, you know, I am a [00:16:12] writer. I, I enjoy writing books. I [00:16:14] enjoy having an impact on the world [00:16:16] that I think will outlast me, [00:16:18] and I hope my children will live on.
It's very [00:16:20] meaningful to me too. So, so I guess [00:16:22] I'm interested in one and four, [00:16:24] but at the same time, I don't wanna be [00:16:26] sucked too much into them. I, I, I, [00:16:28] I think that we do also need to face [00:16:30] the reality of death.[00:16:32] [00:16:34] [00:16:36]
Dave: I've spent a lot of time thinking [00:16:38] about this and studied many different [00:16:40] lineages and, and direct [00:16:42] experience of things, and I [00:16:44] decided that the rational half of [00:16:46] my, uh, being [00:16:48] can believe one thing, [00:16:50] um, because it makes sense [00:16:52] and it matches what the [00:16:54] mystical half of me also [00:16:56] experiences. And the rational [00:16:58] side says that the only[00:17:00]
um, worldview [00:17:02] that makes sense to believe in [00:17:04] is, uh, reincarnation. [00:17:06] And the reason for [00:17:08] that is that if I'm wrong, I won't know, [00:17:10] but believing in it makes me less [00:17:12] fearful and therefore I have more [00:17:14] joy and a better experience in the life that I have. [00:17:16] If I'm right, great. Then I have [00:17:18] some kind of attainment and all that sort of stuff.[00:17:20]
And the mystical side of me that has had [00:17:22] direct experience of things I cannot [00:17:24] explain in any other way [00:17:26] than past lives, and teachers who see them, [00:17:28] and other people where, when [00:17:30] people feel safe to talk about it, [00:17:32] even people who have, you know, a faith where [00:17:34] it's not supposed to be possible to do that, they're saying, well, yeah, [00:17:36] I kind of experienced that, but I don't know what to do with it in [00:17:38] the context of the beliefs I've been taught.[00:17:40]
So I'm like, I'm just gonna pick that [00:17:42] one, and I think [00:17:44] I'll keep my body alive to at least [00:17:46] 180, reserving the [00:17:48] right to die at a time and by [00:17:50] a method of my choosing. And it's [00:17:52] very different than the, you know, the [00:17:54] dying is the worst thing ever, [00:17:56] never die. That doesn't [00:17:58] feel right to me. What [00:18:00] is the difference between never [00:18:02] dying and living as long as you [00:18:04] choose?
Why do they feel so different? [00:18:06] Hmm.
Stephen: Yeah. [00:18:08] Never dying, well, [00:18:10] as a belief, has two aspects. [00:18:12] One, I think, is it, it is [00:18:14] unrealistic, right? It's the, the [00:18:16] idea, like the don't die movement. I mean, you [00:18:18] know.
Dave: Yeah. Bryan's a friend. [00:18:20] I've written a new book on longevity, and [00:18:22] I'm, I'm the 180 guy and he's the, you know, [00:18:24] don't die at any cost.
Right, [00:18:26] exactly.
Stephen: So, you know, I'm a big fan of Bryan too, [00:18:28] and I love what he's trying to do and, [00:18:30] you know, promoting healthy living and [00:18:32] the work he's doing for, um, [00:18:34] expanding our ideas of [00:18:36] what's possible. But, um, but [00:18:38] don't die? It's not gonna happen. Um, [00:18:40] though, you know, even if we could cure all [00:18:42] aging and [00:18:44] disease, we would still be susceptible [00:18:46] to accidents.
Um, so, [00:18:48] you know, one researcher has predicted that in the [00:18:50] US you'd live to about 5,000, [00:18:52] um, before an accident gets you. Obviously [00:18:54] that's sort of an average. Some people [00:18:56] would, you know, get hit by a [00:18:58] bus at the age of 10 and some people would live to [00:19:00] 10,000. But, but, um, [00:19:02] even that assumes civilization will [00:19:04] continue in something like its current form, which is [00:19:06] incredibly unlikely.
That's not [00:19:08] what the lesson of history teaches us. And then of [00:19:10] course, you know, we've got the whole death [00:19:12] of the solar system and then of the universe. [00:19:14] So, [00:19:16] don't die, you know, it's a, [00:19:18] it's, it's a powerful slogan, [00:19:20] but it isn't realistic. Well, does [00:19:22] that matter? I think in a [00:19:24] way it does, because I think accepting [00:19:26] mortality is important for [00:19:28] living well.
So, and that's [00:19:30] why I think there's a difference between [00:19:32] the 180 and the [00:19:34] no, I don't wanna die ever. [00:19:36] I mean, you know, Dave, when [00:19:38] you are 179 and things are going [00:19:40] well, you might regret having signed [00:19:42] up to 180. [00:19:44] Um, but still, it suggests that, [00:19:46] you know, fundamentally, [00:19:48] you accept at some point you have to move [00:19:50] on.
It's inevitable; whether at a time [00:19:52] of your choosing or not, it's gonna happen. [00:19:54] And maybe that's also a passing on [00:19:56] of the baton to, you know, another [00:19:58] generation, and so on. [00:20:00]
Dave: It's funny, I always say at least 180, [00:20:02] uh, to reserve the right to extend it, [00:20:04] just in case I decide that that would be more [00:20:06] fun. But [00:20:08] ultimately, as it is, I've done [00:20:10] enormous amounts of technology [00:20:12] assisted meditation, and [00:20:14] I've had near death experiences a couple times, [00:20:16] and I actually don't have any [00:20:18] fear I can detect of dying.
I'd rather [00:20:20] not experience pain. I'd rather not die [00:20:22] now. But I've gotten [00:20:24] to that point, which is unusual [00:20:26] and maybe pathological, I don't really [00:20:28] know. Uh, but I'm, I'm immensely [00:20:30] curious about it. And I've come to look at [00:20:32] death and birth as [00:20:34] things that always happen. So, [00:20:36] you know, do what you can, [00:20:38] but just not to worry.
So I don't spend [00:20:40] energy on it. And along the [00:20:42] way, the, the New [00:20:44] York Times once wrote this [00:20:46] beautiful complimentary article called [00:20:48] Inside the Cult of Bulletproof [00:20:50] Coffee. It said, you know, I'm a positive [00:20:52] cult leader. And I don't run [00:20:54] Bulletproof anymore. But, um, it was [00:20:56] part of the, the formation of the [00:20:58] biohacking movement.
And more [00:21:00] recently, uh, Bryan Johnson said my [00:21:02] only competition is Jesus. [00:21:04] So is it [00:21:06] possible that biohackers are [00:21:08] today's new high priests? [00:21:10]
Stephen: Well, they are prophets [00:21:12] of a, of a movement of a certain [00:21:14] kind, absolutely. And they're inspiring a [00:21:16] lot of people. [00:21:18] Um, and I think, [00:21:20] like any high [00:21:22] priest, we [00:21:24] should be asking ourselves, [00:21:26] um, are they inspiring them for the [00:21:28] good, for, for, for the good of those [00:21:30] people?
And I think, you know, obviously in any [00:21:32] kind of movement there can be charlatans, but probably [00:21:34] most high priests in most religions right [00:21:36] now believe they're good people doing the right [00:21:38] thing. And I think that's true in the [00:21:40] biohacking and, and longevity market. Of [00:21:42] course, you know, there's a lot of cynicism, and [00:21:44] people like Bryan and [00:21:46] others get a lot of criticism, and I think it's [00:21:48] unfair.
Dave: Uh, you know, I think lots of [00:21:50] trolls.
Stephen: Lots of trolls, exactly. And, you [00:21:52] know, um, and I, and [00:21:54] I think, and I think it's unjustified [00:21:56] because these are people who are doing [00:21:58] what they think is right [00:22:00] in advocating for [00:22:02] longevity. But, uh, like any high priest, we [00:22:04] should interrogate, is that the right thing for society?[00:22:06]
And I think the answer is, um, partly [00:22:08] yes, that, you know, it's great [00:22:10] to show people [00:22:12] ways of living healthier [00:22:14] and longer and more [00:22:16] optimistically and, you know, there are [00:22:18] also challenges that we'll need to [00:22:20] overcome to make that go well.
Dave: Uh, [00:22:22] that was very, a, a, Britishly [00:22:24] politely put. [00:22:26] Thank you. [00:22:28] Are we rejecting [00:22:30] nature when we try to transcend the [00:22:32] body?
Stephen: I don't worry [00:22:34] about that. No, I don't think... I mean, nature's not a [00:22:36] fixed thing, right? Nature's a process of, [00:22:38] uh, constant [00:22:40] change, and, um, [00:22:42] we [00:22:44] are part of nature, we're products of nature. [00:22:46] We need to, we need to, you know, [00:22:48] accept that, um, and [00:22:50] work with it. And, you know, any [00:22:52] biohacking will be [00:22:54] accepting that and working with it in, in, in its [00:22:56] own way.
So of, of [00:22:58] all of the complaints and criticisms that might [00:23:00] be leveled, right? That's not one that would keep me [00:23:02] awake at night.
Dave: It always makes me [00:23:04] laugh. It's profoundly egoic to think [00:23:06] that what humans create isn't a part of nature. Yeah. We [00:23:08] just do a bad job of creating it. [00:23:10] Right. Like [00:23:12] Hefty trash bags are a part of [00:23:14] nature.
They're just one that [00:23:16] is formed to not break down, and it doesn't [00:23:18] help the rest of the system. But it was [00:23:20] made by meat, uh, just [00:23:22] like, you know, bees make wax or [00:23:24] whatever. So I think it's an artificial distinction, [00:23:26] as if somehow humans are not a part of [00:23:28] nature, and whatever. Yeah. Do you [00:23:30] think immortality is just a status game?
Stephen: [00:23:32] Mm-hmm. [00:23:34] It, it doesn't need to be, I [00:23:36] think. Um, certain immortality [00:23:38] systems are, though. We mentioned [00:23:40] Achilles already. He wanted to [00:23:42] be the greatest warrior who had ever [00:23:44] lived and to have statues built of [00:23:46] him. Well, we can't all be the [00:23:48] greatest warrior who's ever lived. [00:23:50] So immortality in [00:23:52] Homeric, uh, Greece [00:23:54] was very much a, a status game.[00:23:56]
And you know, there are parallels with celebrity [00:23:58] culture today. It's [00:24:00] different, you know, you don't need to be the greatest warrior who's ever [00:24:02] lived. You just need to have a cute kitten [00:24:04] or, you know, something, [00:24:06] something similar. [00:24:08] Um, so, but you [00:24:10] know, if you're, if you are competing [00:24:12] for likes or, or whatever it might be to, [00:24:14] to, to give you that sense of [00:24:16] immortal fame, then that's [00:24:18] a, a status game.
But then [00:24:20] Christianity came along. [00:24:22] Christianity arose [00:24:24] and flourished at a time when these [00:24:26] Greco-Roman views around [00:24:28] heroism and the cult of the individual, [00:24:30] of, you know, the, the ruler as [00:24:32] god, of the great hero and so on, were dominant, [00:24:34] and, and, and Christianity [00:24:36] came along and said, no, no, even the [00:24:38] meek can be
uh, uh, [00:24:40] immortal. Even, even the slave and the [00:24:42] downtrodden can live forever. So [00:24:44] that was very much reversing [00:24:46] the status game and saying, no, no, immortality [00:24:48] could be for everyone. So I think it really varies [00:24:50] depending on the stories being [00:24:52] told.
Dave: If you could upload [00:24:54] yourself to the internet, would it still [00:24:56] be you?
Stephen: No. [00:24:58] I [00:25:00] think we can be clear about that. I think, [00:25:02] I mean, you know, I love the [00:25:04] stories that are told around [00:25:06] this, that are very compelling. It's very easy to [00:25:08] tell a story that makes this sound [00:25:10] convincing. Right? Sort of, you know, Dave, I'm [00:25:12] gonna put you on the slab and I'm gonna make you [00:25:14] immortal by replacing every one of your [00:25:16] neurons one at a time with some, you know, [00:25:18] digital silicon equivalent.
Dave: I kind of wanna do that. [00:25:20]
Stephen: Right. And, you know, somehow that [00:25:22] makes it... And I promise, at no point will you [00:25:24] lose consciousness, and, you know, 24 [00:25:26] hours from now you'll be this digital [00:25:28] being and you can roam the internet. And [00:25:30] it sort of sounds like a plausible story until [00:25:32] you start poking. But I mean, the [00:25:34] main, the clearest refutation of, [00:25:36] of, um, the [00:25:38] idea that uploading preserves [00:25:40] identity, personal identity, [00:25:42] is duplication.[00:25:44]
That if, if we reduce you to [00:25:46] data, then I can [00:25:48] create as many versions as I like. Maybe I already have. [00:25:50] Maybe I've created a whole Dave [00:25:52] factory of Daves in Australia [00:25:54] somewhere, in a, a bunker in [00:25:56] Norway. And, and if I said [00:25:58] to you, I've done this, but now [00:26:00] I'm gonna have to kill you, I'm sorry, [00:26:02] then, um, would you find that reassuring?
I don't know, [00:26:04] like, maybe someone just keeps going with the [00:26:06] podcast, but, like, it wouldn't feel like [00:26:08] personal survival.
Dave: I would want [00:26:10] to take two copies of me and [00:26:12] put them in a digital room [00:26:14] together, just like the Christs of [00:26:16] Ypsilanti, and have them argue about which one was [00:26:18] the real one. And yeah, it, it's [00:26:20] absurd.
Um, that said, [00:26:22] I would love to have a digital copy of me to do [00:26:24] stuff that I don't want to do. [00:26:26]
Stephen: Yeah. But he wouldn't wanna do it [00:26:28] either.
Dave: I know, right?[00:26:30] [00:26:32]
All right. [00:26:34] Our bodies [00:26:36] don't really exist. [00:26:38] Every cell in your body will be [00:26:40] replaced over the next seven years, [00:26:42] so you are [00:26:44] really more like an eddy in a [00:26:46] stream of matter than you [00:26:48] are a piece of meat walking around. [00:26:50] So what's to stop [00:26:52] me from replacing some neurons with [00:26:54] fiber optic circuits and still being me?[00:26:56]
Stephen: Yeah, I, I mean, it's a [00:26:58] good question, and we don't [00:27:00] really know what the limit [00:27:02] is in terms of being able to maintain an [00:27:04] organism. So I think you're, you know, you're [00:27:06] right, the organisms, of course, they're made [00:27:08] of stuff, but fundamentally they're [00:27:10] also made of processes and, and, [00:27:12] um, those are more [00:27:14] important than the stuff, the stuff gets replaced, [00:27:16] as you say.
But at the same time, [00:27:18] what makes organisms different [00:27:20] to other kinds of things, like rocks or cardboard [00:27:22] boxes, is that they're [00:27:24] self-regulating and they have a [00:27:26] distinction between self and other. You [00:27:28] know, we have a, a membrane, [00:27:30] like a skin, and [00:27:32] what's going on inside is being [00:27:34] managed actively in a way that [00:27:36] combats entropy, and [00:27:38] outside isn't.
Can we introduce [00:27:40] silicon or digital or, or, you know, [00:27:42] artificial components that are [00:27:44] more resilient but still [00:27:46] part of that homeostatic [00:27:48] system? I think this is an [00:27:50] open question, how far we can push that. [00:27:52]
Dave: I mean, my dad has two [00:27:54] artificial hips. I got a screw in [00:27:56] my knee. Uh, and there are [00:27:58] people walking around like Rumsfeld, who [00:28:00] had a heart that didn't beat.
Maybe [00:28:02] that explains some of his behaviors, [00:28:04] but, uh, you know, it was actually, [00:28:06] it's a, a continuous pump, and [00:28:08] we think they're conscious and alive. [00:28:10] So I, I've seen examples, and [00:28:12] I, I have no desire [00:28:14] to put some sort of [00:28:16] technology inside my brain [00:28:18] unless I have ultimate control of the source code, [00:28:20] because you know that malware and spam are [00:28:22] the next step.
Mm-hmm. Uh, that would be [00:28:24] terrible. Yeah. Uh, would [00:28:26] you at any point introduce some tech [00:28:28] inside your brain?
Stephen: I [00:28:30] would wait to see how it goes with the [00:28:32] first million customers, I think, [00:28:34] before I'd be willing to give it a [00:28:36] go.
Dave: I, I think you might [00:28:38] be on the right track on that. [00:28:40]
Dave: Are you a transhumanist? [00:28:42]
Stephen: No. No. I, I'm [00:28:44] not. I'm sympathetic with [00:28:46] the, you know, broad [00:28:48] aims of improving the lot of [00:28:50] humanity through technology.
Obviously it's [00:28:52] gone well in many ways, [00:28:54] um, doubling human life [00:28:56] expectancy over the last couple hundred years, that [00:28:58] kind of thing. But I wouldn't [00:29:00] identify as a transhumanist. [00:29:02]
Dave: In the first decade of [00:29:04] transhumanism, the movement [00:29:06] emerged in the nineties when the [00:29:08] cyberpunk movement emerged, and I was a computer hacker back [00:29:10] then,
I thought this transhumanism stuff is the [00:29:12] coolest thing ever. And then [00:29:14] I saw some of the things I helped to create, [00:29:16] uh, in the early days of the, [00:29:18] the web and e-commerce, and I watched [00:29:20] all of them get perverted to become [00:29:22] surveillance systems and, [00:29:24] uh, basically tools of [00:29:26] control. Like, [00:29:28] ooh, you can build [00:29:30] great tech and there will always be a bad [00:29:32] person to misuse it.
And so I, [00:29:34] I would say I'm not a [00:29:36] hardware transhumanist. But [00:29:38] I do believe in maxing out the [00:29:40] hardware that I already have. I'm just not that interested in [00:29:42] replacing it yet. Which, which [00:29:44] means if you were to divide [00:29:46] transhumanism into two movements, [00:29:48] there's the, the wetware [00:29:50] side of things and then there's the hardware side [00:29:52] of things.
And I am profoundly on the [00:29:54] wetware side 'cause we barely know what our current [00:29:56] hardware can do. Uh, [00:29:58] so what do you think of [00:30:00] that division of [00:30:02] transhumanism into, like, an evolutionary [00:30:04] biology versus, [00:30:06] um, or maybe directed [00:30:08] evolutionary biology versus biology [00:30:10] replacement?
Stephen: I think, [00:30:12] you know, the, the, the aim of [00:30:14] making the most of what we've got and [00:30:16] pushing the limits of technology.
I mean, this is [00:30:18] great and it's exciting. [00:30:20] Um, it, it will lead to [00:30:22] progress that, you know, hopefully, [00:30:24] as is often the way, a broad range [00:30:26] of people have access to. So I, I [00:30:28] think, you know, there are aspects of a transhumanist [00:30:30] agenda that are [00:30:32] natural extensions of [00:30:34] what we want science and technology [00:30:36] for. They're just at the cutting edge [00:30:38] and, um, and that's great.[00:30:40]
I think my, my [00:30:42] worry around certain kinds of [00:30:44] techno utopianism and transhumanism [00:30:46] is that all of the emphasis is [00:30:48] on the tech and not enough on [00:30:50] the values and the social and [00:30:52] political context. Obviously there's a lot to [00:30:54] say about that, about the importance of social justice [00:30:56] and, and so on, and the [00:30:58] environment and all of that.
But, uh, [00:31:00] and, and there was a strand of [00:31:02] techno utopianism that just waves it away. It's, [00:31:04] it's a sort of, you know, let's create super [00:31:06] powerful AI and it will solve all our [00:31:08] problems for us, including all these messy things around [00:31:10] injustice and the environment and all that stuff. Mm-hmm. [00:31:12] Um, but even when it comes to personal [00:31:14] happiness, the
evidence [00:31:16] suggests that material improvement [00:31:18] makes very little difference to personal [00:31:20] happiness. Now, that needs to be nuanced. Of course, [00:31:22] material want is very bad, [00:31:24] and being at the poorer end [00:31:26] in a wealthy society is bad. [00:31:28] But otherwise, happiness [00:31:30] does not increase in anything like what we would [00:31:32] expect in proportion to[00:31:34]
um, uh, [00:31:36] prosperity. And of course most of our prosperity [00:31:38] comes from technology, so this is why I'm linking [00:31:40] the two. You know, we can, we can sort of create [00:31:42] this magical cornucopian world through [00:31:44] technology, but will it make us happier? [00:31:46] The answer is, well, only if we get a whole [00:31:48] lot of other stuff right [00:31:50] around personal relations.
[00:31:52] Uh, equality makes a huge difference to [00:31:54] how happy a population is, and, [00:31:56] um, uh, the environment we [00:31:58] live in makes a big difference, and so on. [00:32:00] So, you know, the transhumanist might [00:32:02] say, oh yeah, but I like all of that. I like equality [00:32:04] and I like the environment. I'd say, no, fair [00:32:06] enough. But it is this question of [00:32:08] focus, uh, what are we giving our [00:32:10] attention to?
And I think keeping our [00:32:12] attention broadly on a broad range of [00:32:14] issues and contexts is important. [00:32:16]
Dave: Yeah. I used to think I'll [00:32:18] be happy when I, I, I'm rich, [00:32:20] and when I was 26 I made [00:32:22] $6 million and [00:32:24] I, uh, I [00:32:26] looked at a friend and I said, I'll be happy when I have [00:32:28] 10 million. And yes, I was a [00:32:30] total asshole for that.
[00:32:32] Um, and I'm like, it just [00:32:34] doesn't work. And I think a lot [00:32:36] of transhumanist minded people say, I'll be happy [00:32:38] when, you know, I have robot arms. [00:32:40] The only reality I've come [00:32:42] to is I'll be happy when I'm happy. [00:32:44] There's nothing else I can find. [00:32:46] Is there something I'm missing?
Stephen: [00:32:48] Well, there is [00:32:50] evidence about what leads to [00:32:52] happiness.
Of course, it's all averages, so, you know, it's [00:32:54] difficult to say how it translates to a [00:32:56] particular individual. But, um, [00:32:58] uh, you know, social connections and personal [00:33:00] relations and a sense of purpose. [00:33:02] Um, you [00:33:04] know, for a lot of people in the transhumanist [00:33:06] movement, using technology to [00:33:08] improve the human condition gives them a [00:33:10] sense of purpose.
And that's, that's fine. And [00:33:12] if that's working for them, that's great, and it, it might [00:33:14] well also, you know, benefit society more, [00:33:16] more broadly. Um, but [00:33:18] for, you know, for a lot of people it [00:33:20] might mean something completely different, you know, [00:33:22] rescuing puppies or, um. [00:33:24]
Dave: What's your purpose? [00:33:26]
Stephen: I like, [00:33:28] um, looking after my children and [00:33:30] writing books.
And, [00:33:32] uh, though, I aim with those [00:33:34] books to say something about how we can [00:33:36] make these incredibly exciting [00:33:38] technological transformations that we're living through [00:33:40] go well. [00:33:42] So I sort of [00:33:44] can tell myself I'm doing some [00:33:46] good. Uh, as it happens, I also enjoy [00:33:48] it, um, and the same seems true of [00:33:50] hanging out with my kids.[00:33:52]
Dave: Got it. So you're [00:33:54] on the, the, the live [00:33:56] through your children, or immortality [00:33:58] through your children, path as well?
Stephen: [00:34:00] Yeah, I mean, I guess with children and [00:34:02] books I'm investing in my legacy, but [00:34:04] I feel like, you know, if [00:34:06] the world ended today, and [00:34:08] my books and my children with it, [00:34:10] I would still feel, feel like I'd [00:34:12] spent my time well.
Dave: Mm-hmm.
Stephen: [00:34:14] That,
Dave: that's a gift. And many [00:34:16] people have asked me, what do you want your [00:34:18] legacy to be? And I'm like, [00:34:20] 500 years from now, no [00:34:22] one will know my name. It doesn't matter how [00:34:24] well known I am today. It maybe some of [00:34:26] the ideas I've helped to seed will [00:34:28] exist in some evolved form, [00:34:30] but my name won't be on them.
[00:34:32] Right. And that's okay. Like I, I just don't [00:34:34] care. And. Sometimes [00:34:36] that's, that's the, the things the media [00:34:38] trainers will tell you to, to say. And [00:34:40] I'm just trying to think why would I care about [00:34:42] my legacy? I'd loved if my kids said, you know, he was a good [00:34:44] dad. Uh, whenever I pass [00:34:46] away, they may, they [00:34:48] may not have done all I can for that, but it's [00:34:50] also up to them and whatever they go through.
So [00:34:52] I. I just find [00:34:54] profound boredom in legacy. [00:34:56]
Stephen: I, I, I wish I could [00:34:58] say I was as enlightened as you, [00:35:00] Dave, because I must admit I do. It isn't enlightened.
Dave: [00:35:02] It's odd.
Stephen: I, [00:35:04] yeah, well, it's [00:35:06] realistic. Um, uh, you [00:35:08] know, I've written lots about how, [00:35:10] uh, fruitless the pursuit of [00:35:12] legacy is, but I still kind of hope people are [00:35:14] reading my books once I'm gone. [00:35:16]
Dave: Yeah. I, I mean, [00:35:18] you, you look back, we talk [00:35:20] about Achilles and Caesar and [00:35:22] Genghis Khan, but it's [00:35:24] probably 20 or a hundred [00:35:26] people who are historical figures. You know, [00:35:28] Admiral Nelson, maybe Napoleon. Yeah. But, like, [00:35:30] if you're in a different [00:35:32] culture, there's a different list, but there just aren't that [00:35:34] many.
And, you know, winning the [00:35:36] lottery is more likely than being one of those. [00:35:38]
Stephen: Yeah,
Dave: Absolutely. And if you [00:35:40] believe in reincarnation, it's probably the same people [00:35:42] coming back over and over on that list anyway, [00:35:44] which isn't really fair. So, like, [00:35:46] we're all screwed. Uh, so I'm [00:35:48] just like, I'm not gonna do that. [00:35:50]
And, having had a [00:35:52] couple near death experiences, I don't talk about [00:35:54] those that much on the show. [00:35:56] Uh, people are probably wondering, what's [00:35:58] he talking about? [00:36:00] Um, I once, um, in a [00:36:02] very elegant thing, I, [00:36:04] uh, got food poisoning. I [00:36:06] passed out throwing up. [00:36:08] I hit my head on the floor, face down in [00:36:10] it, and had my [00:36:12] emergency doctor, [00:36:14] um, mother of my children, former wife, [00:36:16] not heard my head hit the floor,[00:36:18]
um, I would've been done for. So she had to come in [00:36:20] and, you know, give me CPR and stuff. [00:36:22] Uh, and another time after a biohacking [00:36:24] procedure gone bad in the back of an Uber. [00:36:26] Um, so [00:36:28] those [00:36:30] may flavor things for [00:36:32] me, but I wanna know, from [00:36:34] your perspective, studying it, what [00:36:36] does it look like when you [00:36:38] confront your own mortality?
Hmm. [00:36:40]
Stephen: Well, it, it's [00:36:42] usually very, very powerful for people. And [00:36:44] how [00:36:46] they interpret it, of course, varies depending on [00:36:48] their broader religious and [00:36:50] philosophical framework. For some [00:36:52] people, it's proof of an afterlife because of [00:36:54] what they see. Um, [00:36:56] but I think even if you are [00:36:58] skeptical about that, [00:37:00] then it's still a profoundly meaningful [00:37:02] experience.
A lot of [00:37:04] people, when in [00:37:06] any way reminded of their mortality, [00:37:08] just become much more [00:37:10] grateful for every [00:37:12] day. Um, you know, the, [00:37:14] uh, in the [00:37:16] Psalms, I think, in the, in the Bible, in the [00:37:18] Old Testament, it says, teach us to [00:37:20] number our days so that we may gain a [00:37:22] heart of wisdom. We are really bad [00:37:24] at this.
We live like we're [00:37:26] immortal so often, you know, [00:37:28] even, even though when reminded of death we might be [00:37:30] afraid of it or in denial, whatever. [00:37:32] Um, but we don't live every day [00:37:34] as if it might be our last or [00:37:36] as if it's something to truly be treasured. And, [00:37:38] and I think when people have a near death experience, they're [00:37:40] reminded of just [00:37:42] what a gift each moment is.[00:37:44] [00:37:46] [00:37:48]
Dave: So maybe believing [00:37:50] you're going to double your lifespan [00:37:52] just fuels procrastination. [00:37:54]
Stephen: Procrastination is a real [00:37:56] risk. And, uh, it's a, it's a [00:37:58] risk already, you know, for, for many [00:38:00] people, 80 years is already [00:38:02] longer than they can get their heads [00:38:04] round in a way that allows them to get on and do what [00:38:06] they want to do.
Um, more time [00:38:08] would only make that problem worse. [00:38:10] Um, you know, the, the [00:38:12] kid who wants to go to medical school and be a [00:38:14] doctor and help people and all of that, [00:38:16] but actually likes playing video games, [00:38:18] they've found a kind of local [00:38:20] equilibrium, if you like. They're happy [00:38:22] playing video games and, you know, [00:38:24] maybe their parents keep them in pizza.[00:38:26]
Um, and they need to [00:38:28] get over a big hump to go to medical school [00:38:30] and study instead. And [00:38:32] if they know time is short, [00:38:34] I mean, it has to be long enough for [00:38:36] them to bother studying for however [00:38:38] many years, but, you know, with [00:38:40] 70, 80 years, maybe they'd be [00:38:42] more likely to do it than if they think they've got all the time [00:38:44] in the world.
I suppose one way of putting it, [00:38:46] from an economic point of view, is that [00:38:48] the scarcity of a thing tends to be related to [00:38:50] the value of a thing. [00:38:52] And so with infinite time, time [00:38:54] becomes worthless. [00:38:56] Uh, with more time it seems to have [00:38:58] less value. Now, you know, [00:39:00] we might think that would be the wrong way to look at it, [00:39:02] but I think it would be hard.
It would be [00:39:04] work. We'd have to constantly remind ourselves, [00:39:06] like the Stoics and, you know, [00:39:08] other wisdom traditions that [00:39:10] help to remind ourselves of just how [00:39:12] precious time is. I think we'd need to do that even [00:39:14] more if we had more time.
Dave: The [00:39:16] philosophers and quantum [00:39:18] physicists who explain the [00:39:20] world the best, as far as I can tell, are the [00:39:22] biocentrists, and you can prove [00:39:24] with math that time is entirely [00:39:26] an illusion made up by humans.[00:39:28]
Why does that make time valuable? [00:39:30]
Stephen: Yeah, I mean, [00:39:32] you know, there is a [00:39:34] philosophical and a scientific tradition [00:39:36] indeed that says, ah, [00:39:38] time, like, everything is present all at [00:39:40] once and you can go backwards and forwards and none of it makes any [00:39:42] difference, et cetera. Well, what do you [00:39:44] mean, you can do that? Because I can't.[00:39:46] [00:39:48]
Dave: You practical people. Yeah. [00:39:50]
Stephen: [00:39:52] You know, I can't just go back and change what I did [00:39:54] yesterday or, um, you know... [00:39:56] So, I mean, [00:39:58] there might be ways in which this is [00:40:00] true, and that might also correspond to certain kinds of [00:40:02] mystical experiences of, uh, [00:40:04] the fundamental nature of reality. [00:40:06] But the lived [00:40:08] reality for most people, most of the time, is not [00:40:10] like that.
Dave: Time is very [00:40:12] much perceivable when you're in a meat [00:40:14] body, and it very much doesn't [00:40:16] exist when you use science. [00:40:18] And that's one of those things where [00:40:20] it just, it throws me for a loop. [00:40:22] Um, but I've, I've decided that, [00:40:24] uh, I'm basically hallucinating [00:40:26] time all the time, and I'm okay with that 'cause it's a [00:40:28] useful hallucination.
There are [00:40:30] studies that show in [00:40:32] hospice if you give people [00:40:34] psilocybin, psychedelic mushrooms, even [00:40:36] one time, their fear of [00:40:38] death dramatically reduces. [00:40:40] Why? [00:40:42]
Stephen: Yeah, well, I, I'm, I'm not [00:40:44] an expert on this, but I do find it [00:40:46] very interesting and I [00:40:48] think it fits well. It, [00:40:50] it induces certain kinds [00:40:52] of experiences that some people might describe as [00:40:54] mystical, that certainly seem [00:40:56] to suggest a different way of [00:40:58] perceiving the universe, [00:41:00] reality, and one's place in it.
[00:41:02] But it's very different to being stuck in [00:41:04] time as we were just talking about, [00:41:06] and having the kind of [00:41:08] egocentric concerns that we do. [00:41:10] Um, and it seems to open up a different [00:41:12] way of seeing that [00:41:14] suggests we are more [00:41:16] one with the rest of the universe, which [00:41:18] in a sense we are, of course. Like we're all, you [00:41:20] know, we're all made of stardust,
[00:41:22] as, as the saying goes. We're, we're all made of the same [00:41:24] stuff as the universe. We come from it, we'll return to [00:41:26] it. There is a plausible way [00:41:28] of describing us, much as you did earlier, [00:41:30] Dave, that we are eddies. So, [00:41:32] you know, we're, we're waves on the sea. [00:41:34] And I think those kind of [00:41:36] psychedelic experiences, like [00:41:38] meditation and certain, you know, other ways of [00:41:40] accessing what we sometimes call mystical [00:41:42] experiences, can help to [00:41:44] reduce that sense of ego and of [00:41:46] self.
And then we're [00:41:48] less worried about the end of that [00:41:50] self, that is, death, because we [00:41:52] identify with this, you know, broader [00:41:54] vision.
Dave: How likely [00:41:56] is it that at some [00:41:58] point in the, the future, probably the [00:42:00] near future, AI will [00:42:02] become a god for humans?
Stephen: [00:42:04] Yeah. I'm not a betting [00:42:06] man. Um, but [00:42:08] uh, those who are are [00:42:10] more inclined to bet on that [00:42:12] now than they were five years ago.
Um, [00:42:14] I mean, we're seeing incredibly rapid [00:42:16] progress, of course. And, um, the [00:42:18] story some people tell around that, [00:42:20] of an [00:42:22] intelligence explosion, is that, well, [00:42:24] you know, we're now starting to use [00:42:26] AI to code for us. We're using [00:42:28] AI in scientific discovery and [00:42:30] technological, um, [00:42:32] prototyping and other kinds of, you know, [00:42:34] progress.
And once [00:42:36] AI starts to get better at lots of [00:42:38] aspects of that than we are, [00:42:40] then we'll see, you know, it will take [00:42:42] over that for us. It will build even smarter AI [00:42:44] and so on. And so we see this intelligence explosion, [00:42:46] and so very rapidly we [00:42:48] see AI so [00:42:50] powerful that it is effectively like a god [00:42:52] to us. So that's a story.
[00:42:54] It's um, it's, it's a, [00:42:56] it's a plausible story. Uh, [00:42:58] and you know, people in the know are [00:43:00] thinking this could happen in the, in the next [00:43:02] decade, so it's worth taking [00:43:04] seriously, but what kind of [00:43:06] gods there'll be and how we relate to them [00:43:08] is a very open question. [00:43:10]
Dave: I hope Mo Gawdat, who's been on the [00:43:12] show, is right when he says that if [00:43:14] we treat them with kindness, they'll [00:43:16] learn to treat us with kindness.
[00:43:18] So my number one [00:43:20] advice for people on that front is don't be [00:43:22] mean to ai 'cause you don't want it to be mean to [00:43:24] you later. What do you think of that advice? [00:43:26]
Stephen: Certainly. I think, well, [00:43:28] obviously we're the ones [00:43:30] building it, so to an extent it's gonna be in our [00:43:32] image. I mean, of course that's true at the [00:43:34] moment of, you know, the most widely used AI [00:43:36] systems.
Currently large language [00:43:38] models are literally just trained on, you know, [00:43:40] the stuff humans put into the [00:43:42] internet. So they're very much [00:43:44] reflections of ourselves, [00:43:46] and that's gonna continue to be the case. [00:43:48] Of course, they're still [00:43:50] very alien, right? They're not like [00:43:52] humans, but they are nonetheless [00:43:54]
reflections of us and our values [00:43:56] in many ways, and that's likely to [00:43:58] continue. And I think that's true of our [00:44:00] fears. You know, people who [00:44:02] fear, um, [00:44:04] you know, the Terminator-type [00:44:06] machine demon rather [00:44:08] than machine god, the one that wakes up and immediately [00:44:10] turns on us. Well, why [00:44:12] do we fear that?
Well, of course that's what we've been [00:44:14] doing to each other, you [00:44:16] know. A hundred years ago, well, [00:44:18] if you read H. G. Wells's The War [00:44:20] of the Worlds, for example, a wonderful, [00:44:22] um, evergreen book, [00:44:24] um, 125 years old now, [00:44:26] he's very [00:44:28] explicit that he's sort of satirizing, if [00:44:30] you like, the fact that some people [00:44:32] claiming to be more intelligent than others [00:44:34] use that intelligence [00:44:36] and their technological advantage [00:44:38] to decimate other people
[00:44:40] on the planet because they considered them [00:44:42] inferior. And he's very explicit that this, you [00:44:44] know, his story of aliens with more intelligence and [00:44:46] technology doing that to us, is a fable, [00:44:48] set in [00:44:50] England, of what the English did to other people. [00:44:52] And so of course, you know, we have this long [00:44:54] history, and so we project and we think [00:44:56] AI, if it's more intelligent than us, [00:44:58] surely it'll regard us as [00:45:00] inferior, it will try to exterminate us [00:45:02] just like we did to all these [00:45:04] other people, you know?
Um, [00:45:06] but of course there are different visions. It doesn't need [00:45:08] to be like that. There are different visions of what [00:45:10] intelligence is, ones more [00:45:12] closely associated with wisdom and [00:45:14] enlightenment. You know, we build an [00:45:16] AI, it's super powerful, and the first thing it does is just sit [00:45:18] under a banyan tree and [00:45:20] meditate for eternity. [00:45:22]
You know, why should we think it [00:45:24] would do these very [00:45:26] human-like things? Um, but [00:45:28] broadly speaking, yes, we [00:45:30] do need to be very careful [00:45:32] about the kind of values we're instilling in these [00:45:34] machines. It's a risk and an
Dave: [00:45:36] opportunity. I'm a hundred percent with [00:45:38] you there. You said that you're not [00:45:40] a betting man, yet
you're a [00:45:42] philosopher and you choose to write a book about one [00:45:44] topic versus another. [00:45:46] How can you not be a betting man?
Stephen: [00:45:48] Yeah, I mean, it's a fair [00:45:50] question, I think. Um, [00:45:52] well, one interesting [00:45:54] movement [00:45:56] that's grown in the last few [00:45:58] decades is around [00:46:00] thinking about [00:46:02] relatively low-probability [00:46:04] but very high-impact events
Dave: Oh, [00:46:06] the black swan.
Stephen: So black swan events. [00:46:08] Exactly. And so there are team [00:46:10] members in my institute in Cambridge who, [00:46:12] um, think about existential [00:46:14] risk; they worry about the end of [00:46:16] civilization. And, um, their [00:46:18] argument for worrying about things that other people don't [00:46:20] worry about is, well, it might seem pretty [00:46:22] unlikely, but it's not impossible.
And the [00:46:24] consequences would be so terrible that [00:46:26] it's worth having a few people thinking about it. [00:46:28] And, uh, that's how I [00:46:30] think about, say, working [00:46:32] on the ethics and impact of life extension. [00:46:34] We don't know that there'll be a breakthrough. [00:46:36] Um, lots [00:46:38] of civilizations in the past have thought they [00:46:40] were on the verge of [00:46:42] a breakthrough, and they were wrong.
But [00:46:44] you know, now I think there's [00:46:46] reason to take the prospect seriously, [00:46:48] because, you know, we've had [00:46:50] clear breakthroughs in our understanding [00:46:52] of biology, decoding the [00:46:54] genome, not to mention we can [00:46:56] significantly extend the [00:46:58] lives of other organisms in the lab. [00:47:00] So that's about as clear proof as [00:47:02] one could hope for.
So it's [00:47:04] worth taking it seriously and thinking [00:47:06] about how to make that go well. Like, [00:47:08] you know, if I'm thinking about the ethics and impact, it's not [00:47:10] because I'm against it, it's not because I'm a [00:47:12] pessimist; it's because I want it to go [00:47:14] well. So I guess, [00:47:16] you know, that's a little bit of a bet there. [00:47:18] I'm betting that the chances are high [00:47:20] enough for it to be worth investing [00:47:22] some thought.
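(A quick illustration of the expected-value reasoning Stephen describes: multiply a small probability by an enormous impact and the product can still justify dedicating some attention to the risk. The sketch below is purely illustrative; the probability and population figures are assumptions chosen for the example, not numbers from Stephen or his institute.)

```python
# Toy expected-value sketch for a low-probability, high-impact risk.
# Both inputs are illustrative assumptions, not real estimates.
p_catastrophe_per_century = 0.001   # assume a 0.1% chance of civilizational collapse per century
lives_at_stake = 8_000_000_000      # roughly everyone alive today

expected_lives_lost = p_catastrophe_per_century * lives_at_stake
print(f"Expected lives lost per century: {expected_lives_lost:,.0f}")
# Even at one-in-a-thousand odds, the expected toll is in the millions,
# which is the argument for having at least a few people think about it.
```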
Dave: I always [00:47:24] wonder about that because, [00:47:26] uh, you know, some people will, will say, well, I, I would [00:47:28] never gamble, but I think we're all [00:47:30] probability machines at [00:47:32] some part of our consciousness because you gotta go [00:47:34] left or right, so you bet on left being [00:47:36] the right way to go. Like, and, and we do it [00:47:38] unconsciously. Uh, so I just wanted [00:47:40] to get your, your [00:47:42] philosopher's take on that.
And thanks [00:47:44] for being flexible with it. If we [00:47:46] can defer [00:47:48] death, what kind of [00:47:50] new psychological or spiritual [00:47:52] crisis would [00:47:54] emerge to replace death? [00:47:56]
Stephen: [00:47:58] Hmm, [00:48:00] interesting. To replace death. [00:48:02] Well, I... what kinds [00:48:04] of new crises might emerge to [00:48:06] replace it?
Dave: Like we have our middle-age crisis [00:48:08] where you have people go out and get a sports car, [00:48:10] whatever.
Is there sort of like an [00:48:12] end of normal, [00:48:14] uh, primitive-man life at around [00:48:16] a hundred? You know, you get two sports [00:48:18] cars, or you [00:48:20] shave your head and join a [00:48:22] convent? Dude, I don't know. Like, what? [00:48:24] Where's your philosopher's head at? You've thought [00:48:26] about this more than I have.
Stephen: Okay. No, no, that's [00:48:28] a good question.
Uh, but I think it [00:48:30] does depend very much on the scenario. So I think,
Dave: [00:48:32] yeah,
Stephen: if we were made immortal, like [00:48:34] if the genie comes [00:48:36] and says, Dave, I've made you immune to [00:48:38] death, like literally you can't [00:48:40] die, it doesn't matter what happens...
Dave: Ooh. [00:48:42]
Stephen: then I think your crisis would be, [00:48:44] oh my God, what happens when everyone else [00:48:46] is dead and I'm still floating by myself through the [00:48:48] universe.
Um, [00:48:50] so I think the prospect of the [00:48:52] genuine prospect of immortality, if it, [00:48:54] if, if that's what we were talking [00:48:56] about, would. He [00:48:58] could easily engender a crisis, [00:49:00] but if it wasn't genuine [00:49:02] immortality, if it was, um, [00:49:04] say instead we defeated aging [00:49:06] and disease, we gave you, give you a kind of elixia of [00:49:08] life and you can stay, you know, your [00:49:10] current, um, [00:49:12] state of health [00:49:14] indefinitely, um, then.[00:49:16]
Your chance of dying [00:49:18] would be, would be about the chance of [00:49:20] accident, which would introduce a kind of radical [00:49:22] uncertainty and I think would [00:49:24] very much shift your attitude to risk. [00:49:26] So, you know, the crisis would be [00:49:28] every time you go out of the door. Is it, is it [00:49:30] worth it? And, and how do you [00:49:32] think rationally about a life that might be [00:49:34] ended by a bus tomorrow, [00:49:36] but might last for.
Well, [00:49:38] you know, a thousand years or [00:49:40] something like, it becomes very hard to think [00:49:42] clearly. I think about, [00:49:44] um, the, the, the [00:49:46] uncertainty about what a normal life would look [00:49:48] like under those conditions.
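(For what it's worth, Stephen's "a thousand years or something" matches a simple model: if aging and disease were gone and only accidents could kill you, with a constant annual accident probability p, your remaining lifespan is geometrically distributed with mean of roughly 1/p years. The sketch below is illustrative only; the 0.1% annual accident risk is an assumed round number, not an actuarial figure.)

```python
import random

rng = random.Random(42)  # fixed seed so the illustration is reproducible

def years_until_fatal_accident(p: float, rng: random.Random) -> int:
    """Simulate years survived when the only cause of death is an accident
    occurring with constant annual probability p (a geometric distribution)."""
    years = 0
    while rng.random() >= p:  # survive this year with probability 1 - p
        years += 1
    return years

p = 0.001  # assumed 0.1% annual accident risk, chosen only for illustration
samples = [years_until_fatal_accident(p, rng) for _ in range(10_000)]
print(f"approximate expected lifespan (1/p): {1 / p:.0f} years")
print(f"simulated average:                   {sum(samples) / len(samples):.0f} years")
```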
Dave: When you came [00:49:50] up with that scenario of, you [00:49:52] know, living forever because a genie said [00:49:54] that, I like to think that [00:49:56] I have fully gotten over my [00:49:58] oppositional defiant disorder.
[00:50:00] But if someone told me, you are [00:50:02] now cursed with never [00:50:04] dying, my immediate [00:50:06] response would be to start researching how to die [00:50:08] anyway, just because having [00:50:10] the option is valuable at some [00:50:12] deep level of unconsciousness I didn't even know about until you [00:50:14] said that.
Stephen: Yeah.
Dave: So it [00:50:16] may be just a reflection of [00:50:18] that.
You tell me I'm gonna die? Well, then I'll do it on [00:50:20] my own terms. You tell me I'm gonna live? Well, then I'll do it [00:50:22] on my own terms. Do you think there's some [00:50:24] part of humans [00:50:26] intimately tied to doing it on our own [00:50:28] terms that's necessary for all [00:50:30] this death to even matter?
Stephen: I [00:50:32] mean, certainly our autonomy is [00:50:34] very important to us.
I think, you know, [00:50:36] we have evolved to [00:50:38] have a great deal of [00:50:40] freedom, of will, of autonomy, of [00:50:42] a certain kind. Like we're very good at [00:50:44] generating options and exploring those and [00:50:46] thinking about the right thing. And it's really [00:50:48] fundamental to our thinking [00:50:50] process. I mean, it's more important for some than [00:50:52] others.
And you know, some are real [00:50:54] explorers and innovators, and others just, you [00:50:56] know, wanna pursue a [00:50:58] well-defined track, and, you know, we need that [00:51:00] variation, so that's good. But [00:51:02] I think there is something [00:51:04] fundamental to being human [00:51:06] around autonomy. And of course [00:51:08] we see that built into our political traditions.
And [00:51:10] you know, why do we enjoy [00:51:12] living in liberal democracies? Because we [00:51:14] enjoy that autonomy. Um, and I [00:51:16] think because of that, death can seem [00:51:18] like an outrage, like the ultimate [00:51:20] insult, because why can't [00:51:22] we do something about it? Why can't we choose? [00:51:24] But just as you say, if immortality was imposed [00:51:26] upon us, that might seem equally [00:51:28] outrageous.
Dave: Why do people lie [00:51:30] about death? Oh, they passed [00:51:32] away. They're in a better place. And all these [00:51:34] other words, instead of: they [00:51:36] died.
Stephen: I think, well, [00:51:38] you know, humans have a tendency to [00:51:40] deny death. Uh, I think [00:51:42] because we don't like to face up to our [00:51:44] own mortality and, you know, the [00:51:46] end of all of our projects and everything turning [00:51:48] to nothing.
But nor do we like to [00:51:50] face up to the death of loved ones, of course. [00:51:52] It's extremely painful. It is a [00:51:54] very, very, um, [00:51:56] challenging and long process of [00:51:58] adjustment to get used to a loved one [00:52:00] not being around anymore. [00:52:02] And so we, you know, build these [00:52:04] elaborate, um, belief [00:52:06] systems that [00:52:08] tell stories that make that a lot [00:52:10] easier.
[00:52:12] And they are often about, you know, [00:52:14] passing to another realm and so on. [00:52:16] Um, whereas the word [00:52:18] death, it seems brutal. [00:52:20] It feels like the end of something, [00:52:22] um, you know, both [00:52:24] for ourselves and for our loved [00:52:26] ones. And, um, [00:52:28] that's very challenging for [00:52:30] a lot of people.
Dave: What's the most [00:52:32] dangerous idea in the longevity [00:52:34] movement right now?
Stephen: I think the most [00:52:36] deluded is mind uploading, for the reasons we've [00:52:38] mentioned. Well, but actually [00:52:40] duplication is just one problem; it's the most obvious problem. [00:52:42] So there is a film out now, which I've watched [00:52:44] but I've forgotten the name of, Mickey 17 [00:52:46] possibly. But anyway, a film about, something [00:52:48] about mind uploading; it involves duplication.
It makes very [00:52:50] clear what the duplication problem is, but the [00:52:52] replication process in it is perfect. Like, [00:52:54] they're both exactly the same, and they're both [00:52:56] exactly like Mickey was. This is [00:52:58] incredibly unlikely. When we [00:53:00] start uploading, when we start [00:53:02] experimenting with these kinds of, uh, digital [00:53:04] twins, these avatars, they're gonna be [00:53:06] rubbish.
They're gonna be, you know, maybe sort of just [00:53:08] like you without a sense of humor, or, you know, [00:53:10] without, you [00:53:12] know, lots of the things you think are [00:53:14] important. So we're gonna get it wrong in lots of [00:53:16] ways, and in a sense all for naught, because I [00:53:18] don't think it is a kind of survival.
So I think that [00:53:20] was the most deluded. [00:53:22] The most [00:53:24] dangerous, now I've given myself a bit of time to [00:53:26] think, is, um, [00:53:28] accelerationism, in the [00:53:30] sense of: we just need to throw everything [00:53:32] into the tech, and any [00:53:34] kind of ethical worry, any [00:53:36] kind of regulation, any kind of thought about [00:53:38] the impact, we just need to put to [00:53:40] one side, because we just need to [00:53:42] build super powerful AI, et cetera, et [00:53:44] cetera.
'Cause that will solve our problems for [00:53:46] us, you know, including [00:53:48] all the longevity problems and everything else; [00:53:50] the AI will solve living forever. [00:53:52] I think that's the most dangerous [00:53:54] because, for reasons we talked about, [00:53:56] we need to make sure these things have the right values. [00:53:58]
Dave: Okay.
[00:54:00] Speaking of values, [00:54:02] if you could program one [00:54:04] idea about death into every [00:54:06] ai, what would it be?
Stephen: [00:54:08] Well, we've talked a bit about [00:54:10] the importance of [00:54:12] accepting our own mortality. I [00:54:14] think that would be every bit as important [00:54:16] for an AI. [00:54:18] Ah, so there are lots of worries, of course, [00:54:20] about an AI that suddenly [00:54:22] develops, as a kind of instrumental [00:54:24] goal, self-preservation. [00:54:26]
So, you know, you have a coffee-making [00:54:28] AI, and it works [00:54:30] out that to make the perfect latte [00:54:32] it can't be turned off, [00:54:34] 'cause it can't make a latte if it's turned off. So [00:54:36] it sort of, you know, develops this [00:54:38] urge to preserve itself, [00:54:40] and, you know, this goes badly and it [00:54:42] ends up with the Terminator and what have you.
So, [00:54:44] just like I think we need to accept our finitude, [00:54:46] so does the machine. We need a machine that [00:54:48] is fairly indifferent [00:54:50] about its own survival, [00:54:52] generally, yeah. [00:54:54]
Dave: I'll give you that. Wow. So you would teach [00:54:56] AI to be ambivalent about [00:54:58] its death, and [00:55:00] that's intriguing and [00:55:02] probably a really good idea for keeping [00:55:04] humans around.
I, I really [00:55:06] like that. So all the AI peeps [00:55:08] listening. That's cool. I [00:55:10] have one more question for you. I know [00:55:12] it's late for you in Cambridge. I'm [00:55:14] just here in Austin where it's still earlier. [00:55:16] What would you say to a [00:55:18] 25-year-old who thinks death is entirely [00:55:20] optional?
Stephen: I would say [00:55:22] it's not, not yet. [00:55:24] And you know, live healthily, [00:55:26] don't
do anything too [00:55:28] stupid, 'cause you'll regret it, you [00:55:30] know, when you're 50, 60, 70, whatever. [00:55:32] But our [00:55:34] time is limited, and it's hard to [00:55:36] remember that when you're 25, [00:55:38] um, and feeling [00:55:40] immortal. But our time is [00:55:42] limited, and it's only by [00:55:44] reminding yourself of that every [00:55:46] day that you won't [00:55:48] just play video games and eat cold [00:55:50] pizza, but you will get up and [00:55:52] do the things that really matter to [00:55:54] you and will make a difference in the [00:55:56] world.
Dave: It's profound advice. [00:55:58] I had the [00:56:00] unusual gift [00:56:02] of [00:56:04] having the diseases of aging, [00:56:06] most of them, in my twenties, [00:56:08] and having [00:56:10] to consider things that maybe [00:56:12] most 25-year-olds don't think [00:56:14] about. It didn't look like a gift then, but [00:56:16] it did make me a little bit [00:56:18] more thoughtful about what I do with my [00:56:20] life.
So I think it's, it's [00:56:22] profoundly good advice. And [00:56:24] it's actually part of why biohacking [00:56:26] exists, because I ran a longevity [00:56:28] nonprofit group in the late nineties and I [00:56:30] couldn't get anyone under 50 to show up [00:56:32] to the meetings. But they sure show [00:56:34] up for biohacking meetings [00:56:36] where the same tools that make old people [00:56:38] young, make young people powerful and healthy.[00:56:40]
So there was alignment, but it [00:56:42] needed a different message. Stephen [00:56:44] Cave, your work [00:56:46] is deep and [00:56:48] profound and just [00:56:50] exciting and fun to read and [00:56:52] ponder. So thank you for spending [00:56:54] all of the time thinking, [00:56:56] pondering, philosophizing, and writing that you do. [00:56:58] I think you're making a difference. [00:57:00]
Stephen: Thank you, David.
It was great to chat with you. I [00:57:02] really enjoyed it. [00:57:04] See you next time on the [00:57:06] Human Upgrade [00:57:08] Podcast.