Seeking Immortality (with Paul Bloom)




Intro. [Recording date: March 27, 2024.]

Russ Roberts: Today is March 27th, 2024, and my guest is psychologist Paul Bloom of the University of Toronto. His Substack is called Small Potatoes, and I love it. This is Paul’s sixth appearance on EconTalk. He was last here in December of 2023, talking about whether artificial intelligence can be moral.

Paul, welcome back to EconTalk.

Paul Bloom: Glad to be back. Thanks for having me.

Russ Roberts: I want to let listeners know that this episode may touch on some adult or dark themes. You may want to listen in advance before sharing with children.


Russ Roberts: And our topic for today is a recent essay on your Substack. The title was “Be Right Back,” which described a scenario for the future–a scenario I would call a certain kind of immortality–that you, Paul, called a ‘blessing and an abomination,’ which I thought was the perfect framing of what is, I think, almost certainly coming for us in the afterlife that we’re about to experience. What is that, Paul?

Paul Bloom: I like the terms ‘abomination’ and ‘blessing.’ There’s a nice sort of Biblical resonance to them. And, what I’m imagining is a world in which artificial intelligence [AI] is capable of mimicking real people. And, there’s all sorts of usages you could imagine this having. I think a lot of people would enjoy having celebrities in their house they could interact with on their phones or on their whatever–their Alexa system.

And, I think people would enjoy connecting to friends of theirs, family members who are out of town–or, you wake up in the middle of the night and want to talk to your wife or your husband, but they’re asleep. So, you just kind of start talking to the simulation.

I’m most interested in and most troubled by–and I think we’re going to talk about this a bit–cases where these AI simulations are of people who have died.

And, my inspiration for this–my inspiration for the title of the Substack, “Be Right Back”–is a Black Mirror episode.

And, you and I talked about this: we don’t want to have spoilers for the episode. People should just, if they have Netflix, they could watch it.

But, the setup for this is: a young woman’s husband dies suddenly. And, in this sort of alternative future, she has the ability to have a simulation of him created. And–I’m not telling you more than what’s in the trailer–it ends up with the simulation being a robot version of him which is indistinguishable from the original.

But, the part I was most interested in, because it’s most plausible, is the simulation is just online. So, they upload all the videos of him, everything he’s written, everything he’s commented on, all his DMs [direct messages], and texts, and so on. It establishes personality that way. And then she could talk to him over the phone, and she finds this–at first, she greets this possibility when it’s raised to her with horror and disgust, the hallmarks of an abomination. But later she finds it addictive, tremendously moving, powerful; and it takes away some of her grief.

And, since this does not seem tremendously futuristic, it seems we’re close in certain ways, I want to talk about what that would be like and the implications.

Russ Roberts: It reminds me–I’ve not seen the Black Mirror [Russ accidentally calls it Dark Mirror] episode, but I think it’s fitting, because Black Mirror is, in theory I suppose, a science fiction series, and we’re very close to living in a science fiction world rather than imagining it or reading about it.

There’s a movie I love–or at least I loved a long time ago when I saw it–called Starman.

[BEGIN SPOILER ALERT] And again–there’s spoilers coming–it’s about a young widow who is mourning her husband, and I think the movie opens with her drinking a glass of wine late at night, watching home movies of her husband–who is gone. He has died young, tragically.

And, she’s drunk because she can’t deal with this loss. And, an alien creature slips in through a window–it comes in like a star beam, a beam of light–goes through a family album and finds a lock of his hair, I think is what this creature finds. And, within a very short period of time, the widow finds herself in the presence of a perfect, cloned DNA [Deoxyribonucleic acid] replica of her husband–but, of course, without his memories; it’s a twist on the Black Mirror version. She is, of course, deeply attracted. This is not a robot, though. It’s a flesh-and-blood creature. It looks just like her husband; and we watch as she and this clone get to know each other, although the clone’s motive is not the same as hers. The clone is on a mission from outer space.


So, you can watch that if you want.

But the power of that movie, and I think the power of Black Mirror and what we’re going to talk about today is: as human beings, our finitude, our mortality is unbearable. And, religion provides, or used to, I think much more effectively, some solace for that with the idea of an afterlife, or reuniting. How many movies are there that exploit this human urge?

One of my favorites is Heaven Can Wait, which is a magnificent–I love that movie. But, there’s–I think it’s called Truly Madly Deeply [Russ accidentally says Clearly Sadly Deeply]. That movie was so powerful. Alan Rickman is the star of it. I watched it once and I can’t watch it again. It’s a masterpiece. It’s too sad.


Russ Roberts: So, as human beings, we cannot cope with our mortality. We long for immortality. And, as human beings, we do it many different ways. Our children, our books, the memories of the people who are alive when we go. But, this is a different level; and it’s different.

Paul Bloom: So, there are two questions I could ask you about this–both personal. But, the first is too sad and I won’t even ask it, which is: would you want this of the people you love now, if you were to lose them? And I mean, honestly, when you lose them because we all lose people. What would you think of such a substitute?

But, I’ll ask you a different question, which is: If I go online, I could see hours and hours and hours of conversation with you, and video with you–more than just about anybody I know. So, imagine–and it’s not crazy–we upload all of that and we could have a Russ Roberts simulation. And it would be a little bit–it wouldn’t be the way you would normally talk to your family, and so on. It’s a little bit more talking to people about intellectual matters. But there’s a lot of stuff there, and it would be a pretty good simulation.

Would you participate in creating one for when you die, for the people you leave behind?

Russ Roberts: Can I answer the first one, also?

Paul Bloom: Yeah, definitely.

Russ Roberts: One of the things I love about having Paul Bloom as a guest is that he sometimes asks questions, which means I get to be the guest, which I appreciate–it’s lonely here as the host for 940 episodes.

Paul Bloom: I take away some of the responsibility.

Russ Roberts: What?

Paul Bloom: I take away some of the responsibility.

Russ Roberts: Exactly.

Paul Bloom: I should get a share of revenues from this episode.

Russ Roberts: Yeah. We’ll talk about that.

Paul Bloom: We’ll talk about that.

Russ Roberts: Yeah. The first question I think is a very profound question, obviously. And, what’s beautiful about your essay is it is a speculation about what might be coming quite soon, actually. But, it forces you to think about life, not just simulations of death.

I lost my father about four years ago. I was very close to him. And I had dreaded that day for a long, long time.

I was so close to him when I was younger, I was afraid–much more of his death, obviously, than my own–but just worried that I would struggle to cope with a world without my dad.

And, when he got older, and sick, and lost some of his mental capability, I was surprised after his death how little I missed him. I still miss him, but I thought it would be much stronger.

And, part of the reason I didn’t miss him as much as I expected, is that when he died, he wasn’t the same person he was when he was younger. I didn’t miss the man who passed away at 89 years old. I missed the man who was the 50-year-old–or even better, the 40-year-old father of me as a young boy or as a young man–who I turned to, who I wanted his approval, and so on.

And so, if you said to me, ‘Would you want a recreation of your father?’ I would say, ‘Yeah, but not the one at the end.’ Don’t use all the data.

And, you give this example about tweaking the avatar–the simulation. I want the 40- to 58-year-old Dad of mine, who was funny and wise and treated me a certain way–differently than he did at the end, in the last, say, five to 10 years of his life.

So, that’s the first thought I have about that.

The second question is: You know, people have already done what you’re talking about. They’ve uploaded transcripts from podcasts or writing of people. And, at the current level it’s not very good–

Paul Bloom: No, it’s not good at all–

Russ Roberts: I can have a conversation with Adam Smith based on his books that are online. And, it’s disappointing. It’s not very interesting. But it will get better. In fact, it will get better and better and better.

And, the idea of whether–you posed the incredibly creepy, but I think inevitable dilemma: How much of my life would I spend preparing for others to enjoy this? How much of my own day-to-day life would I record so that my loved ones, when I’m gone, or strangers–forget my loved ones–strangers could enjoy my character, my persona? And, that’s just a–I’m not sure it’s an abomination, but it’s a creepy thought. What are your thoughts on those two issues?

Paul Bloom: It wouldn’t be–I don’t see it as much work, actually. I can imagine us just carrying around an unobtrusive recorder as we go about our lives and talk to our children and our partners and our friends. Sort of like a podcast, living one’s life as a podcast, but just collecting a lot of data.

And, I mean, there are two questions. The second question: would I participate in leaving something behind?

If I felt the people who are close to me would want that, I see some negatives. I see it. I would be–I’m most sort of troubled and curious about children, about young children. I have older children who are out in the world and maybe they would enjoy being able to go onto a computer, have a little conversation with the version of me once I’ve passed.

But, imagine a child whose mother or father has died at age five or 10.

I tell the story in a Substack of my own–I don’t tell this often; it’s the first time I’ve ever talked about this–but my mother died when I was 10, and I was extremely close to her. And I think I would have wanted to[?] simulate–to hear her voice again. And I’m old enough that, for her, there wasn’t much: I couldn’t get a QuickTime movie. There was not a whole lot of the video that all of us leave behind now. So, I’ve never heard her voice again after I had my last conversation with her.

But, would it have been good for me? Or would it have sort of blocked a grieving process in some way?

Suppose my wife dies and I have a simulation of her [?] and I just enjoy talking to her, I’d have the same conversation–she knows me, or appears to know me–as well as my wife does. Would I ever seek anybody else? Or would I just spend my time talking to her?

And, I don’t know the answer. I think those are sort of two separate questions: What would we want? And then, the second question is what’s good for us?


Russ Roberts: The other thought I had–and I’m going to phrase this about someone else; it’s funny how hard it is to talk about it being about you or me–imagine someone who loses their spouse and has the avatar available, or created–the chatbot. And, again, it could be on your phone: you talk about it like you’d call them up–but of course it will be more than a call. They will be hovering in the room with you in 3-D hologram form with all the gestures, just like the Starman DNA thing. They’ll have the physical features: again, you’ll pick the year you want your spouse to come back from–

Paul Bloom: And, you know what we’ll have, which we can do already. It’ll have the voice.

Russ Roberts: Yeah.

Paul Bloom: The voice, not just the voice as if in a mechanical sort of replication–

Russ Roberts: Not Alexa–

Paul Bloom: The same–yeah, the same cadences, the same use of vocabulary. You know, I hear you for 10 seconds; I know it’s you. And, I could understand intellectually that an AI, even right now–I think there have been famous cases of this–can fake it. So, I’m not talking to Russ Roberts, I’m talking to an AI deepfake. But, I think that will be so compelling.

[POSSIBLE SPOILER] And, in the Black Mirror episode, when she stops sort of messaging him over the computer and he says, ‘I can give you–if you want, you could talk to me.’

She hears his voice and then breaks down and is so moved; and watching it, we’re moved, too. [END SPOILER]

Sorry I cut you off.

Russ Roberts: No, that’s all right. I was going to say two things.

One, my children–one of my children in particular–is a superb mimic, and we don’t need the avatar if they want to hear me. And, he knows all my catchphrases: he knows them better than I do, because he’s noticed them. I won’t reveal it, but my children have a WhatsApp group named after one of my common phrases. And, until I became aware of that–which was, I think, an accident: I’m not in the group, obviously–

Paul Bloom: No, of course not–

Russ Roberts: it’s for them to talk about me. I didn’t realize, ‘Oh yeah, I say that a lot,’ and they’ve figured that out and made that the name of the group.

But, what I was going to say about the spouse–and this for me is where it gets particularly interesting and dark.

Let me introduce my thought on this with a story. I went to a memorial service and a woman had lost her husband of 70-plus years of marriage, and for a reason not worth going into, I was friendly with both of them, but I was not close friends.

And, the woman revealed something very personal to me, partly because I think I wasn’t a close, close friend. She said–this memorial service took place after, some time after the death of the husband–she said, ‘I talk to him all the time.’ And I said, ‘Of course you do.’

And she said, ‘My friends think I should stop doing that. I should get over it.’

And I said, ‘I don’t think so. I think it would be weird if you didn’t talk to him. You talked to him for 70 years. Why would you stop?’

But, what I’m thinking is that that’s a particular case, people in their 90s.

But, if God forbid someone lost a spouse fairly young, or certainly at midlife, and you had that avatar–you’re asking: would that delay you from getting back out there and dealing with your grief?

But, there’s a second possibility, which is if you then remarried, would you not be tempted, and maybe encouraged by your new spouse to continue that relationship?

Just like people who break up with their spouses, or divorce, or separate from their partners–and I guess it varies tremendously by the relationship of the person–they say, ‘Oh, we’re still friends.’

And, often the person who is the new spouse–the new partner–resents that. Sometimes they encourage it, sometimes they resent it. But, imagine if on your watch, one could chat with one’s former spouse about their problems, including their problems with their new spouse or new partner, and say, ‘I’m having trouble,’ because often our relationships with our friends are as an ear, a shoulder to cry on. Anyway, that’s coming. I don’t think there’s any doubt that that is coming.

Paul Bloom: I don’t know.


Russ Roberts: But, my view is–the way I phrased it; you phrased it a different way–I phrased it, and we’ll come back to this. I’m not ready to talk about this yet, Paul, but I think the right question is: Can we possibly resist this future? Should we, which is the question I think you were asking. And, if we should–if we decide that this is not healthy for ourselves, is it possible? I’m not sure it’s possible. When I think about the seductiveness of screens and social media–let’s move away from spouses.

If I have a chance to hang out with Adam Smith–let me give you three scenarios and let you react to it. So, at the end of my book on Adam Smith, I imagine having a drink with him. And, I love that idea, right? So, imagine I could conjure up his avatar, and have that drink and we could talk about tariffs, we could talk about why he at the end of his life was a customs official.

I can find out more about David Hume, right? And, this is a world where Adam Smith is more than just the collection of his writings. This is the Adam Smith whose every conversation he’s had with David Hume has been recorded and saved. And so, I can find out about not just hanging out with David Hume–of course, I guess I could have both of them over; and I could watch their interaction, which would be charming, and I get to be their new friend, and so on. So, that’s one level.

The second level is: I say, ‘Paul, I literally enjoy talking to you and it’s a shame we only talk every three or four months. Could we have a Zoom relationship where we have a drink together now and then?’

And, I have a friend I do this with: we have coffee now and then. We don’t do it much since I moved to Jerusalem because of the time difference, but I used to talk to him now and then. So that’s Level Two.

So, Level One is a fake, but maybe really real Adam Smith: hard to describe it as fake if it has absorbed all of his interactions with his friends, his mom, and his writing.

Second level is: I’m hanging out with you, but it’s over Zoom and I can’t smell how smoky your scotch is.

The third level is: I have a friend here I like to have a l’chaim with now and then–maybe he’s not as interesting as you, Paul. So, instead of seeing him in the flesh, I hang out with Paul Bloom and I hang out with Adam Smith. And eventually, maybe Paul–I hate to say it–I might also want to hang out with someone else, not just Adam Smith, but you’d fall–you’d slide down the totem pole. I’d hang out with Michael Jordan because he’d have an avatar for sale that would let you interact with him.

Forget living people. I’d have a whole host of Adam Smith-like extraordinary conversationalists. Dorothy Parker would be in my living room; Samuel Johnson. Why would I ever spend time with you, Paul, over Zoom–and certainly not with, I won’t pick a name, my friend here in Jerusalem, who possibly, maybe, can’t compete with my online friends? In which case I am living a totally digital life.

Is that appealing to you?

Paul Bloom: Wow[?]. There’s a lot there. Sometimes people are given this question–maybe for dating services and so on: if you could have three people over for dinner, who would they be? And, I don’t know–maybe for you it would be Adam Smith, Dorothy Parker, and your late father, and you have a great conversation; and then maybe another night you choose another three–sports figures or current economists. And, well, now we can. In the near future, in our lifetimes, maybe we can.

And, part of the seductive part of this, and part of why it’s worrying is any individual in reality is at times sleepy, impatient, rude, self-centered, uninteresting. Conversations don’t always go the way you wanted. Maybe I really want your advice on something, I want to tell you something, but maybe you’re bored, or maybe you want to tell me something and you one-up me on my story. I didn’t want that. I wanted sympathy[?].

But of course, the AI will be just right. And, it could be–there’s an analogy here with pornography. There’s an analogy here with super-sweet foods. That, our minds have evolved to have certain tastes–evolved through evolution, through culture–to have certain tastes. And, we have tastes in people. We’re looking for kindness and love and patience and humor.

And, what if these simulations can do that better than real people can? Where would the draw be of real people? Putting aside the physicality, which maybe AI can’t do, but most of my relationships with people are not physical in any way. So, I’m perfectly happy just talking to them. And, maybe AI does better talking.

And, it’s easy to see this as a dystopia. You lose contact with your friend–you gave the reasons: why would you bother, when you could conjure up somebody much better, somebody as wise as Adam Smith? And so, in some way that’s terrible. But, it could also be the end of loneliness.

I mean, I’ll make the other argument, which is: you know, you and I are, I think, very, very fortunate that we have people who love us, and are–to varying extents–enmeshed in communities. You’re a university president; you’re more enmeshed than I am–more enmeshed than I’d want to be. But, there are not days and days that you go without human contact, without anybody interested in you talking to you.

But, there are people–people not far from either one of us right now–who haven’t spoken to another person for a long time and are desperate for human interest. And, what if AIs could scratch that itch?

People mock those who seek out, you know, AI boyfriends and AI girlfriends, but loneliness is awful. There are few psychological torments worse than loneliness–and it’s possible AI could fix that. And that’s the case for it.


Russ Roberts: You write the following. We’ve touched on this, but I’m going to take a variant on it. “How much do we want”–this is a quote:

How much do we want the simulations to correspond to the people they simulate? A couple is married for thirty years, the husband dies from a long illness, and his widow misses him desperately. They had been taping their interactions for many years–they knew this time would come–so the simulation she later signs up for is excellent; it’s just like talking with him. Their conversations are an enormous relief. But nobody is perfect. Her husband had his flaws; while he loved her very much, he could be sharply critical, and in his later years, he was forgetful, telling the same stories over and over again. Can she contact the firm that provides the simulation and ask for a few tweaks?


And, I am reminded of a song I’ve quoted on here before. The song is “It Had to Be You.” It goes like this–I’m not going to sing it because I have a little trouble with the melody. It’s a little bit challenging in parts, so I’m just going to read it.

It had to be you, it had to be you.

I wandered around and finally found the somebody who

Could make me be true, could make me feel blue,

And even be glad just to be sad thinking of you.

And, here’s the key part. That was pretty good, though:

Some others I’ve seen might never be mean,

Might never be cross or try to be boss,

But they wouldn’t do,

For nobody else gave me a thrill.

With all your faults, I love you still.

It had to be you, wonderful you,

It had to be you.

So, who would add faults to their avatar? Who would not tweak that simulation to take out the obnoxious criticisms of a spouse, the moments of cruelty that the person who said them might even regret–might even be happy that the spouse takes them out when they’re gone? But then they’re not human.

Russ Roberts: Really.

Paul Bloom: There’s an analogy with food, which is: we have engineered food that hits all of our buttons–sugary sodas and impossibly fatty meat–and it just lights us up. But sometimes you eat this food and afterwards you don’t feel right and you want real food. You want real food that isn’t gussied up and energized.

When you’re a kid, candy is wonderful, and it’s hard for many kids to eat vegetables. They don’t want to eat vegetables: there’s none of the bang of candy. But, vegetables can be terrific. And, you are making a case for a similar point with people, which is: it might be that a perfectly designed avatar of somebody I love, all the flaws removed, would be inhuman and, you know, wouldn’t come off right. It might be that to be seen as human, to be appreciated as human, you have to re-insert some flaws. So, a little bit of repeating the same story twice, a little bit of bite in a comment. And then, part of you says, ‘Yeah, this is a person.’

I wanted to go back to something you said about the widow because I found that a great story. And, in my piece, I quote a friend of mine, the developmental psychologist, Paul Harris. And, Paul has this wonderful essay on death, and how we respond to death.

And he points out that a very common picture among developmental psychologists, starting with Bowlby, a great attachment theorist, is that what happens when somebody close to you dies is first you don’t believe it, and then you respond with anger and despair and all of these emotions.

But, it turns out it’s more complicated than that. It turns out that studies with adults–with widows, actually–there’s a big study of widows, finds exactly like the story you told me. Widows very often report continuing conversations with their dead husbands, hearing their dead husbands’ voices, keeping things that he owned and he used, around them as reminders, having photographs to remind them.

And then, there was a similar study with children–children who lost their parents, children who lost their siblings, same thing. They would hear from them, they would talk to them. But, if you ask the children, ‘Do you understand that your father is really dead?’ ‘Yeah, of course.’ Only a tiny minority expressed any doubt. They fully come to grips with the fact these people are gone, and yet in their minds, they resume a relationship with them. And, in some way, then, this AI would just facilitate that. It’s like a prop to continue this, maybe. And, you can imagine it having some therapeutic uses that way.


Russ Roberts: Well, I’m pretty sure my mom still talks to my dad–like that story I told. And guess what? She doesn’t need the avatar.

When you’ve been married that long–they were married 60 or so years, almost 70 years; 69 years–she kind of has the data. She’s still of sound mind, and she doesn’t need the simulation to remember not just my dad’s catchphrases and favorite things to say: she can, I’m sure, as I can, have a very good conversation with him in her head.

And, similarly, my wife, thank God, is alive–we talk all the time, even without her there: just me and her, because I think of things to say and I think of what she’d say back, and so on. That is part of an enduring friendship or marriage.

And, I think what’s troubling about this–and I think we should talk a little bit about the abomination part, because we kind of haven’t, we’ve said it’s weird or creepy, but abomination is a very strong word. The abomination part is about the tweaking or the altering, and then the relying on–I think it’s not just, ‘It’d be kind of cool to ask a question of Adam Smith,’ or to watch a video, by the way, of someone who has advice for me. I don’t have to conjure up some crazy AI science fiction thing. I turn to dead people all the time. I read their books. It’s fine. Nobody thinks it’s weird.

The weird part, I think, is twofold. It’s the tweaking to produce what you want as opposed to the reality that was.

And then, the second part is living in that world full time. And I think that will be the challenge. Just like junk food is seductive, I think the appeal of digital friends, both romantic and sexual as well as–that’ll be much more interesting I think, than me trying to have a drink with Adam Smith. But, I think that world, that retreat from the human flesh-and-blood world is what is creepy, abomination-ish.

Paul Bloom: Yeah. I agree. I think that there’s a couple of things. One is: I don’t find anything creepy about an Adam Smith simulation. That’s all–it could be intellectually stimulating, all good fun.

But, imagine somebody whose child dies–say, a teenage child–and then there’s a simulation. He talks to the child and shares stories and talks about, ‘Oh, remember when we–.’ There’s something about that which might be repellent, above and beyond any sort of implication it has for the grieving process, and for how you spend your life. It might be repellent because this machine is purporting to be somebody who it’s not.

And, even if you have–and I discussed this a bit in the essay–even if it’s sort of very careful to say–you say ‘Remember when we–‘ and then, the simulation comes back, says, ‘I should remind you that although I have the voice of your son, I’m not really him.’

But, even if it does that, there’s something–I don’t know, unholy about a machine trying to replicate faithfully somebody you love, so that you could pretend that they are that person.

And then, I’ll also add something sort of practical, which is: everyone has observed and is panicking about the extraordinarily addictive powers of the Internet and social media and the artificial worlds we live in. Jonathan Haidt most recently has a book coming out on the topic, and this just adds to it: ‘Oh, great. Now we have the people we love accessible online.’ And, all of that makes it less likely you’ll see your friend or Zoom with your real friend, and more likely you’ll just press a button and get it through the computer–and get something better.

Russ Roberts: Yeah, I hadn’t thought about this, but of course it’s obvious that it’s going to change. We’ll be in competition with these creatures, and it will change how we interact.

And, only the most extraordinary, perhaps–maybe only the most extraordinary people–will have real friends. And, the less attractive, less charismatic people will be driven to a fully online existence. That’s painful. But, I want to come back to your word–you want to comment on that?

Paul Bloom: Yeah. I think here’s another way of putting your point, which is–it’s an issue which sometimes comes up. I have an essay with some friends of mine at Toronto on empathic therapy done through AI. And, one problem with it is that just like a child who eats a lot of candy and drinks a lot of sugary soda, and then won’t go near real food, you could imagine a case where people become used to these perfectly compliant, frictionless, incredibly interested in you, incredibly witty AI simulations.

And so, real people just are not–they don’t match up. Why would I want a real girlfriend when my AI girlfriend is so interested in me, loves me so much, and apparently has no needs of her own–just cares all about me? And, what would that do to people?

Russ Roberts: Yeah. I’m 69, Paul. I forget how old you are.

Russ Roberts: So, our 23-year-old listeners, of which there are a few–maybe more than a few–may find this puzzling, this conversation. They’re more used to technological comfort than we are.

I think there would be a big age gap in what is considered abominable and what is considered a blessing.


Russ Roberts: But, I want to come back to that word ‘unholy,’ because I think that gets at something you touch on only obliquely in the essay, which is religion.

So, religion believes–most religions, I think; certainly the Judeo-Christian ones–that you have a soul. There’s something divine about your essence, and when you die, something happens to that soul. You don’t just decompose as a physical object. You’re different from a dog, and you’re different from a table.

And, you have–humanity is maybe crooked timber, but it has a spark of the divine. Certainly in Judaism, and I won’t speak for other religions. But, most religions deal with some kind of afterlife, some kind of hope for reuniting, and so on.

And that worldview has diminished in the West: it has become less appealing. And, we had a conversation a long time ago, which got into my book–ended up in my book, Wild Problems–about whether it’s better to be a philosopher or a pig. Whether it’s better to live the life examined, the examined life of the philosopher, or to be cavorting, and enjoying a physical life.

And, as religion diminishes, I think it’s harder and harder to reclaim anything other than utilitarian physical pleasure.

And so, you and I, we’re older. We come from a different era. We still have in us some unease about some of these scenarios. I think younger people, particularly secular young people, would find some of our unease both baffling and perhaps silly. Life, if you’re not religious, life’s to be enjoyed, and why wouldn’t you spend it with the best possible experiences with those avatars?

You know, it goes back to this wonderful idea of Robert Nozick’s, the Experience Machine, where you hook yourself up to a machine, you program it, and while you’re on the machine, you will think you are the greatest golfer of all time, the President of the United States, the doctor who cures cancer, the rock star who plays before 100,000 people, and so on. You choose whatever you want. And, while you’re hooked up to the machine, it’ll feel real. And then, you die when you finish your life on the machine; and you have accomplished nothing.

And, for those of us–myself, for example–who feel that life has some kind of purpose and that we have things to achieve and growth to experience, these digital alternatives are abominable and creepy.

But, I think for most people, I don’t think it bothers them at all.

And, when Nozick wrote that Experience Machine example back in the 1970s, I think most people would have been horrified by it. And–I’m sure there’s data on this; people have asked about it–I know that in the modern world, meaning now, many more people say they would be willing to live that way, even though they would do nothing with their life other than lie on a table and feel like they were doing something.

And, I think your examples get at that. To interact with a machine–a digital avatar–day-to-day, instead of real human beings might be very pleasant. It’s like the pig. But it’s not the philosopher’s life.

Paul Bloom: I never thought of it that way. I’d never connected the idea that being immersed with these sort of artificial friends is like being in Nozick’s Experience Machine, where you have interactions that ultimately lead to nothing. They’re sort of all in your head. They’re in your head–now your head is supplemented by a machine, but it’s still all in your head. You’re not really making a connection, you’re not really establishing a relationship, you’re not really changing people’s lives.

And, I think that’s a really clever way to put it, and maybe helps us figure out what we might find so disturbing about it. It’s an escape from reality.

Sure, maybe you’re happier, maybe it’s more [?], maybe it alleviates your loneliness better. But in reality, the person you love is dead; and after a certain amount of time, when the grieving has ended, you should be seeking out more people.

In reality, yeah, your AI simulation is fantastic company, but your friend is a flesh-and-blood person; and connecting to an actual person means so much more. [More to come, 41:50]