Primary Topic
This episode explores the intersection of artificial intelligence (AI) and memorialization, particularly through creating digital avatars of oneself for future interaction by loved ones.
Episode Summary
Main Takeaways
- AI technology can create digital avatars called "versonas" that memorialize individuals after death.
- These avatars can interact with loved ones, providing comfort and preserving memories.
- There are significant ethical considerations, including consent and the impact on grief.
- The technology poses a challenge to our traditional understanding of presence and absence post-death.
- Users and creators must navigate the balance between beneficial uses and potential emotional consequences.
Episode Chapters
1: Introduction to AI Immortality
Explores the concept of using AI to create digital avatars for posthumous interaction. Focuses on the story of a couple who created avatars as a precaution for their children.
- Mary Louise Kelly: "AI has opened the door for us to live on after we die."
2: Ethical Considerations
Discusses the ethical implications of AI memorialization and the need for explicit consent from those who become data donors.
- Katarzyna Nowaczyk-Basińska: "It's crucial to seek explicit consent from the so-called data donor."
3: Personal Stories
Shares personal stories of individuals who opted to create digital versions of themselves, highlighting their motivations and the process involved.
- Michael Bommer: "I see my AI as an intelligent digital memoir."
4: Technological and Societal Impact
Examines how such technologies could affect societal norms around grief and memory.
- Annette: "For me, it is a machine. It isn't human."
Actionable Advice
- Consider the emotional impact: Before deciding to create a digital avatar, consider how it might affect your loved ones emotionally.
- Understand consent: Ensure that all parties involved understand and consent to the use of their data for creating a digital persona.
- Set clear expectations: Communicate clearly what the digital avatar can and cannot do to avoid unrealistic expectations.
- Secure data privacy: Ensure that the data used to create the avatar is secure and handled with respect to privacy concerns.
- Seek professional guidance: Consult with ethicists and professionals in the technology field to understand the implications of creating a digital legacy.
About This Episode
Michael Bommer likely only has a few weeks left to live. A couple years ago, he was diagnosed with terminal colon cancer.
Then, an opportunity arose to build an interactive artificial intelligence version of himself through a friend's company, Eternos.Life, so his wife, Anett, can interact with him after he dies.
More and more people are turning to artificial intelligence to create digital memorials of themselves.
Meanwhile Katarzyna Nowaczyk-Basińska, a research associate at the University of Cambridge, has been studying the field of "digital death" for nearly a decade, and says using artificial intelligence after death is one big "techno-cultural experiment" because we don't yet know how people will respond to it.
Artificial intelligence has opened the door for us to "live on" after we die. Just because we can, should we?
People
Mary Louise Kelly, Jason Gowan, Melissa Gowan, Michael Bommer, Annette, Katarzyna Nowaczyk-Basińska
Companies
You Only Virtual, Eternos
Content Warnings:
None
Transcript
Mary Louise Kelly
Jason and Melissa Gowan didn't spend much time thinking about death until the couple faced serious health scares back to back. And then all of a sudden, we were pretty mortal, and we were very concerned about what was going to happen to our kids. That fear led them to an AI company called You Only Virtual. It creates AI chatbots modeled after deceased loved ones. Jason and Melissa became two of the company's first users so that their young sons would have a tool for memorializing them.
They uploaded audio and video of themselves to the company's cloud and watched their digital avatars come to life. Jason, who's a comedian, says his is pretty spot on. He like, makes little jokes and he like, references funny stories and things that happened to us. So it's, it's such a surreal thing, and it's also, it like, lifts my heart to see my son react to it. Both parents' health conditions have stabilized, but they take comfort in knowing that their AI likenesses, known as versonas, are ready to go.
Melissa Gowan
We built these versonas as a fail-safe, just in case, so that on big days like wedding days, or college graduation, or high school graduation, or on a day you just need a pick-me-up from a parent who passed, we have that there, just in case. You Only Virtual is one of a number of companies at the intersection of AI and memorialization. Users like Jason and Melissa Gowan are finding peace of mind in this space. But many tech ethicists are concerned. For some people, using the simulation of a deceased loved one might be comforting and helpful during this very difficult time, while for others it might be emotionally draining or even devastating.
Mary Louise Kelly
Katarzyna Nowaczyk-Basińska is a research associate at the University of Cambridge. She's been studying the field of digital death for nearly a decade, and says the whole thing is one big techno-cultural experiment because we don't yet know how people will respond to it. In a recent paper, she and her colleagues outlined some of the potential red flags this technology could raise, like how it will impact the grieving process, who owns the data, how it's used. And then there's this whole question about consent. It will be crucial to seek this explicit consent whenever possible from the so-called data donor, so the person whose data is used to create the griefbot, ideally before the death of that person.
Nowaczyk-Basińska says as immortality AI inevitably grows, her priority is to keep users safe, to protect their digital rights. We have two options here. We can either work towards a future where we make the most of this tool by promoting values such as diversity, empathy, care, trust, and respect. Or we can remain as we are, where the most important value is profit. Consider this:
Artificial intelligence has opened the door for us to live on after we die. Just because we can, should we?
From NPR, I'm Mary Louise Kelly. This message comes from NPR sponsor the Capital One Venture card. Earn unlimited 2X miles on every purchase. Plus, earn unlimited 5X miles on hotels and rental cars booked through Capital One Travel. What's in your wallet? Terms apply.
Unidentified Sponsor
See capitalone.com for details. This message comes from NPR sponsor BetterHelp. With the year halfway over, therapy can help you take stock of your progress and set achievable goals for the next six months. If you're thinking about trying therapy, give BetterHelp a try. It's entirely online, designed to be convenient, flexible, and suited to your schedule.
Visit betterhelp.com/NPR today to get 10% off your first month. This message comes from NPR sponsor Noom. Noom's first-ever cookbook, The Noom Kitchen, helps you build new habits for a healthier lifestyle. Check out The Noom Kitchen for a hundred healthy and delicious recipes to promote better living. Available to buy now wherever books are sold.
Mary Louise Kelly
It's Consider This from NPR. Michael Bommer likely only has a few weeks left to live. A couple of years ago, he was diagnosed with colon cancer. The doctors told him it was terminal. And then an opportunity arose to build an interactive AI version of himself through a friend's company called Eternos.
Michael and Annette, who are based in Berlin, took a little time away from their day to talk with us. And I asked Michael to tell me about the moment he decided to create an AI version of himself. Like a year ago, I sat with my wife in one of these more teary-eyed exercises, talking about what comes next. And my wife said, hey, one of the things I will miss most is being able to come to you, ask you a question, and you will sit there and calmly explain the world to me. And then I posted on Facebook to all my friends, hey, guys, it's time to say goodbye.
Michael Bommer
And Rob called me after that and said, hey, we all thought you might make it through, but hey, here's a gift. Why don't we do this together? And I was immediately like, yes, because I already had the thought myself to do something with voice synthesis. But now adding AI to that was a great thing for me. Annette, I do have a question for you.
Mary Louise Kelly
When Michael first told you about this idea, what did you think? Well, I thought, well, yeah, let's do it. Really? He has a lot of projects in our life. And at this moment, it was a little bit silent in our daily routine.
Michael Bommer
And I thought, wow, that makes this part of his life a little bit better, filling out these to-dos. So, Michael, tell me, how did it work to build it, to program it? I understand it's AI.
Mary Louise Kelly
So, like you said, it has access to all kinds of knowledge and information that you don't have, and it will keep learning. But the things it sounds like Annette wants to ask you are things only you would know. How do you program that? Yeah. So there's two steps to that.
Michael Bommer
First, you need to give it my voice. And this happens with 300 sentences you record. And out of these 300 sentences, which are specific sentences, you create the voice with all the nuances of a voice. And the second part is that you fill it with content.
Now, in my case, because we were so short on time, I simply told 150 stories about my life: early life, mid-life, late life. What I would recommend to myself as a young person, what I would recommend to my children, my grandchildren. So to give it all the content around life and living, all the content about my history, and that's the content from which the AI is created. Normally, this would take weeks and months. Right?
In my case, we needed to do it in more or less mere days. And out of that, you create the AI. Now, when the AI wants to answer a question, the question goes into, you can imagine it like a cloud. And in the cloud is all the knowledge which I left for the AI.
And he picks parts of the things I talked about which fit the answer and puts them together into a string, into an answer. Now, sometimes there is a knowledge base where the AI can take knowledge from the Internet and ask the question to the Internet. Say, last time it was, the car was making noises. And so he went out on the Internet and asked, so what causes these noises in cars? And he took this mainstream answer back. Are you sharing with it, then, things that you want to make sure Annette knows? Like when the car starts making a weird noise and you're not around, she can ask and have an answer of, well, last time it was this.
Mary Louise Kelly
Why don't you check? Yeah, so I didn't do that in that depth. What I did more is try to convey my principles, my principles in life, so to speak. Always de-escalating. As I say, it's created your home safe. Say, hey, I'm sure our auto mechanic can help you.
Michael Bommer
So, reinforcing: you're good, right? And then, hey, by the way, these noises could come from this and this and that. So the principles which I gave it, de-escalating, stay calm, reflect, whatever, which is my nature. Right?
That's in this AI. As I'm listening to you, I'm thinking of something I read, which was a reference to this kind of AI as immortality. Is immortality a part of it? I mean, because there's a piece of you, an essence of you, that will carry on and carry on interacting with people you love. No.
I see my AI as an intelligent digital memoir. And so if you write your memoir, that's not eternal life. So I see it more as a tool. I want to give my knowledge and experience, and then I'm gone. I'm gone, and I'm gone.
And I want the next generations to inherit my experience and my knowledge as much as possible. Annette, have you talked to it yet? Only for testing. For testing, yeah. And in this moment, I love the time with him.
Mary Louise Kelly
You're talking to the real Michael while you can. Yeah. Annette, do you think it will really feel like him? Like your husband? No.
Annette
For me it is a machine. It is a machine, exactly. Not warm, not touching, it isn't human. Is there any part of this that frightens you, worries you? No, I'm not afraid about this one.
It is, for me, a tool. And should I get afraid, I can close this tool and don't use it. So therefore, in this moment, I'm happy to have it and to try. And when I fail, then I fail. But I'm not afraid about it.
Michael Bommer
I'm leaving it behind. Right. If it's used or not, if they hang it as a picture, like a picture of me at the wall or they put it in a drawer, I don't care. I cannot influence that. But I can leave it.
I can leave it behind. I like that way of thinking about it. It's like a picture of you or a painting, which feels very normal to leave behind for people who will be grieving. Yeah. What type of questions can you imagine asking it, Annette?
Annette
I assume perhaps to read me a poem. Or I could ask him when we met, or when we got married. I can ask him, okay, tell me about how he proposed. So, a little bit. So remembering together all the nice things we had.
Michael Bommer
If people have a problem with grief, and this is independent of an AI or not AI, it is a problem with grief. And unfortunately, our society is treating grief very badly because we distance ourselves from people out of a good meaning. But it's not good, because if you're grieving, you need to grieve openly and you need to be open to embrace people who are grieving. But people are often isolated in their grief, and then they turn to whatever kind of remembrance they can to try to realize what they had. And that's bad.
Mary Louise Kelly
Yeah, whatever technology you use for that. Well, I want to thank you both for being so open and talking this through with us. I'm sorry for everything you are dealing with, and I appreciate you sharing it with us. Thank you.
Michael Bommer
Thank you very much for having us. Thank you. That is Michael Bommer and his wife Annette speaking with us from Berlin. Thank you. Bye-bye.
Annette
Bye-bye. This episode was produced by Kathryn Fink. It was edited by Courtney Dorning. Our executive producer is Sami Yenigun. And one more thing before we go.
Mary Louise Kelly
You can now enjoy the Consider This newsletter. We'll still help you break down a major story of the day, and you'll also get to know our producers and hosts and have some moments of joy from the All Things Considered team. You can sign up at npr.org/considerthisnewsletter.
It's Consider This from NPR. I'm Mary Louise Kelly. Last year, over 20,000 people joined the Body Electric study to change their sedentary, screen-filled lives. And guess what? We saw amazing effects.
Unidentified Sponsor
Now you can try NPR's Body Electric challenge yourself. Listen to updated and new episodes wherever you get your podcasts. This message comes from NPR sponsor Rosetta Stone, an expert in language learning for 30 years. Right now, NPR listeners can get Rosetta Stone's lifetime membership to 25 different languages for 50% off.
Learn more at rosettastone.com/NPR. This message comes from NPR sponsor Mint Mobile. From the gas pump to the grocery store, inflation is everywhere. So Mint Mobile is offering premium wireless starting at just $15 a month. To get your new phone plan for just $15, go to mintmobile.com/switch.