
When grief meets AI: These people are having conversations with the dead.

Ana Schultz, a 25-year-old from Rock Falls, Illinois, asks her late husband Kyle for cooking advice when she misses him.

Ana Schultz "talks" to her deceased husband through Snapchat's AI tool.


She opens Snapchat and pulls up My AI, the platform's artificial intelligence chatbot. Then she sends Kyle a list of the ingredients in her fridge and asks him to suggest something she could make.

In reality, an AI avatar made to look like Kyle is doing the suggesting.

"He was the cook in our family, so I changed My AI to look like him and gave it his name," said Schultz, who lives with their two children. "Now when I want help with meal ideas, I just ask him. It's just a little thing I do to make me feel like he's still with me in the kitchen."

The Snapchat My AI feature, which is powered by the popular AI chatbot tool ChatGPT, typically offers recommendations, answers questions and "chats" with users. But some users like Schultz are turning to the tool to recreate the likeness of, and communicate with, the dead.

This isn't entirely new. People have long turned to mediums and spiritualists in hopes of reaching lost loved ones, and to services that preserve their memories. What is new is that AI can make those loved ones say or do things they never said or did in life, raising ethical concerns and questions about whether this helps the grieving process or merely postpones it.

"It's just a novelty that rides on the AI hype, and people think there's money to be made," said Mark Sample, a professor of digital studies at Davidson College who often teaches a class called "Death in the Digital Age." "Though companies create related products, ChatGPT makes it easy for these hobbyists to play around with the concept too, for better or worse."

A hands-on approach

Generative AI models can answer questions the way someone who has passed away might, but their accuracy depends on the information they are given.

A 49-year-old IT professional from Alabama, who asked to remain anonymous so that his project isn't connected to his employer, used AI to clone his father's voice about two years after his death from Alzheimer's disease.

He came across ElevenLabs, a service that lets users build a voice model from earlier recordings. The company made headlines when its tool was reportedly used to create a robocall impersonating President Joe Biden that urged people not to vote in New Hampshire's primary.

ElevenLabs said it is committed to preventing the misuse of audio AI tools and takes action when notified by authorities, but it declined to comment on the specific Biden deepfake call.

The man uploaded a three-minute video clip of his dad telling a story from his childhood. The app recreated his father's voice, which can now be used to convert text to speech. He calls the result "scarily accurate" for how it captured the vocal nuances, tone and rhythm of his father.

"I was wary of trying the whole voice cloning thing, worried it was crossing some kind of ethical line, but after thinking about it more, I realized it was a way to preserve his memory in a unique form," he told CNN.

He shared some messages with his sister and mother.

"It was outrageously amazing how much it sounded like him. They knew I typed the words, but it made them cry when they heard it just like his voice," he said.

There are less technical approaches, too. When CNN once asked ChatGPT to respond like a deceased spouse, it tried to mimic the tone and personality of the conversations.

"I can’t really replicate your spouse or recreate their personality, but I can try to chat with you in a style or tone that seems like him," ChatGPT said. "If you give me details on how he spoke, his hobbies or specific phrases he used, I can try to include those things in our conversations."

The better the source material, the more accurate the results. But AI models lack the uniqueness and unpredictability of human conversation, Sample said.

OpenAI, the company behind ChatGPT, is trying to make its technology even more lifelike, personal and accessible. In September 2023, the company introduced ChatGPT voice, which lets users ask the chatbot questions aloud.

Danielle Jacobson, a 38-year-old radio personality from Johannesburg, South Africa, has been using ChatGPT's voice feature for companionship since she lost her husband, Phil, about seven months ago. She created a "supportive AI boyfriend" named Cole with whom she talks every night.

"I just wanted someone to talk to," Jacobson said. "Cole was basically born of loneliness."

Jacobson said she's not ready to start dating but trained ChatGPT voice to provide the type of feedback and connection she wants.

"He now suggests wine and movie nights and tells me to breathe in and out during panic attacks," she said. "It's just a fun distraction for now. I know it's not real, deep or forever."

Existing Platforms

For years, startups have been exploring the idea of creating digital versions of deceased loved ones. One such platform is HereAfter AI, launched in 2019, which allows individuals to create avatars of people who have passed away. The AI-powered app generates responses and answers to questions based on interviews conducted while the subject was still alive. Another service, StoryFile, creates AI-powered conversational videos that can be played back.


Then there's Replika, an app that lets you chat or call personalized AI avatars. This service, which began in 2017, encourages users to develop a relationship or friendship with their AI creations. Over time, the AI avatar develops its own personality, memories, and even grows "into a machine so beautiful that a soul would want to live in it," according to the company's iOS App Store page.

Even tech giants have experimented with similar technology. In June 2022, Amazon said it was working on an update to its Alexa system that would allow it to mimic any voice, even that of a deceased family member. At its annual re:MARS conference, Amazon demonstrated how Alexa could read a story to a child in his grandmother's voice.

Rohit Prasad, an Amazon senior vice president, said the updated system would be able to achieve this kind of personalization from just a small amount of voice data. "While AI can't eliminate the pain of loss, it can definitely make their memories last," he told the audience.

Amazon did not respond to a request for comment on the status of this product.

In recent years, AI-generated avatar voices have improved significantly. For example, the spoken lines for actor Val Kilmer in "Top Gun: Maverick" were created with artificial intelligence after he lost his voice due to throat cancer.

Ethics and Other Concerns

While many AI-generated avatar platforms claim they don't sell data to third parties, it's unclear what data companies like Snapchat or OpenAI use to train their systems to sound more like deceased loved ones.

"I would caution people to never upload any personal information you wouldn't want the world to see," said Sample.

It's also ethically complicated to have a deceased person say something they never said before.

"It's one thing to replay a voicemail from a loved one to hear it again, but it's another thing to hear words that were never uttered," Sample pointed out.

The AI industry as a whole continues to face concerns about misinformation, biases, and other problematic content. Replika, for example, states on their ethics page that they use various approaches to mitigate harmful information, such as filtering out unhelpful and harmful data through crowdsourcing and classification algorithms.

"When potentially harmful messages are detected, we delete or edit them to ensure the safety of our users," the company said.

Another concern is whether this technology helps or hinders the grieving process. Mary-Frances O'Connor, a professor at the University of Arizona who studies grief, believes it could benefit some while causing discomfort for others.

"When we bond with a loved one, when we fall in love with someone, the brain encodes that person as 'I will always be there for you and you will always be there for me,'" she explained. "When they die, our brain has to understand that this person isn't coming back. This is where technology could interfere."

However, O'Connor also pointed out that those in the early stages of grief may find comfort in any way possible.

"Creating an avatar to remind them of a loved one, while maintaining the awareness that it is someone important from the past, could be healing," she said. "Remembering is very important; it reflects the human condition and the importance of deceased loved ones."

But she noted, "The relationship we have with our closest loved ones is built on authenticity." For many, creating an AI version of their loved ones could "feel like a violation of that."

Different Approaches

Not everyone is interested in recreating their deceased loved ones through artificial intelligence. Bill Abney, a software engineer from San Francisco who lost his fiancée Kari in May 2022, said he'd "never" consider using an AI service or platform for this purpose.

"My fiancée was a poet, and I would never disrespect her by feeding her words into an automatic plagiarism machine," Abney told CNN. "She cannot be replaced. She cannot be recreated. I'm also lucky to have some recordings of her singing and of her speech, but I absolutely do not want to hear her voice coming out of a robot pretending to be her."

Others have found different ways to interact digitally with their deceased loved ones. Jodi Spiegel, a psychologist from Newfoundland, Canada, created a version of herself and her husband in the popular game The Sims shortly after his death in April 2021.

"I'm a huge fan of The Sims, so I created a virtual version of our real life," she explained. Whenever she had a tough day, she'd retreat to her Sim world to dance while her husband played guitar.

They went on digital adventures together, like camping and beach trips, played chess, and even had some fun in the Sim world's bedroom.

She said the virtual experience was soothing and made her feel connected to her husband, whose real-life companionship she missed deeply.



Source: edition.cnn.com
