By: Gaby Hinsliff / The Guardian
Translation: Telegrafi.com

Joaquin Oliver was 17 when he was shot in the hallway of his high school. An older teenager, expelled a few months earlier, had opened fire with a high-powered rifle on Valentine's Day, in what became the deadliest high school shooting in America. Seven years later, Joaquin says he thinks it is important to talk about what happened that day in Parkland, Florida, "so that we can create a safer future for everyone."

But, unfortunately, what happened to Joaquin that day is that he died. The strange metallic voice that spoke to former CNN journalist Jim Acosta in an interview this week on the platform Substack was actually a digital ghost: an artificial intelligence [AI] trained on his old social media posts at the request of his parents, who are using it to support their campaign for stricter gun control. Like many grieving families, they have told their child's story over and over again, with much pain and little success. It is no wonder they are now reaching for every tool available, trying to figure out what it will take to make dead children heard in Washington.

But, as his father Manuel admits, they also simply wanted to hear their son's voice again. His wife, Patricia, spends hours asking the AI questions, just to hear him say, "I love you, Mom."

No one in their right mind would judge a grieving parent. If it is comforting to keep a lost child's room as a shrine, to talk to their grave, to sleep in a T-shirt that still smells of them, that is no one else's business. People cling to anything that can maintain a connection. After September 11 [2001], families listened again and again to the phone messages left by loved ones calling to say goodbye from burning towers and hijacked planes, until the recordings wore out. I have a friend who regularly re-reads old WhatsApp conversations with her late sister, and another who occasionally texts her late father's number with snippets of family news: of course she knows he is gone, but she is not ready to end the conversation yet. Some people even pay fortune tellers to commune with the dead in vague and ambiguous words. But precisely because it is so hard to let go, grief is ripe for exploitation. And there could soon be a huge industry in the digital return of the dead.

Like the sentimental AI-generated video that Rod Stewart played on stage this week, featuring the late Ozzy Osbourne paying tribute to other deceased musicians, this may be nothing more than a celebratory meme. Or it may serve a temporary purpose, like the AI avatar recently created by the family of an Arizona shooting victim to address the judge at the perpetrator's sentencing hearing. But over time, this could become something that profoundly challenges our ideas about identity and mortality. What if it were possible to create a permanent copy of someone who has died, perhaps in robot form, and continue the conversation with them forever?

Resurrection is a divine power, not something to be lightly handed over to some technologist [tech bro] with a messianic complex. But while the legal rights of the living against identity theft - against fake creations [deepfakes] generated by AI - are becoming increasingly clear, the rights of the dead remain murky.

Reputations die with us - the dead cannot be defamed - while DNA is protected even after death. (The birth of Dolly the sheep in 1996, a genetic clone created from a single cell, prompted global bans on human cloning.) The law regulates the dignified disposal of human tissue, but AI will not be trained on bodies: it will be trained on private voices, messages, and photographs that reflect what mattered to a person. When my father died, I never really felt that he was in the coffin. He was far more present in the boxes of his old letters, in the garden he had planted, in the recordings of his voice. But everyone grieves differently. What happens if half the family wants to resurrect their mother digitally, while the other half does not want to live with ghosts?

The fact that Joaquin Oliver's AI can never grow up - that he will remain eternally 17, frozen in a teenage persona on social media - is ultimately the fault of his killer, not his family. Manuel Oliver says he knows full well that the avatar is not really his son, and that he is not trying to bring him back. To him, it seems more like a natural extension of the way the family's campaign has always evoked Joaquin's story. Still, there is something unsettling about the plan to give the AI access to a social media account, to upload videos and gain followers. What if it starts hallucinating, or veers into topics where it can never know what the real Joaquin would have thought?

While AI avatars currently have obvious glitches, as the technology improves they could become increasingly difficult to distinguish from real people online. It may not be long before companies and even government agencies - which already use chatbots to handle customer inquiries - start considering whether they could use public relations avatars to answer journalists' questions. Acosta, a former White House correspondent, probably should have known better than to further muddy the already murky waters of a post-truth world by agreeing to interview someone who technically does not exist. But for now, perhaps the most obvious danger is that conspiracy theorists will use this interview as "proof" that any story that challenges their beliefs could be a hoax - the same crazy lie that the owner of the site Infowars, Alex Jones, spread about the Sandy Hook school shooting.

However, the professional challenges here are not just for journalists. As AI develops, we will all increasingly live with synthetic versions of ourselves. It will no longer be just a relatively primitive Alexa in your kitchen or a chatbot on your laptop - though there are already stories of people attributing human traits to AI, or even falling in love with ChatGPT - but something far more attuned to human emotions. When one in ten British adults tells researchers they have no close friends, there will certainly be a market for AI companions, just as there is today for getting a cat or following the lives of strangers on TikTok.

Perhaps, as a society, we will eventually decide that we are comfortable with technology that meets needs other people sadly could not. But there is a big difference between creating a generally comforting presence for the lonely and summoning the dead on demand in the image of someone you once loved. There is a time to be born and a time to die, as the verse often read at funerals says. How will it change us as a species when we are no longer sure which is which? /Telegrafi/