There's been plenty of controversy over AI deepfake videos and the digital necromancy of dead or aged actors (mostly in Star Wars), but how about taking it a step further? The company DeepBrain AI will collect audio and video recordings of a loved one and then create an interactive AI version of them for you to talk to after they're gone.
A recent BBC story highlights some of the new players in “death tech,” the space of technology startups offering services related to bereavement and death. Two of the companies profiled are, dare I say it, normal. HereAfter AI lets you collect recordings and photos and save them for your loved ones; the service doesn't appear to generate new text or audio, and the “AI” part seems to be an interactive interface for creating, storing, and accessing that data. Settld, meanwhile, offers to cancel a deceased loved one's financial and social accounts, removing a major logistical burden from the bereavement process.
Hoo boy, the third one, though. Michael Jung, CFO of DeepBrain AI, claims that the company's AI avatars are “96.5% identical to the real person, so most families don't feel uncomfortable talking to a deceased family member.” I'm dying to know how you measure the deep and subjective experience of talking to a loved one down to a fraction of a percent.
A 2022 video of this product in action made me feel crazy. “The husband Mr. Lee, who was sentenced to death, was worried about his wife who would be left alone, so he decided to leave his digital twin for her.” That's a hell of an opener, but giving it the benefit of the doubt, “sentenced to death” was probably a poor translation of a figurative “death sentence” such as a terminal illness, I hope. The phrase “digital twin” makes my skin crawl: it implies a kind of equivalence between a real person and an AI-generated homunculus, a strain of techno-utopianism that should be exterminated with extreme prejudice.
The footage of the avatars in action is also weird. The two conversations on offer feel deeply uncanny and awkward, but they're also minimally interactive; absent context, I would have assumed they were simply pre-recorded messages, which makes me question the viability of the product. As a lover of games, I'm reminded of the “bullshot” trailers from E3s past that purported to show actual gameplay.
Even assuming it works as advertised, I find the concept deeply disturbing. A service that merely collects and stores a loved one's recordings is essentially an organizational tool, no more controversial in my view than keeping their letters or tapes the old-fashioned way. DeepBrain AI threatens something else: a slurry of recorded “content” from a loved one, shaped into a crooked puppet to manipulate the bereaved to the tune of $50,000.