Death and AI in an Arizona Court
The use of AI at a sentencing hearing probably would have gone down differently in China.
[Image: a screenshot of the video, via NPR]
When you kill a man, you don’t usually get to hear him forgive you for it. That is, unless you’re Gabriel Paul Horcasitas, who was convicted of manslaughter for shooting Christopher Pelkey.
At the sentencing hearing in Arizona, a version of Pelkey addressed the court in a video. His brother-in-law had used AI tools to revive his likeness, and Pelkey’s sister composed the script that the AI version of her late brother performed. “In another life, we probably could have been friends,” the AI rendering of Pelkey said gracefully to Horcasitas. “I believe in forgiveness and in God who forgives. I always have, and I still do.”
The case marks the first time AI was used this way in court. It also follows a wave of “grief tech” products released pretty much as soon as generative AI capabilities were good enough to imitate a deceased loved one (I wrote about that industry in the US and China in 2023).
Two years and hundreds of AI news cycles later, the US is still lacking significant national AI regulation. Plus, Trump’s “big, beautiful” tax bill, which passed early this morning, includes a rule proposed by House Republicans that effectively bans states from regulating AI for ten years.
That makes AI’s novel incursions into US institutions even riskier. The judge in Pelkey’s hearing said he “loved that AI.” He’s entitled to his personal opinion about the use of such tech in his courtroom, particularly since there is no law or regulation to go on. But as new uses of AI like this become de facto precedent, not only is government guidance lacking; it looks like the government isn’t even paying attention.
Lawmakers seem uninterested in making statements, let alone laws, about AI in specific use cases; when they do talk about the technology, it tends to be in broad terms that are either excessively optimistic or infused with doomsday hysteria. That, by the way, is exactly what the tech companies want—because it paralyzes any would-be regulatory efforts.
Meanwhile, in China, the cyberspace regulator last month launched a campaign to address abuses of AI. One of the listed improper uses concerns using AI to “resurrect the dead” or abuse the information of the dead. While the campaign isn’t binding law, the shout-out is particularly notable given China’s on-the-books regulations of AI and, specifically, of deepfakes. Those deepfake regulations, released in 2022, prohibit providers from making AI-generated clones of individuals without their consent.
We can only assume Pelkey didn’t have a chance to give written consent about the use of his likeness to create an AI-powered apology video directed at his killer. Of course, it’s entirely possible his sister’s assessment of what he would want is accurate. But the point is we have no way of knowing.
The video of Pelkey isn’t the same type of grief tech that companies are releasing to mimic deceased loved ones. Those products use generative tools to have conversations with their interlocutors “on their own,” while Pelkey’s video was based on a specific script written by his sister.
Since people do not yet note in their wills how or whether they want AI to be used to represent their living selves, it’s a little difficult to determine which version is worse: a convincing version of “you” following a tight script written by someone else, or an even more convincingly “autonomous” version of you deciding what to say based on as much video and audio data as the customer has on hand. The former is akin to a virtual puppet; the latter is a kind of bespoke autocomplete.
No country is striking the perfect balance on how to manage the coexistence of death and AI. But China’s regulators are clearly at least considering the implications of high numbers of individuals being digitally rendered in afterlives they may have had very strong feelings about, without their consent.
It’s worth watching how death will figure into China’s forthcoming AI rules—and if you’re in the US, it’s probably time to think about how you would or would not like to be “brought back.” Even in death, your data might continue to enrich AI firms, and simultaneously, provide limited relief to the people you leave behind.