In a ground-breaking use of artificial intelligence, a man who was killed in a road rage shooting in Arizona three years ago “returned” to address his killer at sentencing. Chris Pelkey, who was 37 when he was fatally shot, had his voice, image, and likeness recreated through AI, allowing him to deliver a victim impact statement at the sentencing hearing.
The unique use of technology at the sentencing of Gabriel Horcasitas, the man who shot Pelkey, sparked both praise and concern. Pelkey’s family, drawing on voice recordings, photos, and videos of him, created a video in which an AI version of Chris Pelkey spoke in court. His sister, Stacey Wales, wrote the words for the AI version, focusing on her brother’s forgiving nature.
“To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI version of Pelkey said. “In another life, we probably could have been friends.” The virtual Pelkey continued, emphasizing his belief in forgiveness, even for his killer.
The video was played at the sentencing hearing, after Horcasitas had been found guilty of manslaughter for shooting Pelkey during a dispute at a red light in Arizona. The judge, Todd Lang, sentenced Horcasitas to 10-and-a-half years in prison. He praised the AI statement, saying, “As angry as you are, as justifiably angry as the family is, I heard the forgiveness.”
While the use of AI in legal cases is still new, it has already found some footing in Arizona’s courts, where it is used to make court rulings more digestible for the public. The judge in Pelkey’s case allowed the AI statement at sentencing, noting that no jury was present. However, legal experts have mixed views on AI’s growing role in the courtroom.
Some experts, such as retired judge Paul Grimm, see potential for AI in the justice system, suggesting its use could be weighed on a case-by-case basis. But others, like Derek Leben, a business ethics professor, worry about the implications. Leben questions whether AI can always accurately reflect the true wishes of victims and their families.
For Stacey Wales, the AI statement gave her brother the final word. She said, “We approached this with ethics and morals because this is a powerful tool. Just like a hammer can be used to break a window or build a house, we used this technology to give Chris his voice back.”
The case has sparked a wider debate on the use of AI in legal proceedings, raising questions about its future role and ethical boundaries. For now, the Pelkey family has found solace in knowing that their loved one’s voice was heard, even after his tragic death.