A man slain in 2021 “speaks” at his killer’s punishment court thanks to AI video

A man killed in a road rage incident in Arizona “spoke” at his killer’s sentencing hearing after his family used artificial intelligence to create a video of him reading a victim impact statement.

In what is believed to be a first in US courts, Chris Pelkey’s family used artificial intelligence to give him a voice. Pelkey was shot by Gabriel Paul Horcasitas on November 13, 2021, while both vehicles were stopped at a red light. According to records, Pelkey was shot after exiting his truck and approaching Horcasitas’ car.

During the sentencing hearing last week, the AI rendering of Pelkey told the shooter that it was unfortunate that they had to meet under those circumstances — and that the two of them probably could have been friends in another life.

“I believe in forgiveness and in God, who forgives. I always have and still do,” Pelkey’s avatar told Horcasitas.

The AI version of Pelkey went on to advise people to make the most of each day and to love one another, regardless of how much time they had left.

While the use of artificial intelligence in the court system is increasing, it is typically limited to administrative tasks, legal research, and case preparation. In Arizona, it has aided in informing the public about important court decisions.

However, using AI to generate victim impact statements represents a new — and legal, at least in Arizona — method of sharing information with the court outside of the evidentiary phases.

Judge Todd Lang of Maricopa County Superior Court, who presided over the case, told CBS News partner BBC News that he “loved that AI.” Lang also noted that the video mentioned Pelkey’s family, who had expressed their outrage over his death and asked for Horcasitas to receive the maximum sentence. Family and friends also sent nearly 50 letters to the judge, echoing the video’s message.

Horcasitas, 54, was convicted of manslaughter and sentenced to 10.5 years behind bars.

Horcasitas’ lawyer, Jason Lamm, told The Associated Press that they filed a notice of appeal of the sentence within hours of the hearing. Lamm believes the appellate court will consider whether the judge improperly relied on the AI video when imposing the sentence.

It was Pelkey’s sister, Stacey Wales, who suggested her brother speak for himself. Wales said she had spent years thinking about what she would say at the sentencing hearing as the case moved through the legal system, but she struggled to put words on paper.

But as she considered what her brother would say to the shooter, knowing he would have forgiven him, the words flowed out of her.

“We approached this with ethics and morals because this is a powerful tool,” she told the British Broadcasting Corporation. “Just like a hammer can be used to break a window or rip down a wall, it can also be used as a tool to build a house and that’s how we used this technology.”

Pelkey was born in Poughkeepsie, New York, and later resided in Arizona, according to his obituary. He was a veteran who served three tours in Iraq and Afghanistan, was active in his local church, and participated in local and international mission trips, according to the obituary.

AI in the courtroom

Victims in Arizona can submit impact statements in any digital format, according to Jessica Gattuso, a victims’ rights attorney who represented the family.

Chief Justice Ann Timmer of the Arizona Supreme Court did not specifically address the road rage case during an interview Wednesday. However, she stated that the increased popularity and accessibility of AI in recent years prompted the formation of a committee to research best practices in the courts.

Gary Marchant, a committee member and Arizona State University law professor, said he understands why Pelkey’s family did what they did. But he warned that the use of this technology could lead to more people attempting to introduce AI-generated evidence into courtrooms.

“There’s a real concern among the judiciary and among lawyers that deepfake evidence will be increasingly used,” he said. “It’s easy to create it and anyone can do it on a phone, and it could be incredibly influential because judges and juries, just like all of us, are used to believing what they see.”

Marchant cited a recent case in New York in which a man without a lawyer used an AI-generated avatar to argue his case via video. It took the judges only seconds to realize that the man addressing them on the video screen did not exist at all.