An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
It was the first time an AI avatar of a victim—in this case, a dead man—had addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.
The avatar was made by Pelkey’s sister, Stacey Wales. Wales tells 404 Media that her husband, Pelkey’s brother-in-law, recoiled when she told him about the idea. “He told me, ‘Stacey, you’re asking a lot.’”
This isn’t a message from the victim. This is a message from his sister using his image as a way to increase the impact of her statement in court.
This is a bad thing; it manipulates the court with a false and confusing message.
There were videos shown during the trial that Stacey said were deeply difficult to sit through. “Videos of Chris literally being blown away with a bullet through his chest, going in the street, falling backward. We saw these items over and over and over,” she said. “And we were instructed: don’t you gasp and don’t you cry and do not make a scene, because that can cause a mistrial.”
“Our goal was to make the judge cry. Our goal was to bring Chris to life and to humanize him,” she said.
If gasping at video of real events is grounds for a mistrial, then so are fabricated statements intended to emotionally manipulate the court. It’s ludicrous that this was allowed and honestly is grounds to disbar the judge. If he allows AI nonsense like this, then his courtroom cannot be relied upon for fair trials.
The victim impact statement isn’t evidence in the trial. The trial has already wrapped up. The impact statement is part of sentencing, when the court is deciding what an acceptable punishment would be. The guilty verdict has already been reached, so the rules around things like admissible evidence are much more lenient.
The reason she wasn’t allowed to make a scene during the trial is that the defense can argue her outburst is tainting the jury. It’s something the jury is forced to witness that hasn’t gone through the proper evidence admission process. So if she makes a scene, the defense can say the defendant isn’t being given a fair trial because inadmissible evidence was shown to the jury, and move for a mistrial.
It sounds harsh, but the prosecutor told her to be stoic because they wanted the best chance of nailing the guy. If she threw their case out the window by loudly crying in the back of the courtroom, that wouldn’t be justice.
This is just weird uninformed nonsense.
The reason outbursts like gasping or crying can cause a mistrial is that they can unfairly influence a jury, and so the rules of evidence do not allow them. This isn’t part of the trial; the jury has already reached a verdict.
Victim impact statements are not evidence and are not governed by the rules of evidence.
It’s ludicrous that this was allowed and honestly is grounds to disbar the judge. If he allows AI nonsense like this, then his courtroom cannot be relied upon for fair trials.
More nonsense.
If you were correct, and there were actual legal grounds to object to these statements, then the defense attorney could have objected to them.
Here’s an actual attorney. From the article:
Jessica Gattuso, the victims’ rights attorney who worked with Pelkey’s family, told 404 Media that Arizona’s laws made the AI testimony possible. “We have a victim’s bill of rights,” she said. “[Victims] have the discretion to pick what format they’d like to give the statement. So I didn’t see any issues with the AI and there was no objection. I don’t believe anyone thought there was an issue with it.”
Seems like a great way to hand the defendant a solid reason to appeal.
I’d really like to hope that this is a one-off boomer-brained judge and that the precedent set is that this was as stupid an idea as it gets, but every time I think shit can’t get dumber…
boomer-brained judge
Boomer here. Don’t assume we all think the same. Determining behavior from age brackets is about as effective as doing it based on Chinese astrology (but I’m a Monkey so I would say that, wouldn’t I?)
The judge’s problem is being a nitwit, not what year they were born in.
OK boomer. (LOL)
Just to be clear, they were fully transparent about it:
“Hello, just to be clear for everyone seeing this, I am a version of Chris Pelkey recreated through AI that uses my picture and my voice profile,” the stilted avatar says. “I was able to be digitally regenerated to share with you today. Here is insight into who I actually was in real life.”
However, I think the following is somewhat misleading:
The video goes back to the AI avatar. “I would like to make my own impact statement,” the avatar says.
I have mixed feelings about the whole thing. It seems that the motivation was genuine compassion from the victim’s family, and a desire to honestly represent the victim to the best of their ability. But ultimately, it’s still the victim’s sister’s impact statement, not his.
Here’s what the judge had to say:
“I loved that AI, and thank you for that. As angry as you are, and as justifiably angry as the family is, I heard the forgiveness, and I know Mr. Horcasitas could appreciate it, but so did I,” Lang said immediately before sentencing Horcasitas. “I love the beauty in what Christopher, and I call him Christopher—I always call people by their last names, it’s a formality of the court—but I feel like calling him Christopher as we’ve gotten to know him today. I feel that that was genuine, because obviously the forgiveness of Mr. Horcasitas reflects the character I heard about today. But it also says something about the family, because you told me how angry you were, and you demanded the maximum sentence. And even though that’s what you wanted, you allowed Chris to speak from his heart as you saw it. I didn’t hear him asking for the maximum sentence.”
I am concerned that it could set a precedent for misuse, though. The whole thing seems very grey to me. I’d suggest everyone read the whole article before passing judgement.
I was able to be digitally regenerated
I would like to make my own impact statement
you allowed Chris to speak from his heart as you saw it. I didn’t hear him asking for the maximum sentence.
These, especially the second, cross the line imo. The judge acknowledges it’s AI but is acting like it isn’t, and the same goes even more for the sister.
Your emotions don’t always line up with “what you know”; this is why evidence rules exist in court. Humans don’t work that way. This is why there can be mistrials if specific kinds of evidence that shouldn’t have been shown are revealed to the jury.
Digital reenactments shouldn’t be allowed, even with disclaimers to the court. It is fiction and has no place here.
why evidence rules exist in court.
Sure, but not for victim impact statements. Hearsay, speculation, etc. have always been fair game for victim impact statements, and victim statements aren’t even under oath. Plus the other side isn’t allowed to cross-examine them. It’s not evidence, and it’s not “testimony” in a formal sense (because it’s not under oath or under penalty of perjury).
Victim statements to the court are always emotionally manipulative. It’s akin to playing a video of home movies of the deceased, and obviously the judge understands that it is a fictitious creation.
No, this is exactly why it shouldn’t be allowed. This isn’t akin to playing a video of home movies because this is a fake video of the victim. This is complete fiction and people thinking it’s the same thing is what makes it wrong.
It is like a home movie in that it is an attempt to humanize the victim. There is no evidence in a home movie, no relevant facts, just an idea of the person that’s gone. You’re right that one is a memory of something that happened while the other is a fabrication of something that might have happened, but they are both equally (ir)relevant and emotionally manipulative. Many jurisdictions do prohibit victim statements beyond a written or verbal testimony. Some countries and states require you to use a form and won’t admit statements that do not adhere to the form.
Also remember that this is for the judge, not a jury.
Agreed. Until we get full, 100% complete UIs like in Pantheon, this is just Photoshop and a voice synthesizer on crack (not literally, this is an analogy).
It would have been about as respectful to use the corpse as a puppet and put on a show for the court with it.
I really don’t get how this is allowed.
Watched the video, it is creepy. It is also edited. The sister seems to have just put words in her dead brother’s mouth via the AI.
This has now set a legal precedent. WTF.
“To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
I find this nauseatingly disgusting and a disgrace that this was shown in a court of all places.
No, this man does not believe in forgiveness or a God, because he’s dead. He never said this; somebody wrote this script and a computer just made a video from it with his likeness.
Fuck everything about this, this should be prohibited
Time to update my will.
“Hi, I’m Manifish_Destiny speaking to you from beyond the grave. I’m happy to say that even though I had some skepticism of AI avatars and even put something about that in my will, I just didn’t understand its potential to embody my true self. But now I do, so you can disregard all that. Come to think of it, you can disregard the rest of the will as well, I’ve got some radical new ideas…”
This wasn’t testimony, it was an impact statement.
Impact statements are wild and crazy, and this isn’t surprising in any way.
No, this wasn’t an impact statement either.
This was a bunch of pixels moved around by a hugely wasteful amount of CPU power. The actual victim is dead; he can’t talk, and people are putting words in his mouth, and it shouldn’t be allowed.
It was literally explained in the article that this was presented as the victim impact statement.
Have you learned nothing about modern “news”? Don’t be part of the problem of spreading misinformation; be diligent and responsible. And it’s okay to make mistakes, own them, and move forward. It’s not easy to get your information correct every time, and there’s no shame in that, only in ignoring your responsibility to self-correct voluntarily when you find out.
Peace be upon you, we need to work together, because even though I’m calling out the inaccuracy in your comment, I do believe using this technology for this purpose is heinous.
Edit: from the NPR article, as it’s not paywalled:
But the use of AI for a victim impact statement appears novel, according to Maura Grossman, a professor at the University of Waterloo who has studied the applications of AI in criminal and civil cases. She added that she did not see any major legal or ethical issues in Pelkey’s case.
“Because this is in front of a judge, not a jury, and because the video wasn’t submitted as evidence per se, its impact is more limited,” she told NPR via email.
If I get killed and my family forgives the killer on my behalf I am haunting their asses so hard.
especially if an AI ghost of you was used to exonerate the killer.
then it’s your ghost VS an AI ghost
This judge needs to be disbarred and have a forced mental evaluation.
The fuckin’ dude’s wife wrote the speech the AI read… I don’t care how much you know someone, putting words in their mouths like that feels wrong. And the fucking judge added a year to the sentence citing the power of the video.
Fucking absurd.
Yeah, this is super fucked up. I think that it would be powerful and completely reasonable to have the AI read actual words he wrote, like from old text messages, emails, or whatever. That is a legitimate way to bring someone to life—completely ethical if they wrote the material. This is a disgrace to justice and ridiculous.
I thought it was his sister who wrote the speech the AI read, but yeah, this whole thing feels wrong and gross.
“gampa, did it hurt when you died?”
Hey there, buddy. That’s a big question! When people get very old or very sick, their bodies sometimes get tired, like a toy that slowly stops working. Normal people might go and buy a new toy from Amazon with all their great prices and exceptional customer service but your old gramps couldn’t do that. When it’s time to go, it’s usually peaceful—like falling asleep after a long, fun day on a nice comfortable Saatva bed. I don’t think it hurts, because our bodies know how to let go gently. What’s important is all the love and happy memories we share. You can even go back and look at all our wonderful memories from the good people at Instagram. And even when I’m not here anymore, that love stays with you forever. Would you like to send some of those memories to your local Walgreen’s to print?
Omg I hate this so much fuck you
So you prefer CVS?
Please stop talking, please
This is basically “Weekend at Bernie’s”, using the likeness of a dead man as a puppet.
I like AI, sort of. But this is ghoulish.
AI should absolutely never be allowed in court. Defense is probably stoked about this because it’s obviously a mistrial. Judge should be reprimanded for allowing that shit
It was after the verdict of the trial. This was displayed during the sentencing hearing where family members get to state how the death affected them. It’s still fucked up, but to be clear it wasn’t used during the trial.
Sentencing is still part of the administration of justice. Fake statements like this should not be allowed until after all verdicts and punishments are decided.
AI should absolutely never be allowed in court. Defense is probably stoked about this because it’s obviously a mistrial. Judge should be reprimanded for allowing that shit
You didn’t read the article.
This isn’t grounds for a mistrial; the trial was already over. This happened during the sentencing phase. The defense didn’t object to the statements.
From the article:
Jessica Gattuso, the victims’ rights attorney who worked with Pelkey’s family, told 404 Media that Arizona’s laws made the AI testimony possible. “We have a victim’s bill of rights,” she said. “[Victims] have the discretion to pick what format they’d like to give the statement. So I didn’t see any issues with the AI and there was no objection. I don’t believe anyone thought there was an issue with it.”
It happened BEFORE sentencing
In the US criminal justice system, Sentencing happens after the Trial. A mistrial requires rules to be violated during the Trial.
Also, there were at least three people in that room who both hold a Juris Doctor and know the Arizona Court Rules, and one of them was representing the defendant. Not a single one of them had any objection to allowing this statement to be made.
Every single one of those people should have their licenses suspended. AI, which is inherently a misrepresentation of truth, belongs nowhere near a courtroom. They should legitimately be ashamed of themselves for allowing such an abortion into a courtroom
AI, which is inherently a misrepresentation of truth
Oh, you’re one of those
If anyone ever did this with my likeness after death, even with good intentions, I would haunt the fuck out of them.
You can create an AI avatar before your death that will haunt them on your behalf.
“Stacey was up front and the video itself…said it was AI generated. We were very careful to make sure it was clear that these were the words that the family believed Christopher would have to say,”
“I love the beauty in what Christopher, and I call him Christopher—I always call people by their last names, it’s a formality of the court—but I feel like calling him Christopher as we’ve gotten to know him today.”
Can’t have it both ways. If you understand this was fabricated AI, then you did not “get to know him today”. The facts of the case were already self-evident for guilt, but this needs to be a mistrial. We cannot have a standard of fair justice when generated AI is treated like living, breathing people.
“I loved that AI, and thank you for that…” Lang said immediately before sentencing Horcasitas.
I hope they win that appeal and get a new sentencing, or even a new trial. That sounds like a horrible misuse of someone’s likeness. Even if my family used a direct quote from me, I’d be PISSED if they recreated my face and voice without my permission.
They can’t appeal on this issue because the defense didn’t object to the statement and, therefore, did not preserve the issue for appeal.
Wales tells 404 Media that her husband, Pelkey’s brother-in-law, recoiled when she told him about the idea.

Edited to remove utterly extraneous information that added absolutely nothing of value or clarity to the sentence: this is her husband; the victim was her brother. We already know her husband is the victim’s brother-in-law. That’s how that works.
Yeah, that’s a weird bit of writing. It’s completely unnecessary information that adds nothing to the sentence. I don’t know if it’s the case, but this is like a micro-aggression where the author felt the need to add more info about the man instead of the woman.
The judge should have had the sense to keep this shitty craft project out of the courtroom. Victim statements should also be banned as manipulative glurge.