What’s your opinion of a judge allowing an AI-generated impact statement from a murder victim to influence his ruling?


https://www.youtube.com/watch?v=tr9b1_rintk&pp=ygUVQ2hyaXMgUGVsa2V5IGFpIGp1ZGdl

An AI-generated impact statement from a murder victim was played at the sentencing of his killer, and the judge seems to have considered it as part of his ruling.

What are your thoughts on this?

To be clear, this wasn’t something the victim left behind as a final wish; his sister made the video.

https://www.washingtonpost.com/nation/2025/05/08/ai-victim-court-sentencing/

Just to me, it seems kind of insane that (1) you can make a victim impact statement on someone else’s behalf using their likeness and perspective, and (2) judges would allow AI-generated content into a murder trial.

Seems like emotional manipulation at its worst. Frankly, even if someone had said while alive, “I’d like a statement forgiving my killer,” it feels like a bad idea, since someone’s actual opinion of their murderer can’t ever really be known. It’s one thing to say one day, “I think I’d forgive my killer”; it’s another to know how you’d feel about some hypothetical would-be killer if they actually did kill you.

Comments


  2. GabuEx

    > “I feel that that was genuine,” said Todd Lang, the Maricopa County Superior Court judge who ruled in the case.

    ??????

    FFS, I do not understand how people act like AI is some sort of mystical oracle that knows things just on the basis of its say-so. I really hope this gets challenged, because we’re opening up a whoooooole can of worms if we effectively allow anything AI-created to automatically be treated as an expert witness. How long before we just start asking ChatGPT whether the defendant did it?

  3. FreeGrabberNeckties

    >it seems kind of insane that 1, you can make a victim impact statement on someone else’s behalf using their likeness/perspective and 2, that judges would allow AI-generated content into a murder trial

    Because it is. It feels like an Onion article.

    >Seems like emotional manipulation at its worst.

    Not the worst. This is just a single trial. Imagine making policy based on AI slop.

    https://www.cnn.com/2024/02/19/us/ai-generated-voices-victims-gun-control-cec/index.html

  4. -Random_Lurker-

    Might as well allow photoshopped evidence, too.

  5. TipResident4373

    This is why AI needs to be heavily restricted in courts.

    I’d impose an outright ban on law enforcement even having access to generative AI tools or models, since the temptation to fabricate confessions or other evidence would be too great.

  6. BobQuixote

    Hahaha. I didn’t consider this angle at all when dreaming about what restrictions we should put on AI.

    I don’t want any AI (expert system, LLM, or actual General AI) to have persistent goals or make decisions more consequential than what advice to give to a human, except in specific cases as necessary. (Self-driving cars need to make decisions, but that makes it more important they have no persistent goals.)

    This sneaks past my idea by giving emotionally charged “advice.” I would definitely forbid AI as expert witnesses, but I think that doesn’t solve the problem; we’ll still be manipulated by this tactic in scenarios where procedure can’t save us.

  7. razorbeamz

    Ghoulish and ridiculous. AI should be banned from all courts.

  8. Tobybrent

    This judge has lost his way.

  9. Kakamile

    Shit’s fucked. Judge is allowing and humoring openly false testimony.

  10. Awayfone

    The question seems to be mixing two different things, so setting aside the AI for a second:

    >Just to me, it seems kind of insane that 1, you can make a victim impact statement on someone else’s behalf using their likeness/perspective

    There’s nothing new about a family member or lawful representative making an impact statement from the deceased victim’s perspective. A victim has a right to be reasonably heard in court under federal law and the law of every state. A brief video featuring the victim isn’t even novel.

  11. hitman2218

    I think it’s disgusting and I’m sad that the judge fell for it.

  12. material_mailbox

    I watched both the AI video and the judge’s reaction to it, and I think it’s absolutely bonkers. It also seems really disrespectful to the victim.

  13. Weirdyxxy

    It’s deliberately being swayed by fake information. No better than knowingly taking a forgery at its word, in my view.

  14. SimonGloom2

    We will have to deal with this more and more, and there may come a point where we could argue something like this should be allowed. This isn’t it, though: this video is a work of pure fiction and can be emotionally manipulative. It’s not the words of the victim.

    I think there will be a point, perhaps before the year is over, where AI trained on all the data a person created could generate the statement they would most probably have made after being murdered. It would likely be the same kind of AI used to create a clone or alternate version of the deceased that friends and family could interact with daily. Black Mirror type stuff. Such a statement would probably need to clear something like a 75% accuracy threshold before a court could even in theory consider it, and above 90% it would be much harder to argue against. I’m not in love with any of these ideas, especially letting the current fiction pass as acceptable. But there may come a time when even the harshest critics struggle with how to deal with this.

    Black Mirror really has untapped court and law drama about this stuff.