Kristen Bell Couldn't Believe Her Face Was Used in Deepfake Pornographic Videos: "I Don't Consent"

Publish date: 2024-08-03

Kristen Bell has used her amazing talent to build herself a successful career over the past two decades, and that means her face has appeared in TV shows like Veronica Mars and The Good Place along with movies like Forgetting Sarah Marshall. She's no stranger to seeing her face on a screen, but she recently got an unwelcome surprise when her husband, Dax Shepard, told her that her face was being used in pornographic videos circulating online.

Bell recently told Vox that she was “shocked” when she saw the video clips because she didn’t shoot them. The videos are deepfakes and they feature someone else’s body with Bell’s face, which is something she never consented to.

Deepfake videos use machine learning to fabricate events that never happened.

But increasingly, they're used to hijack women's faces to make porn videos they never consented to be in. @cleoabram explains: https://t.co/b3lIATb86M

— Vox (@voxdotcom) June 8, 2020

“I was just shocked, because this is my face,” said the 39-year-old. “Belongs to me!… It’s hard to think about, that I’m being exploited.”

“We’re having this gigantic conversation about consent and I don’t consent — so that’s why it’s not okay,” Bell said. “Even if it’s labeled as, ‘This is not actually her,’ it’s hard to think about that. I wish that the internet were a little bit more responsible and a little bit kinder.”

A deepfake is a video whose footage has been manipulated by artificial intelligence to replace the person in the clip with someone else's likeness. According to The Guardian, they are made by running thousands of face shots of two people through an AI algorithm called an encoder, which finds and learns the similarities between the two faces and reduces the images to their shared features, compressing them in the process. A separate decoder is then trained to recover each person's face from those compressed features; to swap faces, the encoded images of one person are fed into the decoder trained on the other.
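To make the shared-encoder/per-identity-decoder idea concrete, here is a minimal illustrative sketch in Python using PyTorch. The network sizes, image resolution, and variable names are assumptions for illustration only, not any particular deepfake tool's implementation.

```python
# Toy sketch of the face-swap autoencoder idea described above:
# one shared encoder compresses any face, and each person gets their
# own decoder that reconstructs only that person's face.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared feature vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Recovers one specific person's face from the shared features."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

# One encoder is shared; each person gets their own decoder.
encoder = Encoder()
decoder_a = Decoder()  # would be trained only on person A's faces
decoder_b = Decoder()  # would be trained only on person B's faces

# Training (sketch): each decoder learns to reconstruct its own person
# from the shared compressed features.
loss_fn = nn.MSELoss()
faces_a = torch.rand(8, 3, 64, 64)  # stand-in batch of person A's face crops
recon_a = decoder_a(encoder(faces_a))
loss_a = loss_fn(recon_a, faces_a)

# The "swap": encode person A's face, then decode with person B's decoder,
# producing person B's likeness in person A's pose and expression.
swapped = decoder_b(encoder(faces_a))
print(swapped.shape)  # torch.Size([8, 3, 64, 64])
```

Because the encoder is shared, it learns pose and expression common to both faces, while each decoder learns the identity-specific details; that division of labor is what makes the swap possible.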

While this technology could pose a serious threat to politics and news, 96 percent of deepfakes are actually pornographic, like the ones made with Bell's face.

Almost all of these videos feature the likeness of women who haven't consented, according to a report from Deeptrace. Henry Ajder, who wrote that report, told Vox that many deepfake videos feature celebrities, but more and more are starting to use images of ordinary people taken from social media.

Australian law graduate Noelle Martin told Vox that a friend informed her that there was a pornographic deepfake circulating online with her face. She noted there is a lot of talk about the challenges that come with this kind of technology, but what isn’t often discussed is the impact it is having on individuals right now.

“Not in a few years, not in a couple of months. Right now,” said Martin.


“I think it’s important to not ignore red flags in the world. When new technologies start popping up, I think we’re screwed if we don’t acknowledge the detriment that it could bring to us,” said Kristen Bell.
