This page shows the source for this entry, with WebCore formatting language tags and attributes highlighted.

Title

Everybody will be a porn actor

Description

The article <a href="https://www.nbcnews.com/tech/internet/deepfake-porn-ai-mr-deep-fake-economy-google-visa-mastercard-download-rcna75071" source="NBC News" author="Kat Tenbarge">Found through Google, bought with Visa and Mastercard: Inside the deepfake porn economy</a> described something I'd been only vaguely aware of. <bq>Most deepfake videos are of female celebrities, but creators now also offer to make videos of anyone. A creator offered on Discord to make a 5-minute deepfake of a “personal girl,” meaning anyone with fewer than 2 million Instagram followers, for $65.</bq> Customized porn of anyone is novel to me. I'd never seen it hypothesized in any of the incredible multitude of stories I've read. <img attachment="image.jpg" align="right">Jesus, it's one thing for a celebrity like Scarlett Johansson, but can you imagine if schoolteachers have to worry about their students viewing them through the lens of the hardcore pornography they've been faked into? The boys and girls pool their money and put Ms. Jenkins on her own highlight reel. An AI facilitates the whole operation. Everyone knows that this can't be stopped. They will try, though. They will shut down access for everyone; they will make up sweeping rules that are far too broad, that stifle reasonable expression and creativity. They will try to stop this from happening---and it absolutely cannot be stopped, not without turning society into an authoritarian hellscape. And, even then, people will find a way; they will just have been criminalized for doing what they were absolutely going to do anyway, which is to see Ms. Jenkins engaged in enthusiastic intercourse. And you might say, well, Ms. Jenkins should have known what she was getting into because she's an 8 or 9 and she became a middle-school teacher anyway. But this also means that anyone can make porn of anyone.
Maybe having more video helps make the fake more convincing, but even with only a picture or two, have a look online to see how well they can match that picture to an animated face or the face in a video. People who don't look too carefully will believe it. And someone will pay to make it because someone will think it's hilarious. <bq>“More and more people are targeted,” said Martin, who was targeted with deepfake sexual abuse herself. “We’ll actually hear a lot more victims of this who are <b>ordinary people, everyday people, who are being targeted.</b>”</bq> Can you imagine a job interview where the interviewer has watched fake porn of the interviewee? Even knowing it's fake, they would naturally have their opinion influenced. Porn is embarrassing, but it can be explained away as too "ridiculous" to be true; what about faking mugshots or arrests or trials? How long until there's a service for people to torpedo rivals by generating FUD that HR will believe, or that HR's AI will believe? Powerful tools. Completely irresponsible herd into which they're being released. <bq>“It’s not a porn site. It’s a predatory website that doesn’t rely on the consent of the people on the actual website,” Martin said about MrDeepFakes. “The fact that it’s even allowed to operate and is known is a complete indictment of every regulator in the space, of all law enforcement, of the entire system, that this is even allowed to exist.”</bq> I understand the angry reaction, but I don't think regulation can possibly stop this. I think people will have to become less sensitive, and society will have to be less trusting that all content is real. Maybe a <a href="https://en.wikipedia.org/wiki/The_Light_of_Other_Days">Light of Other Days</a> quantum leap is needed. We kind of have this already with ubiquitous public filming and facial recognition.
We tried to avoid it, but the relentless march of authoritarianism, coupled with purely-for-profit capitalism, has created surveillance states everywhere that can afford them. Or maybe a de-pruding of society is needed, where nobody cares whether you've done porn, just like nobody cares whether you've played softball. <bq>Martin successfully campaigned to outlaw nonconsensual deepfakes and image-based sexual abuse, but, she said, law enforcement and regulators are limited by jurisdiction, because the deepfakes can be made and published online from anywhere in the world.</bq> You won't be able to stop this, unfortunately. Only an increase in ethics across the worldwide population would devalue this business model, whereby people would refuse to consume faked content, and that obviously isn't going to happen. Maybe we'll get something like organic-content labels?