Have you been targeted by nonconsensual deepfake pornography?

By Clare Duffy, CNN

New York (CNN) — Rapid advancements in artificial intelligence have made it easier for bad actors to create nonconsensual deepfake pornography. While "revenge porn" — a form of harassment that involves posting explicit images of someone online without their consent — has been an issue for years, new AI tools make it possible to artificially generate nude images that look strikingly real.

Women around the world, from Taylor Swift and Rep. Alexandria Ocasio-Cortez to high school girls in New Jersey and Texas, have been targeted by nonconsensual, AI-generated sexual images in recent months. Social media companies are facing new pressure to address the spread of these images, and lawmakers are now scrambling to find ways to crack down on the people who create them and the online tools that enable them.

Have you or your child been the target of this form of harassment? We want to hear about what it meant for your life and how (and whether) law enforcement or school officials responded. Tell us about your experiences in the form below. If you would prefer to comment anonymously, that can be discussed.

The-CNN-Wire™ & © 2024 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
