December 15, 2024 / 7:59 PM EST / CBS News
In October last year, a 14-year-old girl named Francesca Mani was sitting in her high school history class when she heard a rumor that some boys had naked photos of female classmates. She soon learned her picture was among them, but the images were doctored - created with artificial intelligence using what's known as a "nudify" website or app, which turns real photos of someone fully clothed into realistic-looking nudes. We've found nearly 30 similar incidents in schools in the U.S. over the last 20 months, and plenty more around the world. We want to warn you, some of what you'll hear - and see - is disturbing, but we think unveiling these "nudify" websites is important, in part because they're not hidden on the dark web: they are openly advertised, easy to use, and, as Francesca Mani found out, there isn't much that's been done to stop them.
Anderson Cooper: When you first heard the rumor, you didn't know that there were photos-- or-- or a photo of you?
Francesca Mani: No. We didn't know. I think that was like the most chaotic day I've ever witnessed.
Anderson Cooper: In a school, somebody gets an inkling of something, and it just spreads.
Francesca Mani: It's like rapid fire. It just goes through everyone. And so then when someone hears-- hears this, it's like, "Wait. Like, AI?" Like, no one thinks that could, like, happen to you.
Take the "Away For The Day" pledge and show that you support school policies that require students to put their cell phones away in lockers, backpacks, or other places all day.