
Spurred by Teen Girls, States Move to Ban Deepfake Nudes

Legislators in two dozen states are working on bills, or have passed laws, to combat A.I.-generated sexually explicit images of minors.

Caroline Mullet, a ninth grader, prompted her father, Mark, a Washington State senator, to work on a bill to ban A.I.-generated sexually explicit images of minors. The ban is set to take effect in June. Credit: Ruth Fremson/The New York Times

Natasha Singer has covered student privacy for The Times since 2013.

Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.

A few weeks later, she and other female students learned that a male classmate was circulating fake nude images of girls who had attended the dance, sexually explicit pictures that he had fabricated using an artificial intelligence app designed to automatically “strip” clothed photos of real girls and women.

Ms. Mullet, 15, alerted her father, Mark, a Democratic Washington State senator. Although she was not among the girls in the pictures, she asked if something could be done to help her friends, who felt “extremely uncomfortable” that male classmates had seen simulated nude images of them. Soon, Senator Mullet and a colleague in the State House proposed legislation to prohibit the sharing of A.I.-generated sexually explicit depictions of real minors.

“I hate the idea that I should have to worry about this happening again to any of my female friends, my sisters or even myself,” Ms. Mullet told state lawmakers during a hearing on the bill in January.

The State Legislature passed the bill without opposition. Gov. Jay Inslee, a Democrat, signed it last month.

States are on the front lines of a rapidly spreading new form of peer sexual exploitation and harassment in schools. Boys across the United States have used widely available “nudification” apps to surreptitiously concoct sexually explicit images of their female classmates and then circulated the simulated nudes via group chats on apps like Snapchat and Instagram.

