Fake Nudes, Real Harm

The Growing Problem of Deepfake Nudes in Middle Schools and High Schools

Anonymous       ⦿       September 6, 2024

In the 1978 blockbuster Superman, the hero uses his X-ray vision to peek at Lois Lane’s panties beneath her dress, opening a new avenue of fantasy for pubescent boys everywhere. Of course, X-ray vision has never existed, and pubescent boys everywhere have been spared the perverse dilemma of being endowed with such a superpower and simultaneously forbidden to use it. And then came AI.

The advent of generative AI and so-called “nudify” technology has introduced a real-world equivalent of X-ray vision with disturbing implications. In the last year and a half, multiple cases of boys using artificial intelligence to make deepfake nudes of girls or teachers at their school have been reported in Illinois, Texas, California, New Jersey, Indiana, and Florida. While it would be naive to think that these incidents amount to a handful of isolated cases, the actual prevalence of the problem is hard to discern. Regardless, the apparent ease of access to powerful AI tools designed for this express purpose is disconcerting. According to recent reporting by the New York Times, anyone of any age can make these deepfakes in a flash, with no training or special skills required. And the harm is real, regardless of whether the images are disseminated. So what is being done?

At the federal level, the FBI has recently issued a public service announcement to assure the public and potential culprits alike that the use of such technology on images of minors is an unqualified federal offense under current child sexual abuse material (CSAM) laws. Additionally, a bipartisan group of senators has introduced a bill, the “Take It Down Act,” which would make the publication of non-consensual intimate imagery, including AI-generated pornography, a crime. It would further require apps and websites to incorporate well-defined and easily accessible procedures for removing such content. At the state and local level, some schools are creating policies to deal with such incidents should they arise, and a handful of states have moved to criminalize such images. Compared to the pace of the technology, however, the response is painfully slow. Worse still, these policies and laws are likely to be enforced only in cases where the images go public or are discovered to have been distributed privately. Thus, they function primarily to punish traffickers of such images, not to protect women and girls from being targeted in the first place.

We don’t need Plato’s parable of the Ring of Gyges to know that pubescent boys, whose prefrontal cortex is years away from being fully developed, could certainly not be trusted with X-ray vision. Lead underwear would become the latest in a long line of burdens women shoulder to protect themselves from ever-present invasive threats. But “lead underwear” is not the answer. Teaching our girls yet one more way to be on guard, to limit their freedoms, and to accept living in perpetual fear of victimization should not be part of the solution when the technology itself serves no practical, artistic, or otherwise tenable purpose whatsoever.

Of course, it is not only adolescent boys who would misuse a widely available “superpower.” And it is not only underage girls who are being victimized. There have been reports of adults, too, making deepfakes of minors. And women over the age of 18 are more or less considered “fair game,” having precious little in the way of protections. Deepfake pornography is already an established genre, with explicit images of public figures, politicians, movie stars, pop stars, and other celebrities in wide circulation.

While creating rules and legislation to punish the individuals who create and/or circulate deepfakes is a necessary step in addressing the problem, it leaves the tools of victimization fully intact and accessible to all. “Nudify” technology, and pornography-enabled generative AI that can incorporate images of real people, are the problem. Laws addressing these emerging technologies themselves, technologies that have the capacity to threaten the privacy and safety of women and girls, are urgently needed. Any legislation that leaves these tools unrestricted and their purveyors operating with impunity leaves bad actors free to create lurid images privately, and the law able to act only upon the discovery and report of their circulation. Such incomplete legislation is unacceptable, as it leaves the public at large vulnerable.

ATLANTA/DECATUR

According to open records officers with the respective agencies, there has been one incident reported to the Decatur Police Department of the sexual exploitation of a minor involving a modified image, and [awaiting response] incidents reported to the Atlanta Police Department, since August 2020.

Georgia law (Ga. Code § 16-11-90 (2022)) classifies the electronic transmission of nude or sexually explicit images or videos of persons over the age of 18 as a felony only if the intent is to harass or cause financial harm to the depicted individual. This includes deepfakes.

RESOURCES

If there are images—fake or real—that you need to have removed from the internet:
The National Center for Missing & Exploited Children (NCMEC) has a website called Take It Down that provides tools and information for this purpose. Click here to download a one-page informational PDF.

If you need to report a crime or incident involving inappropriate images of a minor:
For the City of Atlanta, you can contact _____________ at ________________.
For the City of Decatur, you can contact _____________ at _________________.

What preemptive action can you take?
Parents should be sure that their adolescent children know that creating, possessing, and/or distributing inappropriate images of people under the age of 18, whether real or fake, is not only harmful but also a crime.

Parents should talk to their adolescent children about the extensive damage that is done by the creation and circulation of such images.

REFERENCES / FURTHER READING

  • American Legislative Exchange Council. “Stop Deepfake CSAM Act.” Accessed June 30, 2024. (link)
  • American Legislative Exchange Council. “Stop Non-Consensual Distribution of Intimate Deepfake Media Act.” Accessed June 30, 2024. (link)
  • CBS 42. “From Deepfake Nudes to Incriminating Audio, School Bullying Is Going AI,” June 6, 2024. (link)
  • Chavez, Bridget. “New State Deepfake Image Protection Law Goes into Effect June 6.” KIRO 7 News Seattle, April 24, 2024. (link)
  • “Child Sexual Abuse Material Created by Generative AI and Similar Online Tools Is Illegal.” Accessed June 30, 2024. (link)
  • Hadero, Haleluya. “Teen Girls Are Being Victimized by Deepfake Nudes. One Family Is Pushing for More Protections.” The Intelligencer (blog). Accessed June 27, 2024. (link)
  • Justia Law. “2022 Georgia Code, Title 16, Chapter 11, Article 3, Part 3, § 16-11-90. Prohibition on Nude or Sexually Explicit Electronic Transmissions.” July 10, 2024. (link)
  • Malenfant, Marley. “What Is the ‘Take It Down’ Act? Texas Bill Seeks to Ban Deepfake Revenge Porn Online,” June 25, 2024. (link)
  • “Malicious Actors Manipulating Photos and Videos to Create Explicit Content and Sextortion Schemes.” Accessed June 30, 2024. (link)
  • Mulvihill, Geoff. “What to Know about How Lawmakers Are Addressing Deepfakes like the Ones That Victimized Taylor Swift.” AP News, January 31, 2024. (link)
  • NBC News. “Sen. Ted Cruz Aims to Hold Big Tech Accountable for AI Deepfakes.” Accessed June 30, 2024. (link)
  • Nickel, Dana. “AI Is Shockingly Good at Making Fake Nudes — and Causing Havoc in Schools.” POLITICO, May 29, 2024. (link)
  • Pfefferkorn, Riana. “Teens Are Spreading Deepfake Nudes of One Another. It’s No Joke.” Scientific American, June 10, 2024. (link)
  • Singer, Natasha. “Spurred by Teen Girls, States Move to Ban Deepfake Nudes.” The New York Times, April 22, 2024, sec. Technology. (link)
  • ———. “Teen Girls Confront an Epidemic of Deepfake Nudes in Schools.” The New York Times, April 8, 2024, sec. Technology. (link)
  • Take It Down. “Take It Down.” Accessed June 30, 2024. (link)
  • Tavernise, Sabrina, Natasha Singer, Sydney Harper, Shannon M. Lin, Marc Georges, Marion Lozano, Elisheba Ittoop, Dan Powell, and Chris Wood. “Real Teenagers, Fake Nudes: The Rise of Deepfakes in American Schools.” The New York Times, June 7, 2024, sec. Podcasts. (link)
  • Tenbarge, Kat, and Liz Kreutz. “A Beverly Hills Middle School Is Investigating Students Sharing AI-Made Nude Photos of Classmates.” NBC News, February 27, 2024. (link)
