Question: I’ve been hearing a lot about deepfakes and how apps are letting people create naked images of other people. What is this? Is it pornography and/or AI (artificial intelligence)?
Answer: This is an important question about a serious topic. Here are some helpful explanations:
Although generative AI (artificial intelligence that can generate text, images, or sound) has many potentially positive uses, it can also be used in harmful ways, especially in the form of deepfakes. Deepfakes are photos or videos that have been manipulated to show a person doing something they did not actually do. People can fabricate deepfakes using AI models that are trained on large collections of real images or videos and then blend that existing material with newly generated content, producing media that looks real but is fake. Deepfakes often show up in politics and in the porn industry, where women’s or children’s faces are superimposed onto bodies that are not their own, creating a harmful illusion that amounts to non-consensual, image-based sexual abuse. Intimate deepfakes can also be made with so-called “undressing apps,” in which a person uploads a picture of someone and the app generates a fake nude photo. Teens have used these apps to abuse or bully others.
Child sexual abuse material (CSAM) refers to images or videos of children who are nude, performing sexual acts, or being abused. It is illegal, and technology companies continuously work to identify and remove it. Since generative AI became widely available, some predators have used it to create deepfake CSAM. The creation, display, and distribution of fake sexual images or videos can lead to trauma, victim-shaming, and reputational damage for youth who never took part in the depicted acts but are tied to them through deepfake technology.
Another form of media in this industry is synthetic pornography (SP), which is different from deepfakes. Deepfakes place real people’s identifying features on bodies other than their own, while synthetic pornography depicts entirely AI-generated bodies, belonging to no real person, engaging in sexual activities. Although SP does not show real children’s faces, it is important to know that it can still depict children’s bodies and can be harmful.
Deepfakes, synthetic pornography, and CSAM are all forms of virtual, image-based sexual abuse. This explicit content may be disseminated online via email, social media, or pornographic sites, where children may access it.
To learn more about the potential impacts of AI-altered images, see our portal response The impact of deepfakes, synthetic pornography & virtual child sexual abuse material.
If you’re a parent or caregiver of a child involved in AI-generated, image-based sexual abuse, here are tips on how to move forward and resources that may be helpful.
References
- Eelmaa, S. (2022). Sexualization of children in deepfakes and hentai. Trames, 26(2), 229–248.
- Okolie, C. (2023). Artificial intelligence-altered videos (deepfakes) and data privacy concerns. Journal of International Women’s Studies, 25, 13.
- Kim, S., & Banks, J. (2024). Expanding experiences and anxieties: Gender-identity and sexual-orientation differences in attitudes toward synthetic pornography. Porn Studies, 1–19.
- Tenbarge, K. (2024, July 24). Defiance Act passes in the Senate, potentially allowing deepfake victims to sue over nonconsensual images. NBC News.
Age: 10-24
Topics: deepfakes, synthetic sexual images, synthetic pornography, sexual abuse, sexual violence, image-generative AI
Role: Parent
Last Updated: 03/12/2025
Source: American Academy of Pediatrics