
Advances in technology have opened new avenues for abuse by making deepfake imagery easier to create and access.
Computer-generated nonconsensual intimate imagery, commonly known as “deepfakes,” consists of images or videos manipulated with artificial intelligence to make it look or sound like someone did or said something they never actually did. Though people of both genders have been victims of deepfake pornography, women and girls are the primary targets.
“Utahns should not have to worry about predators stealing their images and creating deepfake pornography. It is alarming how many high school students across the country have become victims of this exploitation and harassment,” Utah Attorney General Derek Brown said on X. “As A.I. develops, we must stay ahead of the curve and protect our children from this disturbing trend.”
Brown co-led a bipartisan coalition of 47 attorneys general urging search engines such as Google and Yahoo to implement stricter guardrails that prevent abusers from creating and selling abusive content.
“Search engines already limit access to harmful content such as searches for ‘how to build a bomb’ and ‘how to kill yourself,’” per the press release. “The attorneys general urged these companies to adopt similar measures for searches such as ‘how to make deepfake pornography,’ ‘undress apps,’ ‘nudify apps,’ or ‘deepfake porn.’ The coalition also urged payment platforms to deny sellers the ability to use their services when they learn of connections to deepfake NCII tools and content and remove those sellers from their network.”
Youth are very aware of deepfakes
Not all deepfakes pose risks, and some are used for harmless humor, but a 2023 report found that 98% of deepfake content online is pornographic.
The report by Home Security Heroes also found that in just one year, from 2022 to 2023, the amount of deepfake pornography online grew from 3,725 to 21,019, a 464% increase. It also noted that 99% of all victims of deepfake pornography online are women.
In their letters to the legal representatives of Google Search, Microsoft Bing and Yahoo! Search, the attorneys general emphasized the harm that deepfakes can inflict on victims.
“There are mobile applications and web applications that are either free or low cost to download and create deepfakes which could be then used maliciously in scenarios like cyberbullying,” according to the Department of Homeland Security. “The use of deepfakes in cyberbullying cases will likely increase and become more of a threat as time goes on, especially for younger generations who frequently use technology and social media.”

Another study found that many young people are aware of deepfakes and know someone who has been victimized by them.
“Among teens, 1 in 10 (10%) reported personally knowing someone who had deepfake nude imagery created of them, and 1 in 17 (6%) disclosed having been a direct victim of this form of abuse,” per a Thorn report.
Among the 1,040 respondents aged 9 to 17, 11% said they believed their friends or classmates had created a deepfake, while another 10% said they would rather not say.
