The Take It Down Act has been introduced by Senators Ted Cruz (R-Texas) and Amy Klobuchar (D-Minnesota) to "protect children victimized by the distribution of deepfake nude and sexually exploitative images in which they are depicted."
Cruz recently told the "Fox & Friends" television program the measure does two things:
"Number one, if you post or share nonconsensual intimate images, either real images or deepfakes, it's a crime," he said. "It's a felony punishable by up to two years in prison if the victim is an adult and up to three years in prison if the victim is a child."
"Secondly, and this is a really important piece of it, it puts an obligation on the Big Tech companies that if they're notified by a victim or the victim's family, they have 48 hours to take the pictures down and to take all the copies down," the senator detailed.
The American College of Pediatricians says it supports legal efforts to eliminate the creation and dissemination of sexually explicit imagery, especially imagery that involves children.
"Whether those images are real or are deepfake versions created with artificial technologies, pornographic images are very harmful to the mental and emotional health of children, as are the social ramifications for victims of deepfake images," the conservative advocacy group states in a related press release.
Family Policy Alliance calls this bill a win for kids, parents, schools, and common sense.
"There's no doubt that there are dangers on the internet, but the burgeoning AI element has presented new obstacles that must be quickly countered before they get out of hand," adds Sandra Kirby, director of government affairs at the American Principles Project. "Instead of worrying about a mere catfish on a dating app, AI can now construct any scenario with any face, whether they have consented to it or not."
She points out that this opens the door to fabricating local, national, and global scandals and, more personally, to producing obscene content that can radically damage a victim's personal and professional life.
"This isn't isolated to adults," Kirby notes. "It's affecting minors."
Elliston Berry, a victim of deepfake revenge porn, and her mother, Anna McAdams, joined Sen. Cruz in his interview with "Fox & Friends."
"I woke up on a Monday morning with numerous calls from my friends telling me that these fake AI photos were around," Berry recounted. "It was terrifying. I felt it was all out of my control, and I just felt helpless."
Cruz said Taylor Swift can make a request and have something taken down quickly, but that is not the case for non-celebrities like Elliston Berry.
Kirby says that is exactly why the Take It Down Act is so important, and she credits Sen. Cruz for taking care to "narrowly tailor the language so that the bill will not infringe any First Amendment rights."
Deepfake porn production reportedly increased 464% in 2023 over the prior year.