President Donald Trump signed the “TAKE IT DOWN Act” into law on May 19, 2025, creating the first federal ban on sharing sexually explicit images without consent. The bipartisan legislation targets both traditional revenge porn and the growing threat of AI-generated deepfakes.
“With the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will. This is wrong, and it’s just so horribly wrong,” Trump said during the Rose Garden signing ceremony. “It’s a very abusive situation like, in some cases, people have never seen before. And today we’re making it totally illegal.”
What the New Law Does
The law makes it a federal crime to “knowingly publish” or threaten to publish intimate images without a person’s consent. Offenders face up to three years in prison.
Websites and social media platforms must now remove reported non-consensual intimate imagery within 48 hours of being notified. Companies that fail to comply could face civil penalties and Federal Trade Commission enforcement actions.
The legislation passed with overwhelming bipartisan support: 409-2 in the House and by unanimous consent in the Senate.
First Lady’s Influential Role
First Lady Melania Trump championed the legislation, making rare public appearances that included a Capitol Hill event in March to promote the bill.
At the signing ceremony, she called the law “a powerful step forward” and described artificial intelligence and social media as “digital candy for the next generation – sweet, addictive and engineered to have an impact on the cognitive development of our children, but unlike sugar, these new technologies can be weaponized, shape beliefs and, sadly, affect emotions and even be deadly.”
In an unusual move, Trump invited the First Lady to add her signature to the bill after he signed it. “She deserves to sign it,” he said.
The Teen Who Inspired Change
The legislation was partly inspired by Elliston Berry, a Texas teenager whose struggle highlighted gaps in existing laws. In October 2023, when Berry was 14, innocent photos of her and friends were edited using AI to appear nude and then shared on Snapchat. Her mother faced significant hurdles getting the images removed.
Berry, now 16, attended the signing ceremony. Senator Ted Cruz noted, “Elliston was the impetus for this bill.”
The Deepfake Crisis in Numbers
Research shows that deepfake pornography has increased dramatically. Some reports indicate a 460% rise in cases in 2024 compared to the previous year. A staggering 98% of all deepfake videos found online are pornographic, and 99% use women’s likenesses.
A recent survey by the nonprofit Thorn revealed that 1 in 8 young people aged 13-20 know someone targeted by deepfake nude imagery, and 1 in 17 have been targeted themselves.
Mixed Reactions
While the law received broad support, digital rights groups have raised concerns. Organizations such as the Electronic Frontier Foundation warn that the bill's language may be too broad, potentially sweeping in legitimate content while lacking safeguards against frivolous or bad-faith takedown requests.
The Cyber Civil Rights Initiative called the takedown provision “unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse.”
Tech companies including Meta, TikTok, Google, and Snapchat have publicly supported the legislation.
Filling a Legal Gap
Before this federal law, protection against non-consensual intimate imagery varied widely across states. About 20 states had specific laws against deepfake distribution as of early 2025, but the patchwork approach created challenges since online content easily crosses state lines.

The “TAKE IT DOWN Act” represents one of Trump’s few legislative achievements in his second term. It’s only the sixth bill he has signed into law since taking office in January 2025.
As deepfake technology becomes more accessible and realistic, advocates hope this new federal protection will provide victims with clearer paths to justice and push platforms to respond more quickly when harmful content is reported.