U.S. President Donald Trump has signed a new law making it illegal to share non-consensual, sexually explicit images online. The law covers both AI-generated deepfakes and real photos.
The Take It Down Act (via Engadget) criminalizes "knowingly publishing" or threatening to publish fake depictions of real people in explicit situations online. It requires social media platforms to remove the offending material within 48 hours of a victim's notice.
According to Surfshark, deepfake incidents have already begun to surge in 2025. The report counts close to 180 reported deepfake incidents in the first quarter of 2025 alone, a 19% increase over the total for all of 2024.
Of the 179 reports in the first quarter, 53 were explicit incidents and 48 involved online fraud. Political incidents accounted for 40, and 38 miscellaneous reports rounded out the total.
The new law will require all online platforms to create a system for removing images on request. The Electronic Frontier Foundation (EFF), a non-profit organization that protects individual liberties online, said the law has "major flaws."
The EFF said the new law "lacks critical safeguards against frivolous or bad-faith takedown requests." The foundation said most services will "rely on automated filters" and requests could be used to "flag legal content, from fair-use commentary to news reporting."
The EFF also argued that the short 48-hour window makes it harder for smaller platforms "to verify whether the speech is actually illegal," and that in its current form, the act forces platforms "to actively monitor speech, including speech that is presently encrypted."