Govbase

TAKE IT DOWN Act

May 19, 2025 – April 8, 2026

Where Things Stand

The federal government is now enforcing the TAKE IT DOWN Act, which targets people who publish nonconsensual intimate images, including AI-generated deepfakes. The law makes distributing these images a federal crime and requires websites to remove them within 48 hours of a valid request. In April 2026, federal prosecutors secured the first conviction under the law, against an Ohio man, following a 2025 legislative push that avoided the typical funding fights.

Key Statements

FBI Cincinnati

Ohio man pleads guilty to cyberstalking exes, creating AI-generated obscene material of adults and children... believed to be first defendant in nation convicted of violating the Take It Down Act.

This post confirms the first successful federal prosecution and conviction under the new law.

U.S. Department of Justice

The defendant is the first in nation convicted of violating the Take It Down Act championed by @FLOTUS, and signed into law by @POTUS last year.

This official statement links the conviction to the specific legislation and the administration's enforcement efforts.

Who This Affects

5 groups

Hurts

Criminal Record

People convicted under this law face up to 2 years in prison for offenses involving adults and up to 3 years for offenses involving minors, plus criminal forfeiture of property and mandatory restitution. A federal conviction for publishing nonconsensual intimate images would create a permanent criminal record with serious long-term consequences for employment and housing.

Mixed

Small Business Owner

Small businesses that operate websites or apps hosting user-generated content — such as forums, social platforms, or dating apps — must now build a notice-and-removal system that meets the law's requirements within one year. While the law protects platforms acting in good faith from liability for removing content, compliance costs could be a burden for smaller operators.

Gig Worker

Gig workers who create or distribute digital content — including freelance developers building apps with user-generated content features — may need to ensure the platforms they work on comply with the new notice-and-removal requirements. Those involved in AI-generated content creation also face new legal boundaries around deepfake intimate imagery.

Helps

Student

Young people, especially teens and college students, are among the most common targets of nonconsensual intimate images and AI-generated deepfakes. This law gives them a clear path to get harmful content removed within 48 hours and makes it a federal crime to share or threaten to share such images, providing stronger protections against cyberbullying and sexual exploitation.

LGBTQ

LGBTQ individuals face disproportionate risks of being targeted with nonconsensual intimate images, sometimes used as tools of harassment, outing, or blackmail. This law provides federal criminal protections and a mandatory platform removal process that can help limit the spread of such content and deter potential abusers.

Policies

The TAKE IT DOWN Act, enacted as S. 146, was signed into law in May 2025 to create federal penalties for digital exploitation. It established the first federal framework criminalizing nonconsensual AI-generated intimate deepfakes and required tech platforms to act on reported content within 48 hours.

News

Melania Trump touts first conviction under AI deepfake abuse law

Washington Examiner (Center Right)

Girls at elite prep school threatened in 'revenge porn blast' as parents shell out $63k a year to attend

(Right)

Political Response

0 statements

Analysis generated by AI. Always verify with official sources.