CLEAR Act
Senate Bill Would Force AI Companies to Disclose Copyrighted Training Data or Face $2.5M Fines
Key Points
- Tech companies building AI tools would have to tell the government which copyrighted books, songs, or images they used to train their systems. This rule would apply to both new AI models and ones that are already being used by the public.
- The government would put this information into a public online database. This allows artists, writers, and musicians to see if their work was used to teach an AI without their knowledge or permission.
- If a company fails to report its training data, the creators of those works could sue. A court could fine the company at least $5,000 for each unreported work, up to a total of $2.5 million in fines per year.
- This policy aims to make AI development more transparent. By knowing what data is being used, the public and creators can better understand how AI models are made and ensure creators are treated fairly.
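The penalty structure described above can be sketched as a simple floor-plus-cap calculation. This is an illustrative sketch only: the $5,000-per-work minimum and the $2.5 million annual cap come from this summary, and the function name is hypothetical, not anything defined in the bill text.

```python
# Illustrative sketch of the minimum penalty described in this summary.
# Figures ($5,000 per work, $2.5M/year cap) are taken from the summary,
# not verified against the bill text.
MIN_FINE_PER_WORK = 5_000
ANNUAL_CAP = 2_500_000

def minimum_annual_penalty(unreported_works: int) -> int:
    """Statutory floor: at least $5,000 per unreported work,
    capped at $2.5 million in total fines per year."""
    return min(unreported_works * MIN_FINE_PER_WORK, ANNUAL_CAP)

print(minimum_annual_penalty(10))     # 50,000
print(minimum_annual_penalty(1_000))  # hits the 2,500,000 cap
```

Note that the cap binds once 500 or more works go unreported; beyond that point the minimum fine no longer grows with the number of works.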
Impact Analysis
Personal Impact
Small businesses and startups building generative AI products would need to catalog and report every copyrighted work in their training datasets to the Copyright Office before releasing their models. This creates a significant compliance burden, especially for smaller companies that may lack legal teams, but it also levels the playing field by applying the same transparency rules to everyone. Companies face fines of at least $5,000 per unreported work, up to $2.5 million per year, plus potential injunctions halting their AI products.
Milestones
Read twice and referred to the Committee on the Judiciary.
Sent to a congressional committee for expert review. The committee decides whether this bill moves forward.
Introduced in Senate
The bill was officially filed and given a number. It now enters the legislative queue.
Votes
No votes have been recorded for this legislation yet.
Source Information
Document Type
Congressional Bill
Official Title
CLEAR Act
Data Sources
Sponsor
Cosponsors (1)
Analysis generated by AI. Always verify with official sources.
