Congress · In Committee

Senate Bill Targets AI-Powered Voice and Video Scams Impersonating Government, Businesses

Also known as: Artificial Intelligence Scam Prevention Act

Legislative Progress

Filed → Review → Senate → House → President

Key Points

  • Makes it illegal to scam people by pretending to be a government office, a business, or an official—especially using fake voices or fake images made by artificial intelligence.
  • Lets the Federal Trade Commission go after people who run these scams and also people who knowingly help them pull it off.
  • Requires callers and texters to clearly tell you at the start if a call or text is using artificial intelligence to sound like a real person.
  • Expands scam rules beyond phone calls to include text messages and video calls, so scammers can’t dodge the law by switching platforms.
  • Pushes better reporting and public warnings, including a scam info website and a joint group to help retailers, banks, and wire services spot gift card and transfer scams.
Consumer Protection · Artificial Intelligence · Telecommunications · Criminal Justice

Milestones

2 milestones · 2 actions

Dec 16, 2025 · Senate

Read twice and referred to the Committee on Commerce, Science, and Transportation.

Dec 16, 2025

Introduced in Senate

What Happens Next

Projected impacts based on AI analysis

After the law takes effect and companies update their systems

AI disclosures start appearing at the beginning of some automated calls and scam-like texts

You may hear or see a clear notice that AI is being used, making it easier to spot a message that might be trying to trick you.

As soon as practical after enactment

FTC updates its scam reporting website to make AI scams easier to find by region and scam type

It should be easier to look up current scam patterns near you and quickly find where to report a scam or get help.

After agencies put the new complaint-handling steps in place

New procedures ensure scam complaints are logged and shared with law enforcement faster

If you report a scam, your report is more likely to be routed quickly to the right investigators, which can help stop repeat scammers sooner.

Within about 270 days after enactment (roughly 9 months)

FCC issues rules to carry out the robocall and scam-text changes

Phone and messaging networks may change how they detect, label, or block illegal robocalls and high-volume scam texts.

Within about 180 days after enactment (roughly 6 months), then yearly

FTC releases a public report tracking AI-enabled scams and trends like voice cloning

You may see clearer national warnings about the newest scam tricks, and lawmakers may use the report to push stronger protections later.

Likely within the first year after enactment, then updates over time

Advisory group publishes model training and warning materials for stores and money transfer services

More stores may post scam warnings near gift cards and train workers to pause suspicious purchases, which can prevent losses.

Five years after enactment

Advisory group sunsets after five years

The extra coordination group would end unless renewed, so updates may slow down later unless agencies keep the work going.


Source Information

Document Type

Congressional Bill

Official Title

Artificial Intelligence Scam Prevention Act

Bill Number: S. 3495
Congress: 119th Congress
Chamber: Senate
Latest Action: Read twice and referred to the Committee on Commerce, Science, and Transportation.

Sponsor

Cosponsors: 1 (Republican: 1)

Analysis generated by AI. While we strive for accuracy, this should not be considered legal or professional advice. Always verify information with official government sources.