Govbase
Congress · In Committee · H.R. 7757

Congress Proposes Major New Rules to Protect Kids on Social Media, Video Games, and AI Chatbots

KIDS Act

Legislative Progress

House → Senate → President → Law

Key Points

  • Social media companies would be required to turn on the highest privacy settings by default for users under age 17. This includes limiting who can message them and stopping the apps from automatically recommending their profiles to strangers. Parents would also get new tools to track their child's screen time and manage who they talk to online.

    From policy text

    A provider of a covered platform shall ensure that, in the case of a user of or visitor to the covered platform who the provider knows is a minor, the default setting of any safeguard described in paragraph (1) is the option available on the covered platform that provides the most protective level of control with respect to privacy and safety for such user or visitor.
  • Websites that primarily host adult content would have to use age-verification technology to block minors from entering. While the bill says companies cannot be forced to collect government IDs, they must use effective, commercially available methods to ensure users are adults before they can view harmful material.

    From policy text

    Nothing in this section may be construed to require the submission of government-issued identification of any individual to a covered platform or a third party contracted by a provider of a covered platform to use a technology verification measure.
  • The bill bans 'disappearing' (ephemeral) messages for all minors and bars children under 13 from using direct messaging features entirely. Online video games would also have to include parental controls that let adults limit who their children can talk to and restrict in-game purchases.

    From policy text

    A provider of a covered platform may not offer, provide, or enable any ephemeral messaging feature of such covered platform to any covered user of the covered platform.
  • Companies that make AI chatbots would have to clearly tell kids they are talking to a computer, not a person. These bots would be required to suggest that users take a break after three hours of continuous use and must provide links to suicide prevention hotlines if a child mentions self-harm or a crisis.

    From policy text

    ensure that a chatbot of the chatbot provider advises a covered user of the chatbot to take a break from the chatbot at the point at which a continuous and uninterrupted interaction of such covered user with such chatbot has lasted for 3 hours
  • The Federal Trade Commission and state officials would have the power to fine companies that fail to follow these safety rules. Additionally, the government would launch new studies to understand how social media affects teen mental health and how drug dealers use online platforms to reach young people.

    From policy text

    In any case in which the attorney general of a State, or an official or agency of a State, has reason to believe that an interest of the residents of such State has been or is threatened or adversely affected by an act or practice in violation of this Act, the State, as parens patriae, may bring a civil action on behalf of the residents of the State
Technology · Digital · Criminal Justice · Education · Healthcare

Impact Analysis


Milestones

2 milestones · 2 actions

Mar 3, 2026 · House

Referred to the Committee on Energy and Commerce, and in addition to the Committee on the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.

Mar 3, 2026

Introduced in House

What Happens Next

Projected impacts based on AI analysis

90-180 days after enactment

Safe messaging rules and market research restrictions take effect

Platforms must stop offering disappearing messages to minors, ban direct messaging for kids under 13, and stop conducting market research on young users. Parents of teens 13-16 get new controls over who can message their kids.

1 year after enactment

Most new safety rules and parental tools go live

Social media platforms, video game providers, and AI chatbot companies must have all required safety features, parental controls, default privacy settings, reporting tools, and advertising restrictions fully operational. Adult content sites must have age verification systems in place.

18 months after enactment

First independent audits of platforms are due

Every covered social media platform must complete an independent third-party audit showing how well it protects minors. Results go to the FTC and are made public, giving parents and lawmakers a clear picture of which platforms are actually following the rules.

Source Information

Document Type

Congressional Bill

Official Title

KIDS Act

Bill Number: HR 7757
Congress: 119th Congress
Chamber: House of Representatives
Latest Action: Referred to the Committee on Energy and Commerce, and in addition to the Committee on the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.

Sponsor

Analysis generated by AI. Always verify with official sources.