VET Artificial Intelligence Act
Congress Proposes New Voluntary Standards to Test and Verify Artificial Intelligence Systems
Legislative Progress
Stalled
No legislative action in over 90 days.
Key Points
- Congress is proposing a plan to create a set of voluntary guidelines to help companies test their artificial intelligence (AI) systems. The goal is to make sure AI works the way it is supposed to and does not cause unexpected harm to the public.
- The National Institute of Standards and Technology (NIST) would lead the project. It would work with experts to decide how companies should check their own AI and when they should hire outside evaluators to verify that the technology is safe and reliable.
- These guidelines would be voluntary, meaning companies are not forced by law to use them. However, having a standard set of rules helps build trust with the public and makes it easier for businesses to prove their technology is responsible.
- The plan includes protections for consumer privacy and business secrets. It also creates a special committee of experts from colleges, tech companies, and civil rights groups to help figure out who is qualified to perform these AI safety checks.
- If passed, the first set of guidelines would be ready within one year. After that, the government would update the rules every two years to keep up with how fast AI technology changes.
Impact Analysis
Personal Impact
Small businesses developing or using AI systems would benefit from having a clear, publicly available set of voluntary guidelines for testing and evaluating their AI. Instead of each company figuring out safety and quality checks on its own, NIST would provide a shared playbook. This is especially helpful for smaller firms that lack the resources to develop their own internal evaluation frameworks.
Broader Impacts
Milestones
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Sent to a congressional committee for expert review. The committee decides whether this bill moves forward.
Introduced in Senate
The bill was officially filed and given a number. It now enters the legislative queue.
Votes
No votes have been recorded for this legislation yet.
Related News
Senate legislation to establish third-party AI audit guidelines is now bipartisan
The VET AI Act, introduced by Senators Hickenlooper and Capito, directs NIST to develop voluntary guidelines for third-party AI evaluations. The bill aims to create a pathway for independent evaluators to verify that AI development and testing comply with established safety guardrails.
Senate Revisits AI Testing, Development Assurance With VET AI Act
Bipartisan lawmakers reintroduced the VET AI Act in August 2025, tasking NIST with leading the development of specifications and recommendations for third-party evaluators. The bill seeks to provide independent verification of AI systems' safety, privacy, and dataset quality.
Mandated Third-Party AI Audits are Coming—Addressing AI's Socio-Technical Challenges Will Be Key
This analysis of the VET AI Act explores the shift toward independent verification in AI. While the bill proposes a framework for NIST-led voluntary guidelines, experts warn that audits must address human assumptions and socio-technical risks to be truly effective at mitigating harm.
Source Information
Document Type
Congressional Bill
Official Title
VET Artificial Intelligence Act
Data Sources
Sponsor
Cosponsors (3)
Analysis generated by AI. Always verify with official sources.