Fathom Letter to the House Energy & Commerce Committee: On Private AI Governance
RE: Private AI Governance
Dear Chairman Guthrie, Ranking Member Pallone, Subcommittee Chairman Bilirakis, and Subcommittee Ranking Member Schakowsky,
Thank you for holding this important hearing on AI Regulation and the Future of US Leadership. We applaud the Committee’s continued focus on identifying and addressing gaps in artificial intelligence (AI) policy, and your efforts to inform the public on the evolution of AI technology.
Independent. Nonpartisan. Nonprofit.
Fathom is a 501(c)(3) organization funded by philanthropists. We do not take donations from corporations, including frontier labs and the FAANG companies, or foreign entities associated with countries of concern.