REAL Act
USA | 119th Congress | HR-6571 | House
Updated: 12/10/2025
Status: Referred to the House Committee on Oversight and Government Reform.
The "Responsible and Ethical AI Labeling Act," or REAL Act, requires Federal officials to disclose when content they publish, disseminate, or release has been created or manipulated using generative artificial intelligence. To ensure transparency in government communications, covered content must carry a clear, conspicuous, and prominently displayed disclaimer, written in plain language, stating that the content was AI-generated or altered and briefly explaining the process and technology used.

Exceptions cover communications not intended for public release, classified content (for which a summary must be retained), minor graphic adjustments that do not materially alter meaning, and routine textual drafts reviewed by staff.

The Director of the Office of Management and Budget must issue regulations within 180 days to ensure compliance and to set specific guidelines for disclaimer formatting. The President, Vice President, and agency heads must also submit annual public audits to Congress detailing compliance with these provisions.

Violations of the Act require corrective action, including retraction of non-compliant content and issuance of a statement describing the violation. Federal officials may face disciplinary action, and contractors responsible for non-compliance may face penalties such as contract termination, reinforcing accountability for AI use in official capacities.