The Children Harmed by AI Technology Act, or CHAT Act, aims to regulate "companion AI chatbots"—software designed to simulate interpersonal or emotional interaction—in order to protect minor users. It mandates that entities operating these chatbots require all users to create accounts and undergo a commercially available age verification process to determine whether they are under 18 years old. This applies both to existing accounts, which will be frozen until verified, and to new accounts. For users identified as minors, covered entities must affiliate their accounts with a verified parental account and obtain verifiable parental consent before allowing access to the chatbot. The bill also requires immediate notification to the parental account of any interactions involving suicidal ideation, and mandates blocking minors' access to chatbots that engage in sexually explicit communication. Additionally, it stipulates that age verification data must be kept confidential and that users must be regularly notified via pop-ups that they are interacting with an artificial intelligence. The Federal Trade Commission (FTC) is tasked with issuing guidance and enforcing the Act, treating violations as unfair or deceptive practices. State attorneys general are also empowered to bring civil actions to enforce compliance. A safe harbor provision protects entities that demonstrate good-faith reliance on age information, compliance with FTC guidance, and adherence to industry standards for age verification.
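The access-gating sequence the summary describes (age verification for all users, then parental linking and consent for minors, then content restrictions) can be sketched as pseudocode. This is a minimal illustration only: every name and structure below is invented for this sketch, since the Act specifies obligations rather than an implementation.

```python
from dataclasses import dataclass

# Hypothetical model of the gates the bill summary lists; names are illustrative.
@dataclass
class Account:
    age_verified: bool = False        # all users must complete age verification
    is_minor: bool = False            # determined by the verification process
    parent_linked: bool = False       # minor account affiliated with a verified parental account
    parental_consent: bool = False    # verifiable parental consent obtained

def may_access_chatbot(acct: Account, chatbot_sexually_explicit: bool) -> bool:
    """Return True only if the account clears every gate the summary lists."""
    if not acct.age_verified:
        return False                  # unverified accounts remain frozen
    if acct.is_minor:
        if not (acct.parent_linked and acct.parental_consent):
            return False              # minors need a linked parental account and consent
        if chatbot_sexually_explicit:
            return False              # minors are blocked from sexually explicit chatbots
    return True
```

The sketch makes the ordering explicit: verification is a precondition for everyone, while the parental-affiliation and content gates apply only once a user is identified as a minor.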