The Kids Online Safety Act aims to protect minors on online platforms by imposing a comprehensive set of requirements on covered services. It establishes a duty of care requiring platforms to exercise reasonable care in their design and operation to prevent and mitigate foreseeable harms to minors. These harms include eating disorders, substance use, suicidal behaviors, depressive and anxiety disorders linked to compulsive usage, severe online harassment, sexual exploitation, and exposure to illegal products such as narcotic drugs, cannabis, tobacco, gambling, and alcohol.

The bill mandates that covered platforms provide minors with readily accessible and easy-to-use safeguards, such as limiting who can communicate with them, preventing public viewing of their personal data, and restricting design features that encourage compulsive usage, like infinite scrolling or autoplay. Platforms must also offer controls over personalized recommendation systems, including opt-out options, and restrict geolocation sharing. For children under 13, these safeguards must be enabled by default, and platforms must provide parents with tools to manage privacy settings, restrict purchases, and monitor time spent on the platform.

To enhance accountability, the Act requires large online platforms to issue annual transparency reports based on independent, third-party audits. These reports must detail the extent of minor access, commercial interests affecting minors, data on minor usage, and an assessment of the efficacy of safeguards and parental tools. Platforms are also prohibited from conducting market research on children under 13 and must obtain verifiable parental consent for research on minors under 17. A significant provision in Title II, the Filter Bubble Transparency section, requires online platforms using opaque algorithms to provide clear notice to users and offer an easy option to switch to an input-transparent algorithm.
This ensures users can view content without ranking or filtering based on user-specific data that was not expressly provided for that purpose. The Federal Trade Commission and state attorneys general are empowered to enforce these provisions, treating violations as unfair or deceptive acts or practices. The bill also establishes a Kids Online Safety Council to advise Congress on emerging risks and best practices for protecting minors online. While the Act preempts conflicting state laws, it allows states to enact laws offering greater protection. Importantly, the bill clarifies that it does not require platforms to collect additional age data or implement age-gating functionality, nor does it alter Section 230 of the Communications Act.
Forwarded by Subcommittee to Full Committee in the Nature of a Substitute (Amended) by the Yeas and Nays: 13 - 10.
Science, Technology, Communications
Kids Online Safety Act
USA | 119th Congress | S-1748 | Senate
Updated: 5/14/2025