Commerce, Manufacturing, and Trade Subcommittee, Judiciary Committee, Energy and Commerce Committee, Education and Workforce Committee
Digital Services Oversight and Safety Act of 2022

This bill establishes the Bureau of Digital Services Oversight and Safety within the Federal Trade Commission to provide oversight for content moderation by online platforms (e.g., social media companies). Content moderation includes actions taken by a platform to detect and address user content that is illegal or incompatible with the platform's community standards.

Specifically, the commission and the bureau may conduct investigative studies concerning the dissemination of illegal content or goods through the platforms, discrimination against individuals by the platforms, and the risk of harm caused by the malfunction or intentional manipulation of the platforms. The bill also provides whistleblower protections for individuals who assist with federal investigations related to such platforms.

Further, platforms must include in their community standards the policies and procedures for content moderation, and they must publish transparency reports about their content moderation.

Additionally, platforms with at least 10 million average monthly users must provide a complaint-handling system that allows users to appeal content moderation actions by the platform. Platforms with at least 66 million average monthly users must conduct risk assessments and report on the measures taken to minimize the risk of widespread dissemination of illegal content or content that violates the platform's community standards. The bill also provides for the regulation of automated content recommendations to users of these large platforms.

The bill provides for enforcement by the commission, as well as for various research activities related to online content moderation.
Timeline
Introduced in House
Referred to the Committee on Energy and Commerce, and in addition to the Committees on Education and Labor, and the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Referred to the Subcommittee on Consumer Protection and Commerce.
H.R. 6796 | 117th Congress (USA) | House | Updated: 2/22/2022