IA Calls On Supreme Court To Weigh In On Section 230’s Crucial Role In Content Moderation
“Online content moderation efforts enabled by Section 230 make the best of the internet possible and help people enjoy safe experiences online.” – Jon Berroya, IA Interim President and CEO
Washington, DC — Today, Internet Association (IA) filed an amicus brief in Enigma Software Group USA, LLC v. Malwarebytes, Inc., calling on the Supreme Court to grant certiorari in recognition of Section 230’s critical importance in enabling online platforms to create safe and enjoyable experiences for their users. The brief demonstrates that Section 230’s subsection (c)(2) allows platforms to write and enforce codes of conduct, undertake robust content moderation efforts, and empower users to decide what content they wish to see online, without that activity creating additional legal liability.
“Online content moderation efforts enabled by Section 230 make the best of the internet possible and help people enjoy safe experiences online,” said IA Interim President & CEO Jon Berroya. “Section 230 ensures all online platforms can create and enforce codes of conduct and protect their users without fear of liability. It’s a critical time for the Supreme Court to protect online platforms’ ability to give users the tools to control their own online experiences.”
In support of the request that the Supreme Court review the case, the brief outlines how Section 230(c)(2) ensures platforms can develop flexible content moderation tools without fear of liability, and the risks the Ninth Circuit’s decision poses to those efforts:
- Section 230(c)(2) was designed to protect online service providers from liability for their efforts to moderate online content. From the filing:
- “Congress intended the statute to spur the development of tools for screening objectionable content, and it sought to encourage those efforts by protecting online service providers from liability for claims based on those efforts.”
- Section 230(c)(2) allows platforms to remove unlawful and offensive content without fear of liability. From the filing:
- “Most online service providers, including all of IA’s members, have adopted policies prohibiting various forms of material or activities they deem harmful, inappropriate, or improper… These rules are essential to protecting the provider’s ability to reliably offer safe, secure, and functional services. Without them, online platforms would often become inhospitable places, where harmful and offensive material might drown out higher-quality speech.”
- Section 230(c)(2) encourages platforms to develop tools that empower users to curate their own online experiences. From the filing:
- “Beyond protecting the efforts of online service providers to directly block or remove objectionable material, Section 230 also facilitates valuable content moderation in another way. Subsection (c)(2)(B) protects service providers for making available tools that enable their users to curate their online experience or avoid content they may not want.”
- The Ninth Circuit’s decision undermines the goals of Section 230 and threatens valuable content moderation tools. From the filing:
- “The majority’s holding means that a service provider may lose immunity for providing an otherwise protected filtering tool based on the mere allegation that the service provider (or user) allegedly acted with an improper motive or purpose.”
To read the full brief, click here.