
NTIA Section 230 Petition Raises Significant Legal and Policy Issues at FCC

Cooley Alert
August 7, 2020

The rules proposed by the US Department of Commerce’s National Telecommunications and Information Administration in its recent petition for rulemaking asking the Federal Communications Commission to interpret Section 230 of the Communications Decency Act would have a significant impact on a wide range of businesses that use the internet. At the same time, the petition raises a series of legal and policy questions about Section 230, the kind of protection it is intended to offer internet-based businesses and the extent of government power over those businesses. This alert highlights some of those issues and the implications for internet platforms. Comments on the petition are due at the FCC on September 2, and reply comments are due on September 17.

While the NTIA petition and President Donald Trump’s May 28 Executive Order are widely understood to be aimed at social media platforms, the rules proposed in the petition would have a much wider impact. Reflecting NTIA’s legal analysis, the proposed rules are drafted broadly, and as a result, would affect any internet platform that uses or accepts third-party content, from customer reviews, comments and chat to advertising, wire-service news stories and product descriptions. The rules would open new routes for a wide variety of claims, including defamation, fraud and failure to perfectly follow a company’s own moderation rules. Companies that could be affected have the opportunity in the comment period to explain how their businesses rely on the current Section 230 regime, the potential impact of the proposed rules on their businesses and the impact on innovation and growth of US-based internet businesses in general.

NTIA starts with the assumptions that Section 230’s immunity provisions were based on outdated internet business models and were intended to allow internet businesses to address only a limited range of objectionable content – essentially only sexual and violent content and targeted harassment. It argues that the courts have interpreted Section 230 too broadly and that its scope should be narrowed considerably.

In addition, NTIA justifies new regulation by arguing that there is considerable evidence of bias in moderation, even while conceding that “few academic empirical studies exist of the phenomenon of social media bias.” It concludes that social media platforms have gained much of their influence and power because current interpretations of Section 230 protect them. It also suggests there are significant barriers to entering the social media business that give existing platforms more power. The petition does not mention TikTok or new social media platforms that have emerged in response to the perceived bias of Facebook and Twitter.

The key arguments in the petition include:

  • FCC authority: The FCC has the power to adopt rules because Section 230 was incorporated into the Communications Act by Congress. NTIA relies mostly on a late-1990s Supreme Court decision about other provisions of the Telecommunications Act of 1996.
  • The scope of Section 230: Court interpretations of Section 230 are too broad and provide protection for content that is not truly third-party content. NTIA’s principal argument for this position is that Section 230 was intended to cover only situations in which internet platforms were serving as pure conduits for information provided by users. Internet platforms would retain immunity from liability if they provided third-party content in the order a user chooses or with a default setting (but only if the default setting was disclosed and the user could turn it off).
  • Obligation to take down inappropriate content: Section 230 should not protect internet platforms that do not take down third-party content when someone objects to that content. NTIA argues that, under defamation law, failure to take down content means the internet platform is acting as a publisher, not a conduit. While this argument is focused on defamatory content, it presumably would apply to other types of objectionable content as well. NTIA does not explain how this rule would interact with its proposed limits on moderation.
  • Limits on moderation protection: The structure and legislative history of Section 230 demonstrate that the immunity for good faith moderation was intended to cover only sexual, violent and harassing content. NTIA argues that the language in Section 230 covering moderation of content that is “otherwise objectionable” should be interpreted to include only content that is like content in the other three categories.
  • When moderation is in good faith: Good faith moderation is possible only with transparency about the content moderation dispute process, including “adequate notice, reasoned explanation, or a meaningful opportunity to be heard.” Based on this conclusion, NTIA proposes a rule that would create a four-part test to determine whether a platform is moderating content in good faith. Among other things, the test would require that moderation standards be applied consistently, that the platform have an “objectively reasoned belief” that its standards are consistent with Section 230, and that users receive timely notice and an opportunity to respond when content is removed or restricted based on moderation.
  • Disclosure of moderation policies: The FCC has authority under the Communications Act to require internet platforms that engage in moderation to publicly disclose their moderation policies. This authority comes through the FCC’s ancillary jurisdiction over information services and specifically under Sections 163 and 257 of the act, which empower the FCC to consider whether there are barriers to entry in the provision of communications services, including information services. These are the same provisions that the FCC relied on in its 2018 Restoring Internet Freedom Order to require broadband internet service providers to disclose their terms and conditions of service. NTIA argues that such requirements would reduce entry barriers for content providers that wish to operate across multiple platforms and assist companies that want to develop their own filtering tools for consumers, and it specifically mentions concerns that Google has blocked or reduced access to politically conservative content.
  • First Amendment: NTIA does not address any First Amendment issues raised by its proposed rules, including whether the potential limitations on immunity for moderation would result in the government favoring certain types of speech and whether the narrow definition of third-party content would have similar effects. The only mention of free speech considerations in the petition is a general statement about “First Amendment ideals” being endangered by the dominance of social media platforms.

Taken together, these arguments imagine a version of Section 230 that provides much less protection to internet platforms of all kinds and would make moderation of user-provided content much riskier than it is today. Under the proposed rules:

  • Section 230 protection would be narrowly circumscribed: Section 230 protection would be limited to unsolicited content posted by third parties and made available to other users without any intervention by the internet platform. NTIA argues that nearly any action involving third-party content, such as providing warnings about false content, prioritizing the content that users see, inviting someone to post content or choosing what third-party content to host, would eliminate Section 230 immunity. Some of these actions would eliminate Section 230 immunity not just for the specific content, but for the entire platform.
  • Section 230 immunity could be lost by failing to respond to takedown requests: To maintain immunity for claims concerning third-party content, internet platforms would have to develop and consistently implement systems to ensure that problematic content is removed from their sites in response to takedown requests.
  • Only certain moderation practices would be protected: The only forms of moderation that would be protected would be removing or restricting access to posts and suspending or banning users who violated moderation policies.
  • Moderation protection would be limited to specific categories of content: There would be no protection for moderation outside the limited bounds of the rules. For instance, under the proposed rules, moderation to remove racist comments or posts that are contrary to the underlying values of a religious site would not be protected by Section 230. While Section 230 does not itself impose liability, the loss of immunity would mean claims that previously were barred could proceed.
  • The consequences of not following published moderation policies could be severe: Any failure to follow published moderation policies would void an internet platform’s Section 230 immunity and separately subject it to sanctions for violating FCC rules.

While the proposed limitations on Section 230 could have a significant impact on internet platforms, it is perhaps more consequential that NTIA has proposed that the FCC, for the first time, directly regulate internet platforms by requiring them to publish their moderation policies. Although the initial requirements would be limited, they would open the door to more expansive regulation in the future. Such regulation could impose additional costs on companies that depend on the internet to reach customers and users and could restrict or prohibit current business practices.

There has been widespread opposition to the petition in the technology community, including from the Consumer Technology Association, the Computer and Communications Industry Association and the Center for Democracy and Technology, and recent statements by FCC Commissioner Michael O’Rielly suggest that the petition may have difficulty winning approval at the FCC. Nevertheless, there is support for changes to the Section 230 regime among both Republicans and Democrats. Consequently, businesses that could be hurt by the rules proposed by NTIA should take steps to support the current interpretation of Section 230 and the immunity it provides. Submitting comments to the FCC is the most effective way to ensure that concerns about the proposed rules are reflected in any final decision.

Cooley lawyers can assist in understanding the proposals in the petition for rulemaking and the implications of those proposals for businesses that depend on the internet and in preparing responses to the petition at the FCC and elsewhere in the federal government.
