That Was Then, This Is Now: Trump Administration Proposes Section 230 Reforms
Continuing its intense focus on internet platforms’ role in political debate and the liability protections they receive under the Communications Decency Act (CDA), 47 U.S.C. § 230, the Trump Administration this week submitted to Congress a legislative proposal that would substantially limit platforms’ Section 230 immunity. The Administration’s proposal joins several other bills that take aim at platform immunity. It is unclear whether any of these measures will advance in the near term, but they signal sustained, bipartisan concern about the breadth of Section 230 immunity. In any event, the Administration’s proposal offers a look at some of the far-reaching changes to Section 230 that are under consideration.
In a transmittal letter to Congress, Attorney General William Barr writes that, while Section 230 played a “beneficial role . . . in building today’s internet, by enabling innovations and new business models, . . .” internet platforms now “use sophisticated algorithms to suggest and promote content and connect users” and “can abuse this power by censoring lawful speech and promoting certain ideas over others.”
Under current law, “interactive computer services” – a term that courts have read broadly to include social media sites and other companies that allow users to post content – receive two forms of immunity. Section 230(c)(1) provides platforms with immunity (subject to exceptions for, among other things, federal criminal law and intellectual property claims) for third-party material that they host. Section 230(c)(2) provides platforms with immunity for voluntarily taking down or restricting access to information online, provided that they act “in good faith” and for one of the reasons enumerated in the statute (e.g., removing obscene or harassing material).
The Administration’s proposal would limit both types of immunity and could cause further changes in how platforms structure and manage their services. Three of the most significant changes are:
- No defense against civil claims by federal agencies. The Administration’s proposal would strip internet platforms of any Section 230 defense against civil enforcement actions brought by a federal agency. At least two defendants (Accusearch and LeadClick) in FTC enforcement actions have asserted, unsuccessfully, that Section 230 immunized them from liability for unfair or deceptive practices. More often, however, Section 230 may operate as a practical barrier to bringing an action against a company in the first place. The Administration’s proposal would remove this barrier.
- Increasing platforms’ “responsibility” for third-party content. Section 230 currently provides that a person or entity that is “responsible, in whole or in part, for the creation or development of information” is not entitled to Section 230 immunity for that content. The Administration’s proposal would define “being responsible” to include, among other things, soliciting, commenting upon, funding, or altering information provided by another person or entity. Some of this is arguably consistent with existing Section 230 case law, and Attorney General Barr’s letter asserts that the definition is intended to deny immunity to platforms that “actively choose to modify or encourage” harmful material. The proposal’s language, however, appears far more expansive: “commenting” on content, for instance, could include labeling content as harmful or inappropriate for certain audiences.
- Terms of service requirements. Finally, the Administration proposes to limit platforms’ flexibility to take down content by building transparency and equal-treatment requirements into Section 230’s “good faith” standard. Currently, companies may take down content that falls into certain enumerated categories – such as obscene or excessively violent material – or that is “otherwise objectionable.” The proposal adds some new specific categories but eliminates the catch-all “otherwise objectionable” option. Specifically, to remove information in “good faith,” platforms would need to:
- “State plainly” their content-moderation practices in their terms of service;
- Take down or restrict access to content in a manner that is consistent with their content-moderation policies;
- Take down or restrict access to all content that is “similarly situated”; and
- Give the content provider notice of a restriction and an opportunity to respond.
Although the main focus of the proposal appears to be large social media companies, such as Twitter and Facebook, it sweeps more broadly and could affect any company that allows users to post content on its site. We will continue to monitor developments relating to Section 230.