Section 230 of the Communications Decency Act
47 U.S. Code § 230 - Protection for private blocking and screening of offensive material
(a) Findings
The Congress finds the following:
(1) The rapidly developing array of Internet and other interactive computer services available to individual Americans represent an extraordinary advance in the availability of educational and informational resources to our citizens.
(2) These services offer users a great degree of control over the information that they receive, as well as the potential for even greater control in the future as technology develops.
(3) The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.
(4) The Internet and other interactive computer services have flourished, to the benefit of all Americans, with a minimum of government regulation.
(b) Policy
It is the policy of the United States—
(1) to promote the continued development of the Internet and other interactive computer services and other interactive media;
(2) to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;
(3) to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;
(4) to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material; and
(5) to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer.
(c) Protection for “Good Samaritan” blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
(d) Obligations of interactive computer service
A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.
(e) Effect on other laws
(2) No effect on intellectual property law
Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.
(4) No effect on communications privacy law
Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.
(5) No effect on sex trafficking law
Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—
(A) any claim in a civil action brought under section 1595 of title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title;
(B) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 1591 of title 18; or
(C) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 2421A of title 18, and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant’s promotion or facilitation of prostitution was targeted.
(f) Definitions
As used in this section:
(1) Internet
The term “Internet” means the international computer network of both Federal and non-Federal interoperable packet switched data networks.
(2) Interactive computer service
The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
(3) Information content provider
The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.
(4) Access software provider
The term “access software provider” means a provider of software (including client or server software), or enabling tools that do any one or more of the following:
(A) filter, screen, allow, or disallow content;
(B) pick, choose, analyze, or digest content; or
(C) transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content.
(June 19, 1934, ch. 652, title II, § 230, as added Pub. L. 104–104, title V, § 509, Feb. 8, 1996, 110 Stat. 137; amended Pub. L. 105–277, div. C, title XIV, § 1404(a), Oct. 21, 1998, 112 Stat. 2681–739; Pub. L. 115–164, § 4(a), Apr. 11, 2018, 132 Stat. 1254.)
DEPARTMENT OF JUSTICE’S REVIEW OF SECTION 230 OF THE COMMUNICATIONS DECENCY ACT OF 1996
Areas Ripe For Section 230 Reform
The Department identified four areas ripe for reform:
1. Incentivizing Online Platforms to Address Illicit Content
The first category of potential reforms is aimed at incentivizing platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation.
a. Bad Samaritan Carve-Out. First, the Department proposes denying Section 230 immunity to truly bad actors. The title of Section 230’s immunity provision—“Protection for ‘Good Samaritan’ Blocking and Screening of Offensive Material”—makes clear that Section 230 immunity is meant to incentivize and protect responsible online platforms. It therefore makes little sense to immunize from civil liability an online platform that purposefully facilitates or solicits third-party content or activity that would violate federal criminal law.
b. Carve-Outs for Child Abuse, Terrorism, and Cyber-Stalking. Second, the Department proposes exempting from immunity specific categories of claims that address particularly egregious content, including (1) child exploitation and sexual abuse, (2) terrorism, and (3) cyber-stalking. These targeted carve-outs would halt the over-expansion of Section 230 immunity and enable victims to seek civil redress in causes of action far afield from the original purpose of the statute.
c. Case-Specific Carve-Outs for Actual Knowledge or Court Judgments. Third, the Department supports reforms to make clear that Section 230 immunity does not apply in a specific case where a platform had actual knowledge or notice that the third-party content at issue violated federal criminal law, or where the platform was provided with a court judgment that the content is unlawful in any respect.
2. Clarifying Federal Government Enforcement Capabilities to Address Unlawful Content
A second category of reforms would increase the ability of the government to protect citizens from harmful and illicit conduct. These reforms would make clear that the immunity provided by Section 230 does not apply to civil enforcement actions brought by the federal government. Civil enforcement by the federal government is an important complement to criminal prosecution.
3. Promoting Competition
A third reform proposal is to clarify that federal antitrust claims are not covered by Section 230 immunity. Over time, the avenues for engaging in both online commerce and speech have concentrated in the hands of a few key players. It makes little sense to enable large online platforms (particularly dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not on third-party speech.
4. Promoting Open Discourse and Greater Transparency
A fourth category of potential reforms is intended to clarify the text and original purpose of the statute in order to promote free and open discourse online and encourage greater transparency between platforms and users.
a. Replace Vague Terminology in (c)(2). First, the Department supports replacing the vague catch-all “otherwise objectionable” language in Section 230(c)(2) with “unlawful” and “promotes terrorism.” This reform would focus the broad blanket immunity for content moderation decisions on the core objective of Section 230—to reduce online content harmful to children—while limiting a platform’s ability to remove content arbitrarily or in ways inconsistent with its terms of service simply by deeming it “objectionable.”
b. Provide Definition of Good Faith. Second, the Department proposes adding a statutory definition of “good faith,” which would limit immunity for content moderation decisions to those done in accordance with plain and particular terms of service and accompanied by a reasonable explanation, unless such notice would impede law enforcement or risk imminent harm to others. Clarifying the meaning of “good faith” should encourage platforms to be more transparent and accountable to their users, rather than hide behind blanket Section 230 protections.
c. Explicitly Overrule Stratton Oakmont to Avoid Moderator’s Dilemma. Third, the Department proposes clarifying that a platform’s removal of content pursuant to Section 230(c)(2) or consistent with its terms of service does not, on its own, render the platform a publisher or speaker for all other content on its service.