JASON FYK: Publisher vs Platform – Section 230’s ‘As Applied’ Problem

Justice Thomas is correct, it is 'odd' (i.e., disharmonious) to both impose and eliminate distributor liability, in the very same statute

Have you ever found yourself in a heated debate over whether or not “a Platform” can be treated as “a Publisher” of someone else’s content?

In more technical terms: under 47 U.S.C. Section 230, can an Interactive Computer Service Provider that actively moderates third-party information (i.e., is responsible, in part, in a secondary publishing capacity, for the creation or development of that information) be treated as an Information Content Provider that knowingly distributed potentially harmful third-party content? That question is before the Supreme Court right now in Gonzalez vs. Google.

If Section 230 is applied correctly (i.e., harmoniously, reconcilably, and as a whole text), the answer is surprisingly: YES, albeit with a caveat.

A Platform is generally understood to be a passive distributor of third-party information, and a Publisher is generally understood to be an active content moderator or editor. A Publisher makes publishing decisions, for example to allow, disallow, or edit content, whereas a Platform is a mere conduit of third-party information. In other words, a Platform unknowingly (i.e., passively) disseminates information and a Publisher knowingly (i.e., actively) distributes information. These differences – knowing vs. unknowing, passive vs. active – are very important in understanding Section 230’s distributor liability.

Justice Thomas recently noted:
“… Section 502 of the Communications Decency Act makes it a crime to ‘knowingly . . . display’ obscene material to children, even if a third party created that content. … It is odd to hold, as courts have, that Congress implicitly eliminated distributor liability in the very Act in which Congress explicitly imposed it.”

Justice Thomas is correct, it is “odd” (i.e., disharmonious) to both impose and eliminate distributor liability, in the very same statute (i.e., the CDA). To understand and apply Section 230 protections correctly, it all comes down to whether the platform knowingly (i.e., actively) or unknowingly (i.e., passively) distributes the materials at issue.

One of the original authors, Ron Wyden, said that he wanted to provide platforms with a “sword and a shield.” A shield is a “piece of personal armor” used “to provide passive protection,” and a sword is “intended for manual [i.e., active] cutting or thrusting.” One is a definitional, defensive protection and the other is an authoritative, offensive protection. Using a sword and a shield as an example is a great way to explain the meaningful difference between passively (i.e., unknowingly) distributing and actively (i.e., knowingly) distributing content.

Section 230 has two distinct protections: a defensive protection and an offensive protection. 230(c)(1) is the defensive “Treatment of Publisher or Speaker” protection (i.e., the shield) and 230(c)(2) is the offensive “Civil Liability” protection (i.e., the sword). Naturally, a “shield” provides passive (i.e., platform) protection from the actions of another (e.g., “the publisher”), while the “sword” provides limited authority for the provider’s or user’s own actions taken against another (e.g., as “a publisher” to restrict materials).

It’s generally accepted that a platform should not be responsible for the publishing decisions of “the publisher” (i.e., another Information Content Provider) – the express purpose of 230(c)(1) – and that a platform should also have limited authority (i.e., liability protection) for its own decisions to restrict another Content Provider’s harmful material in good faith – the express purpose of 230(c)(2). Simplified: (c)(1) is protection from the actions of another – the “shield” – and (c)(2) is limited protection for the platform’s own actions taken against another in good faith – the “sword.”

Section 230(c)(1) must naturally be the “shield,” and it applies to passive distribution – when a platform unknowingly hosts potentially harmful third-party content.

230(c)(1) specifically reads:
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

If the provider or user cannot be treated as “the other publisher,” it cannot, by definition, be responsible for the other publisher’s publishing decisions.

Section 230(c)(2) must naturally be the sword. It authorizes the platform itself to act as “a secondary publisher” (i.e., a moderator) and protects the platform when it knowingly “considers” (i.e., knowingly acts upon and distributes) “the other publisher’s” materials in good faith, or enables a third party to do the same.

230(c)(2) specifically reads:
“No provider or user of an interactive computer service shall be held liable on account of—
(A)any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B)any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).”

Simply put, 230(c)(1) “shields” the platform from being treated as “the publisher” (i.e., as someone else), and 230(c)(2) protects the platform from liability when it uses the “sword” in good faith to restrict harmful content – at least in theory. Most people understand Section 230 this way, because it is the most natural way to read the statute, but that is not at all how the courts have actually applied 230(c)(1). Courts and the vast majority of people believe you cannot treat a platform as “a publisher” in the general sense. That’s wrong. The statute does not say they cannot be treated as “a publisher”; it specifically says “the publisher,” and there is a reason for that.

In 2009, the Ninth Circuit incorrectly held that Section 230(c)(1) “shields from liability all publication decisions, whether to edit, to remove, or to post, with respect to content generated entirely by third parties.”

“All publication decisions” includes what 230(c)(2) says: “any action… taken… to restrict material… considered… objectionable.” If 230(c)(1) applies to “all publication decisions,” that includes all publication decisions described in 230(c)(2). If the Ninth Circuit were somehow correct, 230(c)(2) would therefore be redundant – but it cannot be correct, because statutes are not written to be duplicative.

We have to ask: why is Section 230(c)(1) being misapplied to “all publication decisions”? In my lawsuit (Fyk vs. Facebook), the Ninth Circuit used what many call the Barnes 230(c)(1) immunity test.

The Barnes Test reads: “Pursuant to § 230(c)(1) of the CDA, 47 U.S.C. § 230(c)(1), “[i]mmunity from liability exists for ‘(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.’”
Simplified:
  1. They must be a platform.
  2. They are being treated as “a publisher or speaker.”
  3. And the information was provided by another.
Again, Section 230(c)(1) does not say “a publisher or speaker”; it specifically reads “the publisher or speaker.” The Barnes test is textually inaccurate!

For example, in my lawsuit, I was not treating Facebook as “the publisher” of my third-party content, because I was “the publisher” of my content. I was treating Facebook as “a secondary publisher” (amongst other things) of my content, because they acted as “a publisher” (i.e., used the sword against me). I was never holding Facebook accountable for my publishing decisions; I was holding them accountable for their own publishing actions. Courts are misapplying (c)(1)’s passive protection when the shield is being actively used as an alternative (i.e., secondary) offensive weapon, rendering the purpose of the sword superfluous. (For a visual representation of this circular process, see: Section 230’s Irreconcilable Loop below.)
 


To be clear, a platform cannot be treated as “the publisher” – the statute does not say “a publisher” – so my original statement is, in fact, correct. A platform can be treated as “a publisher,” the caveat being that it must have engaged in some type of active, secondary publishing decision. In other words, it must have wielded the sword in some way.

Even Eric Goldman, who wrote an absolutely slanderous article about me, inadvertently identified Section 230’s misapplication problem: “This (Fyk vs. Facebook) is an easy Section 230 dismissal. Yet again, the court relies on 230(c)(1) for facts fitting the 230(c)(2) paradigm.” He’s correct! “Yet again, the court relie[d] on 230(c)(1) for facts fitting the 230(c)(2) paradigm.” 230(c)(1) was misapplied to what should have been a 230(c)(2) case. I got robbed of my day in court because the courts have misread and misapplied 230(c)(1).

Courts have tried to reconcile this redundancy, but have only further confused things. In my case (Fyk vs. Facebook), the Ninth Circuit specifically held: “Thus, even those who cannot take advantage of subsection (c)(1), perhaps because they developed, even in part, the content at issue can take advantage of subsection (c)(2).”

There’s one huge problem with their determination: if “all publication decisions” are immune from liability under 230(c)(1), then “develop[ment], even in part” is still a “publication decision,” which would therefore already be immune under 230(c)(1). The Ninth Circuit didn’t fix anything; they just quoted Facebook, dismissed my case, and messed up Section 230 even more.

What’s even more concerning is how often this is happening. Jeff Kosseff, another so-called Section 230 “expert,” noted: “[t]he vast majority of Section 230-related dismissals involve Section (c)(1), including decisions not only to keep material up, but to take it down.” In other words, the “vast majority” of content moderation cases have been dismissed under the wrong statutory subsection! You may be wondering how large that “vast majority” is. Kosseff went on to say: “In a 2020 review of more than 500 Section 230 decisions, the Internet Association found only 19 that involved the application of Section 230(c)(2).” Less than 4% of content moderation cases have properly been analyzed under Section 230(c)(2)’s measure of “good faith.” Almost all of Section 230’s real problem rests in the misapplication of 230(c)(1) to 230(c)(2) cases, and no one is even aware that it’s happening! It’s disastrous!

The definite article “the” in 230(c)(1) has a very important function! It is used before the noun to “denote particular, specified persons or things.” “The publisher,” in 230(c)(1), refers to “the” particular, specified publisher of “the” third-party content. Simply, “the publisher” refers to the third-party Content Provider. If the platform acts upon content, it becomes, in part, a secondary Information Content Provider (i.e., it is actively creating or developing that information in part – active content consideration) that is now knowingly distributing the content. 230(c)(1) can only logically apply when the platform is uninvolved in the content consideration (i.e., it unknowingly distributes third-party content).

If the courts simply gave the definite article “the” effect (i.e., proper meaning and purpose), 230(c)(1) would once again prevent the platform from being treated as someone else – “the” specific publisher – and it would also prevent the “vast majority” of cases from being wrongly dismissed, like mine was. Platforms could then finally be treated as “a publisher” when they act as a secondary publisher (i.e., an active distributor), and their actions would finally be put to a measure of “good faith”! SWEET BABY RAYS, we’ve got the solution!

In sum, if the courts gave the word “the” proper effect, 230(c)(1) would only apply to passive distribution, 230(c)(2) would only apply to active distribution, and as a result, 230(c)(1) would no longer be misapplied to (c)(2) moderation cases. Section 230 would no longer eliminate “all distributor liability” and Section 230 would once again be harmonious with Section 502. Section 230 would finally be applied harmoniously! Now, it’s just up to the courts to get it right!

It should also be noted that any legislation that does not address or resolve both Section 230(c)(1)’s “as applied” problem and Section 230(c)(2)’s “on its face” problem is, in effect, worthless! “The vast majority of cases” will continue to be dismissed under the wrong statutory subsection, and 230(c)(1) and 230(c)(2) will continue to be exploited for private gain.

For more information and to understand Section 230’s “on its face” problem (i.e., its unconstitutionality), please read: Private or State Action? – Section 230’s Achilles’ Heel.

With the help of my attorneys, Jeff Greyber and Constance Yu, I filed a constitutional challenge to Section 230 (see Fyk vs. United States), and with the help of David Morgan, I’ve re-authored Section 230 to clarify the text and to comport with the Constitution (see the Online Freedom Act).
 

Jason Fyk is the founder of the Social Media Freedom Foundation and the Online Freedom Caucus, and co-author of the Online Freedom Act.

 
Help us fight to stop our constitutional freedoms from being eroded!
Please support the Social Media Freedom Foundation!
Because we can fix Section 230!
 
 
