How Social Media Is and Is Not Bound by the First Amendment


by Jay Barmann
Originally appeared on Inside Social, November 19, 2018

When President Trump blocks his critics from following him on social media, is he violating the First Amendment? What about when Facebook bans someone like Alex Jones? Does Jones have a First Amendment right to spout whatever conspiracies he likes in what constitutes the modern public square? These are all questions being tackled by the Knight First Amendment Institute at Columbia University, and ones that the institute's executive director, Jameel Jaffer, discusses in a new Recode podcast with Kara Swisher.

The same questions have been bubbling up periodically — if not weekly — in op-ed pieces for several years. Conservative and alt-right figures who have embraced social media have often drawn public ire, and have decried violations of their First Amendment rights when platforms like Twitter suspend their accounts for terms-of-use violations. The troubling question of "de-platforming" an entire platform, like the social media site Gab — which is popular with alt-right figures and fans of Jones, and was where Pittsburgh synagogue shooter Robert Bowers published a final anti-Semitic post — arose two weeks ago, when GoDaddy cut off domain services for Gab. Seattle-based registrar Epik stepped in to take Gab on, and Epik's CEO Rob Monster issued a statement saying that "de-platforming is digital censorship," and that doing such a thing to a "haven of free speech is not about left or right."

But this is just one of many ways in which the scope of the First Amendment is being challenged as modern communications platforms make speech increasingly public. Two decades ago, the average person who was not a journalist did not have the means to spread misinformation, incitements to violence, or hate speech across the globe with a few keystrokes. Now they do.

Two decades ago, a President of the United States couldn't seamlessly transition from private to public life with the same Twitter account, using it both to comment on television news and to conduct international diplomacy, all while trying to discredit, demean and silence his critics.

Jaffer says that the "privatization of the public square" is one issue the Knight First Amendment Institute is actively seeking to address. The institute itself grew out of ten years of talks that Jaffer had with Columbia's president Lee Bollinger about the ways in which technology is increasingly complicating legal precedents that were set 50 and 60 years ago. As it stands, the CEOs of Facebook, Google, Twitter, Reddit, and various ISPs have within their arsenals the power to silence virtually anyone they choose. In discussing the de-platforming controversy over alt-right site The Daily Stormer, Cloudflare CEO Matthew Prince wrote, "We need to have a discussion around this, with clear rules and clear frameworks. My whims and those of Jeff [Bezos] and Larry [Page] and Satya [Nadella] and Mark [Zuckerberg], that shouldn’t be what determines what should be online."

Another issue Jaffer cites, on which his colleague Tim Wu has just written a paper, is that the First Amendment is threatened by the online culture of harassment that social media has created. For anyone who wants to silence someone else's speech, social media offers a bevy of new virtual tools for making that person's life miserable. As Jaffer puts it, "Instead of having your henchmen go to somebody’s door and threaten them, you just harass them on social media, right?"

Until the courts begin wrapping their collective heads around social media's new role in society, many of these decisions are going to be up to CEOs — and when they get them wrong, they will face public outrage, media floggings, and potentially more federal regulation.

Uncomfortable with having to make such decisions, Mark Zuckerberg recently reiterated an idea he floated earlier this year: a "Supreme Court" of Facebook that would be wholly independent, and would serve as an arbiter for all questions related to what constitutes censorship, free speech and violations of community standards. Given that our own, actual Supreme Court still struggles with these questions, I'd say he has his work cut out for him.

Update: On January 28, 2019, Facebook announced it is moving forward with plans to establish an “oversight board” that will have the final say in content moderation disputes.