17 Feb: Platforms and the public sphere

Topics

  • Transition from an analog to a digital public sphere, with speech and associational rights regulated by companies; virality over veracity in online discourse; tensions between quantity and quality of information; implications for democracy
  • Business model concerns, including new conceptions of monopoly and market power of digital platforms, as well as government efforts to promote market competition (e.g., antitrust regulation)
  • Technology behind efforts to regulate speech in online communities, including content moderation practices, frontiers/innovations in speech regulation
  • Comparative analysis of how global platforms operate in diverse communities with different speech traditions and politics

Content moderation

Content moderation is a really hard problem.

  • Obvious to us in tech, but not to ordinary citizens
  • What makes speech valuable? What about technology that changes that?
  • The rise of private superpowers not beholden to the First Amendment: Facebook's and Twitter's policies on expression are more consequential than France's.
  • There are more options than a binary show/don't show when it comes to content moderation
    • Subject content to fact-checking, warning messages, lower its reach through visibility filtering, user timeouts, etc.
    • Whether content exists is different from whether content is seen
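The spectrum above can be sketched as a toy decision ladder. This is a hypothetical illustration (all names and conditions are invented here, not any platform's actual pipeline); the point is that intermediate actions separate "content exists" from "content is seen":

```python
from enum import Enum, auto

class ModAction(Enum):
    """Hypothetical moderation actions beyond a binary show/don't show."""
    ALLOW = auto()
    FACT_CHECK_LABEL = auto()   # attach a fact-check or warning message
    VISIBILITY_FILTER = auto()  # keep the post up but reduce its reach
    USER_TIMEOUT = auto()       # temporarily restrict the author
    REMOVE = auto()             # the traditional binary outcome

def choose_action(violates_policy: bool,
                  disputed_claim: bool,
                  borderline: bool) -> ModAction:
    """Illustrative ladder: removal is a last resort; softer actions
    govern visibility rather than existence."""
    if violates_policy:
        return ModAction.REMOVE
    if disputed_claim:
        return ModAction.FACT_CHECK_LABEL
    if borderline:
        return ModAction.VISIBILITY_FILTER
    return ModAction.ALLOW
```

Even in this toy form, most paths leave the content online, which is exactly the "more options than a binary" point.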

Salesforce has taken unspecified action to stop the RNC from sending messages that could incite violence. How far up or down the stack should content moderation go? Telecom companies? Telecom companies are already supposed to be content moderators with regard to spam and robocalls…

"Awful but lawful" content must still be addressed, since "lawful" is a very low bar for content moderation.

Community standards on social media have an unprecedented scope.

When you can't define the science behind the “art” of moderation, how do you move forward?

De-platforming, banning, suspending

We’ve seen three approaches to platform restrictions:

  • Banning an individual from a platform (e.g., Trump banned from Twitter, FB)
  • Removing an app from the App Store or Google Play Store (e.g., Parler app removed from both stores)
  • Suspending hosting for an app/social network (e.g., AWS cutting Parler’s cloud hosting)

Are these approaches equally valid? If not, why not?

Deplatforming Parler was a deplatforming of a platform. Apple asked Parler to provide a content moderation plan within 24 hours before kicking them off. Will Apple ask that of others? Is Apple now a content moderation reviewer?

Should we think of AWS as similar to or different from the major social media platforms with respect to the power/right they should have to deny access to their services?

How should the app store or cloud service provider determine whether or not to allow an app? Should it be based on the app’s stated Terms of Service or how the app is actually being used?

International human rights standards on incitement differ from existing content moderation policies… and international human rights standards value freedom of expression.

Will fringe audiences radicalize further as the large public forums online disallow their conversations?

In a polarized society, people will look to the individuals with the most power and influence and thus the greatest ability to cause harm and further division.

Government actors

Is it justifiable for a platform to handle accounts of public officials differently than those of the general public? Should this policy be different in democracies than it is in non-democracies? How so?

  • Political officials who have an incentive to stay in power, even when not democratically elected, should be held to the same standards as citizens, with no special treatment (fact-checking, rule enforcement, etc.)

Should Twitter join Facebook’s review board? Mark shouldn’t decide things alone, and neither should Jack.

Does a review board remove accountability from the company itself? Is that a good thing? The review board has a different incentive structure than the business does. The review board is an appeals court, not a decision-making body. Is public transparency in the board (its members, its budget, its process) even more important than what the board decides?

What are decisions that courts should make vs. decisions that companies should make?

Historically, the deliberations and documents behind major decisions have been internal, which means there is a lack of public transparency and oversight.

Reading list

Supplementary reading