TWG: Emphasize Transparency and Accountability of Digital Platforms

Imposing and enforcing transparency and accountability requirements on digital platforms would be less intrusive and less threatening to free speech rights than strict content regulation, according to a new working paper from the Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression.

The paper, “Transparency Requirements for Digital Social Media Platforms,” by Mark MacCarthy of Georgetown University, advocates a tiered system of transparency. MacCarthy departs “from the movement in many countries for content regulation and mandated takedowns, preferring instead to focus on creating a balanced and clear legal structure for disclosure that can help to restore public trust in digital platforms and provide assurances that they are operating in the public interest.”

In this third and final set of working papers, members of the Transatlantic Working Group (TWG) also examine the use of artificial intelligence in content moderation, as well as dispute-resolution mechanisms for digital platforms such as social media councils and e-courts. The papers were developed at the group’s session in November 2019, hosted by the Rockefeller Foundation Bellagio Center in Como, Italy.

“Now more than ever, collaboration is essential between government, the public, and tech companies to ensure the resilience of democracy throughout the information ecosystem,” said TWG co-chair Susan Ness. “The Bellagio session capped an intensive but gratifying yearlong journey for the Transatlantic Working Group toward forging a framework to achieve that end.”

Launched in early 2019, the TWG seeks to reduce online hate speech, terrorist extremism, and viral deception while protecting freedom of expression. The group is a project of the Annenberg Public Policy Center (APPC) of the University of Pennsylvania, in partnership with the Annenberg Foundation Trust at Sunnylands and the Institute for Information Law (IViR) of the University of Amsterdam. It is co-chaired by Ness, a distinguished fellow of APPC and a former member of the Federal Communications Commission, and by Marietje Schaake, president of the CyberPeace Institute and a former Member of the European Parliament. Additional support has been provided by the Embassy of the Kingdom of the Netherlands.

Download the TWG co-chairs’ report from the Bellagio session.

The group’s final report will be published this spring.

Working papers: Online harms, disinformation, and artificial intelligence

The set of working papers and authors includes:

  • Transparency Requirements for Digital Social Media Platforms: Recommendations for Policy Makers and Industry: As noted above, MacCarthy outlines a transparency framework for social media platforms, “a balanced and clear legal structure for disclosure,” as preferable to a focus on content regulation and the mandatory removal of objectionable content. He argues that industry should be encouraged to adopt these transparency measures proactively rather than wait for legislation to be enacted. Download.
    • Mark MacCarthy, Georgetown University
  • Artificial Intelligence, Content Moderation, and Freedom of Expression: Technology is not neutral, as those who build and program systems inevitably bake in certain values. Developments in computing such as artificial intelligence (AI) and machine learning can serve as both a positive and a negative force for human rights and fundamental freedoms. This paper serves as a primer on the technological limitations and pitfalls of the tools collectively known as, or mistaken for, artificial intelligence. Download.
    • Emma Llansó, Center for Democracy and Technology
    • Joris van Hoboken, Institute for Information Law, Vrije Universiteit Brussel
    • Paddy Leerssen, Institute for Information Law, University of Amsterdam
    • Jaron Harambam, Institute for Information Law, University of Amsterdam
  • Dispute Resolution and Content Moderation: Fair, Accountable, Independent, Transparent, and Effective: Many social media companies have internal procedures for appeals about content that may have been wrongfully removed, or about offensive content that was not removed after being reported by other users. Such internal mechanisms do not meet the essential rule-of-law standards of a good dispute settlement system: fairness, accountability, independence, transparency, and effectiveness (FAITE). This paper argues for the use of social media councils or other independent structures for policy advice or dispute resolution. Download.
    • Heidi Tworek, University of British Columbia
    • Ronan Ó Fathaigh, Institute for Information Law, University of Amsterdam
    • Lisanne Bruggeman, Institute for Information Law, University of Amsterdam
    • Chris Tenove, University of British Columbia

The full set of papers and the co-chairs’ report may be downloaded as a single PDF from the TWG site on IViR or from APPC.

The Transatlantic Working Group consists of more than two dozen current and former officials from government, legislatures, the tech industry, academia, journalism, and civil society organizations in North America and Europe. Its members search for common ground and best practices to reduce online hate speech, terrorist extremism, and viral deception without harming freedom of expression.

