The Christchurch Call is the tip of a bigger iceberg

Although curbing the proliferation of extremist and terrorist views through social media and other online platforms is understandably a priority, there are many more regulatory issues that also need addressing, writes Dr Peter Thompson from Victoria University of Wellington's Media Studies programme.

It is no mean feat of statesmanship to bring together 17 governments and the major global tech companies at an international summit. New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron deserve credit for initiating the Christchurch Call summit and coming away with a multilateral pledge to implement measures controlling online extremist/terrorist content.

However, as Ardern herself has readily acknowledged, the pledge is non-binding and represents a starting point, not a resolution of all the issues the live-streaming of the Christchurch mosque attacks has brought into focus. Although curbing the proliferation of extremist/terrorist views through social media and other online platforms is understandably a priority in the wake of the attacks, it is really the visible tip of a much larger regulatory iceberg.

The phenomenal growth of social media and digital intermediary companies has radically altered the shape and structure of the media ecology, undermined traditional business models and reshaped the way we interact socially (and sexually), go shopping and consume news and entertainment. Some of these changes are positive, but they also bring new risks and policy problems the current regulatory frameworks are ill-equipped to deal with.

Publishers or pipelines?

Although there have been calls to regulate digital intermediaries and platform operators as if they were publishers of content, this is far from straightforward. Media companies such as Facebook and Google have come to play an increasingly influential role in enabling content discovery and sharing. Their online architectures and algorithms are functionally distinct from both the telecommunication infrastructures (pipes) and the production and publishing of the content that gets circulated through online platforms. Such operations are not easily accommodated within existing regulatory frameworks for either telecommunications or content standards.

Previous attempts to address regulatory gaps, notably Labour’s 2008 Review of Regulation and National’s 2015 Exploring Digital Convergence initiatives, have not come to fruition because of changes of government and shifting policy agendas. The Christchurch Call has opened up a rare opportunity to bring government, industry and civil society together to look for solutions to a range of policy issues, including:

- The potential for digital intermediary algorithms to engender filter-bubbles or echo chambers of self-reinforcing extremism.

- Privacy issues related to the collection and sharing of personal data without user consent.

- The proliferation of fake news/propaganda targeted through social media using personal data in order to manipulate voting behaviour (e.g. Cambridge Analytica’s dubious role in Donald Trump’s presidential campaign).

- Digital intermediaries’ dominance over the means of online content discovery and their capture of advertising revenue from the traditional media outlets which actually provide the content.

Beware of social media companies bearing gifts

Multilateral solutions are obviously desirable, especially when dealing with global media companies. However, it would be wrong to assume only supra-national level agreements are worth pursuing for a small country like New Zealand. On the contrary, as Ardern has duly acknowledged, the principles and actions agreed in the Christchurch Call Pledge still need to be articulated and implemented at a domestic level.

The real acid test will come when regulatory measures are proposed that the digital intermediaries oppose. It is one thing to get them to sign up to non-binding aspirations, but quite a different matter to implement regulatory interventions that protect the public interest but restrict their operations or reduce profits. For example, Facebook has been criticised for dragging its heels over the removal of right-wing extremist material in the UK and the US, primarily because such content was generating lucrative online traffic. Although Facebook has recently moved to ban white separatist/nationalist content, YouTube and Twitter have yet to follow its lead. There is clearly a tension between shareholder and civic responsibilities here.

Nevertheless, not all regulatory measures are unwelcome. In a recent Washington Post commentary, Facebook’s Mark Zuckerberg concurred with criticisms that Facebook had too much control over online speech and suggested regulatory guidance would be welcome. This is partly self-motivated, however: independent regulation would largely absolve Facebook of responsibility for adjudicating what sorts of material violate community standards and of having to constantly justify decisions to remove or allow content. Regulation could also help reduce legal ambiguities over Facebook’s practices and thereby reduce its exposure to regulatory risk and prosecution.

Scoping the regulatory response

The willingness of social media and digital intermediaries to engage in discussions about regulation is positive, but their participation in forums like the Christchurch Call will also reflect their interest in ensuring policy options that threaten profits are quarantined in the ‘too hard’ basket. It is therefore incumbent on policy makers to take account of civic and democratic priorities and not allow vested interests to determine the scope of any ensuing deliberations.

Developing functional policy responses therefore requires consideration of four key issues:

1. Clear definition of the policy issue and objectives (generic calls to regulate all social media are obviously too vague, but focusing too narrowly on, say, live-streaming of terrorism could overlook the underlying structural issues in the media ecology).

2. Interventions need to focus on the appropriate part of the value-chain (e.g. vetting access to live-streaming in advance of distribution versus monitoring content post-distribution).

3. Designation of an appropriate regulatory agent (industry may self-regulate its own content well enough but imposing financial penalties for violations might require an independent regulator with statutory powers).

4. Workable, proportionate mechanisms capable of delivering the desired outcome while avoiding undesirable or unintended outcomes (e.g. if it were decided that references to the Christchurch terrorist were unacceptable on social media and an algorithm were designed to trawl through online exchanges deleting such material, this could remove legitimate discussion of how to prevent terrorism and shut down the pages of anyone called Tarrant, as the sketch below illustrates).
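To make that false-positive problem concrete, here is a minimal, hypothetical sketch of a blunt keyword filter. It is not any platform’s actual moderation system; the blocked term, sample posts and function name are invented for illustration only. The point is simply that naive matching removes legitimate material along with the targeted content.

```python
# Hypothetical sketch only: a naive keyword-based takedown filter.
# Not any platform's real moderation system; illustrates how blunt
# matching over-removes content.

BLOCKED_TERMS = {"tarrant"}  # assumption: the banned reference is matched by surname alone


def should_remove(post: str) -> bool:
    """Flag a post for deletion if it contains any blocked term."""
    words = post.lower().split()
    return any(term in words for term in BLOCKED_TERMS)


posts = [
    "Glorifying the Tarrant attack",                     # intended target
    "How do we stop people like Tarrant radicalising?",  # legitimate debate, wrongly removed
    "Congratulations to Dr Sarah Tarrant on her award",  # unrelated person, wrongly removed
]

for post in posts:
    print(should_remove(post), "-", post)  # all three are flagged for removal
```

A workable mechanism would need something more context-sensitive than this, alongside human review and an appeals process, if the intervention is to remain proportionate.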

A fuller discussion of possible policy responses is covered in the Better Public Media discussion paper Beyond the Christchurch Call.

Read the original article on Newsroom.