NO matter what the outcome of the US election tomorrow, there will be one clear loser: Facebook. The social media giant knows that irrespective of electoral developments, its role will be criticised, subjected to further review, and possibly targeted by antitrust legislation. The fact that a social media platform is one of the biggest stories of this surreal US election holds important lessons for aspiring democracies everywhere.
The concern is that Facebook has not acted quickly or effectively enough to ensure the platform does not mar electoral integrity. Tap dancing along the line between platform and publisher, neutrality and editorial judgement, Facebook is open to criticism from all. Democrats say it has spread misinformation and facilitated voter suppression; the Republicans accuse it of censorship.
Facebook’s problems are not limited to the US. Its Indian head of public policy resigned last week following allegations that she prevented the banning of BJP leaders for posting anti-Muslim hate speech. A Delhi government peace committee had previously questioned the platform’s role in fuelling earlier communal riots in the city.
There is no winning for social networks. Clamp down on hate speech and they’re branded anti-free speech; stem misinformation and they’re engaging in censorship; comply with local government regulations and they’re facilitating authoritarianism; protect free expression and they’re promoting blasphemy. This problem stems from governments’ and organisations’ cynical response to the growing power of social networks. It has not been to introduce checks and balances, but rather to ask: how can I make this work for me?
The Pakistani state has been grappling with this question through most of this year. It is pushing ahead with the Citizen Protection (Against Online Harm) Rules 2020, despite pushback from civil society, media and international tech giants. Last month, the cabinet approved requirements for tech companies to block content within 24 hours of receiving an official request, prevent live-streaming of ‘objectionable’ content, label content ‘false’ on the basis of government directives, and maintain data servers within the country. In other words, under the guise of regulating new media platforms, the government is trying to control all digital content.
Neck-deep in political and communal quagmires, social media platforms try to claim neutrality, or claim to be on the side of ‘freedom’ or ‘openness’. In truth, they are on the side of profits. Facebook, for example, faces allegations by digital researchers that it profits most from hyper-polarising content because it generates the deep engagement that advertisers are willing to pay for.
Pakistan’s efforts at digital control are so far being stymied by the Asia Internet Coalition. Earlier this year, the coalition threatened to exit Pakistan if the restrictive legislation was introduced; last month it again expressed concerns about the government’s proposed approach to social media policing and censorship. These honourable actions are likely enabled by the fact that Pakistan is a significant but not make-or-break market for international tech companies; their exit would primarily be our own economic and social loss. Meanwhile, Pakistani netizens are at their mercy.
In bigger markets, social media platforms’ behaviour is largely governed by commercial considerations. Facebook’s stronger policies on hate speech, voter interference and misleading political advertising for the US presidential elections probably stem from Unilever’s decision this summer to suspend advertising on the platform. The multinational company was responding to the ‘Stop Hate for Profit’ campaign, which called on companies not to advertise with platforms that were not proactively stemming misinformation and hate speech in the context of the Black Lives Matter movement, and beyond.
But is it any better to have private corporations, rather than social media platforms themselves, act as the arbiters of democratic free speech? Ultimately, the only protection against the cynical co-option of social media platforms (and other new technologies that will inevitably follow) by all stakeholders — whether states, corporations, right- or left-wing movements — is a throwback to good, old-fashioned democratic tenets: free but regulated media, functioning independent courts, law enforcement, transparent public policy.
In Germany, Facebook and other social media platforms are required to report criminal or infringing content (which would include hate-inciting or defamatory content) to the police. Platforms are fined when they fail to do so. Beyond that, any review of or punitive action against the online transgression is handled through routine investigation and legal due process. The governance of digital matters need not be exceptional, just robustly democratic. And for that, we need to shift the focus from platforms to the societies and systems in which they flourish.