The Age of Unregulated Social Media Is Over

Now, how to avoid unintended consequences

In the United States, discussion of regulating technology companies such as Google, Facebook and Twitter is still relatively rare, and considered remarkable when it occurs. “Media Executives Talked About Facebook Regulation In Private Meeting,” reads a headline on BuzzFeed amid disputed reports the topic came up at a meeting of the publisher trade group Digital Content Next in early February. An anonymous source claims “one executive at the meeting likened Facebook to Big Tobacco.”

Perhaps in some corners, discussion of putting constraints on these monolithic companies still startles. But it is increasingly evident that across the Atlantic, lawmakers are preparing to act. The reasons include growing recognition of the social harms produced on these platforms and of the companies’ failure to regulate themselves. Those harms–from hidden Russian government propaganda, to terrorist and extremist content, to fake news–threaten the fabric and security of modern democracies.

Last week, U.K. Members of Parliament traveled to the United States to meet with experts on questions at the intersection of technology, media and democracy, ahead of a day receiving testimony from technology executives in Washington, DC. Dubbed the “Inquiry on Fake News,” the panel produced seven hours of pointed — sometimes heated — discussion on issues ranging from the role of companies like Facebook and Twitter in enabling propagandists, to questions about how recommendation systems can be gamed by bad actors, to the problems of algorithmic bias.

Despite little clarity from either the politicians or the executives on the specifics of what should be done, one thing was abundantly clear: as far as the House of Commons members are concerned, the age of unregulated social media is over. Consider the tone of these pointed questions:

Rebecca Pow, Conservative MP for Taunton Deane, to YouTube’s Juniper Downs: “I wonder whether you think the description of what you are and the whole name of your platform ought to be changed so that you take on more of the responsibilities of a bona fide newspaper and you have to apply broadcast and newspaper regulations to yourself… at the moment, you are unregulated.”

Simon Hart, Conservative MP for Carmarthen West and South Pembrokeshire, to Facebook’s Simon Milner: “You will have seen this week that the PM said that online platforms ‘are clearly no longer just passive hosts of the opinions of others’ and that she believes it is right that we look at the liability of social media companies for illegal content on their sites. That reflects comments made in other parts of the world as well. Do you read that as being that the age of unregulated social media is actually coming to an end?”

Brendan O’Hara, Scottish National Party MP for Argyll and Bute, to Facebook’s Monika Bickert: “Are you happy with the regulatory framework? I am not talking about self-regulation; I am talking about a regulatory framework. Are you happy with the regulatory framework as it currently exists? Given that the debate is now happening, where do you see that debate going to the benefit not just of Facebook, but of society in general?”

Julian Knight, Conservative MP for Solihull, to YouTube’s Juniper Downs: “…why has your self-regulation so demonstrably failed, and how many chances do you need?”

When asked what the technology companies fear during expert testimony later in the day, Frank Sesno, Director of the School of Media and Public Affairs at The George Washington University, had a simple answer:

Regulation. They fear that they will not be what they profess to be, which is technology companies and not media companies. They fear that they are going to be held to account for the content that they say they are merely facilitating and not producing. The most poignant observation is that they have this very strange, powerful, hybrid identity as media companies that do not create any of the content but should and must—to their own inadequate levels—accept some responsibility for promulgating it. What they fear most is regulation—a requirement to turn over their data. They fear that there will be Government regulators at different levels overseeing their businesses and that they will not be able to be the independent mega-corporations with the mega-revenue that they now generate.

In line with Sesno’s remarks, each of the technology executives pushed back on the British lawmakers, arguing that their companies are hardly unregulated. And while it is true that they are subject to a variety of data protection laws and must comply with a panoply of laws in hundreds of countries that govern questions such as how they work with law enforcement, they are still hardly accountable for the sorts of externalities we are seeing today. It seems right that democracies should demand more be done to address the scale of misinformation, propaganda, hate speech, dark political advertising and other vile content that flows freely across the platforms. Simon Hart referred to “regulation that is accountable, democratic and transparent.”

The running theme of these exchanges with the parliamentary members is clear: the question is no longer whether these companies should be subject to further regulation, but how, and with what goals in mind. In the UK and Europe, these questions are gaining steam. In the United States, it is time for the conversation to come out of the back room and into the public square. The British inquiry was in stark contrast to the November 2017 hearings on Capitol Hill, where House and Senate Intelligence Committee members never once uttered the word “regulation” while questioning Google, Facebook and Twitter’s attorneys.

Now is the time for these discussions to unfold, lest we end up with hasty rules that produce unintended consequences. Across the country, on university campuses, at industry conferences, and in other public forums, we urgently need to frame and discuss these issues. Frank Sesno summed up what is at stake: “What is very powerful and very prevalent now is to make this conversation as stark as it is and to put online what is on the line, which is whether we are going to have an informed or deformed public discourse and public process, and whether those companies are contributing to or subtracting from the democratic health that we value.”

Here are three areas where regulation could create useful mechanisms:

1. Greater transparency to governments and independent researchers

Right now, the technology companies operate with little scrutiny of their inner workings. It’s crucial that there is more transparency–both to governments and to independent researchers who can help society understand the consequences of these vast new communications platforms. This means access to data and to systems. This is complicated, but it is necessary if we are to ensure these technologies are, at the very least, a net benefit to society going forward. Discussions should look at how to create frameworks and mechanisms that permit such scrutiny to occur, “opening up the box,” as MP Paul Farrelly put it. University researchers, for instance, can serve as powerful partners to governments and technology companies in understanding these platforms and how society interacts on them.

2. Accountability to citizens

David Carroll, a researcher at The New School, points to “the importance of the British and European data protection model and the idea of a legal subject and a legal controller, which forms the basis of creating transparency.” These types of structures allow users to seek and receive information from technology companies, something that might raise the level of user trust. To Carroll, there is a “significance of being able to understand what your data that you get means, how it shapes your experience and how it can be an understandable piece of data in an understandable interface.”

Further, citizens should have the right to know when the technology companies make a mistake, or when breaches occur such as the Russian election interference campaign in 2016. How to regulate such disclosures and what should be required in the form of robust consumer protections is an area ripe for discussion.

3. Responsibility for addressing externalities

Facebook’s market cap isn’t far from that of ExxonMobil. Any company that reaches such scale produces some form of pollution or other negative externalities, and regulators often seek to make industry pay for them. While Google and Facebook have each invested in initiatives designed to address these kinds of externalities–particularly their impact on the news media–clearly much more is needed. Governments should explore what levy would be appropriate to place on these technology companies, and what types of activities it should finance.

As possible frameworks are explored, it is crucial that the technology companies are at the table. “I do not think that we should have state intervention that may be a knee-jerk response and does not react to the realities and challenges that come from these platforms at a scale that is hard even to imagine,” argued Claire Wardle, a research director at Harvard’s Shorenstein Center who joined the hearings. “I want the platforms to be part of the conversation, so that we can have an honest look at the issue. We never saw this coming. We did not think that, in 2018, we would be where we are today. We should not be the only ones who make these decisions and they should not be the only ones.” Facebook’s Monika Bickert agrees: “We would certainly want to be part of the dialogue, because from time to time we do see legislation that results in some unintended consequences that are not good for anybody,” she said.

It’s time these discussions start in earnest. Ultimately, it’s in the companies’ interest. As Kinsey Wilson, an executive at The New York Times, told the MPs, “in addition to regulation and possible trust action, it is also the prospect that what they have built is to some extent getting out of their control. Their reputation, if you look at how they score–generally Google and Facebook enjoy pretty good reputations–but that is taking a hit.” Comparisons to “Big Tobacco” may yet be a stretch, but it is time for democracies–and the tech companies–to come together and find a path forward. The scale of these new platforms, and the dramatic speed at which they have grown to dominate how billions of people get news and information, has redefined the public sphere.

The age of unregulated social media may be over, but thoughtful regulation is in the interests of everyone.


About the Author(s)

Justin Hendrix

Executive Director of NYC Media Lab. Opinions expressed here are entirely his own. Follow him on Twitter (@justinhendrix).