This week, Facebook hit the kill switch on news for the 17 million Australian users of its platform. Its action was in response to a proposed law advancing through the Australian Parliament that aims to level the playing field for traditional media organizations in the online environment. But Facebook did not just block traditional media outlets. Thanks to an overbroad definition of “news,” Facebook Pages ranging from hotlines for survivors of sexual assault and domestic violence, to Suicide Prevention Australia, to the national weather service went dark.

There is plenty to critique in the proposed Australian law. Facebook claims the law “misunderstands” the company’s relationship with news organizations. Certainly, the law would force Facebook to change aspects of its business model. It would require the company to pay news organizations for content displayed on the social media platform, with the stated goal of addressing the power imbalance between Australian news media businesses and major digital platforms. Digital rights activists, however, make a compelling argument that, despite claiming to support journalism, the law as written risks doing little more than shift advertising profits (derived from extracting users’ personal data) from big tech shareholders to big media shareholders. This view is consistent with the influence that media mogul Rupert Murdoch had in advancing the inquiry that laid the groundwork for the law.

Nonetheless, the proposed law arises from a two-year process of public consultation and has support from parties across the political spectrum. While Facebook had threatened this drastic move last year, it implemented the news blackout without advance warning, and in a heavy-handed way that left some of Australia’s most vulnerable communities reeling.

As one example, it is bushfire season in Australia. This time last year, fires raged across 40 million hectares of land, destroying thousands of homes and killing scores of people. One of the key ways that fire warnings reach remote communities is through local fire department Facebook Pages, many of which disappeared this week as a result of Facebook’s action.

Another illustration of the irresponsibility of Facebook’s action stems from the fact that Australia is about to launch its Covid-19 vaccination campaign. Next week, vaccines are set to be administered to high-risk groups, including in Indigenous communities beset with mistrust of the Australian healthcare system due to decades of systemic racism. Many in these communities trust only information shared through Indigenous media groups, including those that publish on Facebook Pages that are now blocked. As the Chair of Australia’s First Nations Media, Dot West, put it, “Never has our media been more vital than during a global pandemic, especially on the cusp of vaccination rollouts.”

For traditional media with nationwide reach, Facebook’s draconian response to the proposed law may well be a boon. Within hours of Facebook’s action, the Australian Broadcasting Corporation displayed a banner across all its news articles: “Missing our news on Facebook? Get the latest news … with the ABC News App.” Australians can of course go directly to traditional media, or download the app for any given publication, instead of accessing news through Facebook’s platform. And Facebook’s actions this week may lead more Australians to bypass the platform altogether going forward.

But these potential upsides for traditional media organizations do not outweigh the immediate harm done to hundreds of community groups and local and non-profit media outlets that have built up their audiences on Facebook itself. Convinced by Facebook’s pitch that its platform enables marginalized groups to bypass the gatekeeping function of major media players, these groups have devoted thousands of volunteer hours and more to building Pages that Facebook unilaterally shut down without notice. One support group for survivors of sexual abuse by clergy had five years’ worth of work on its news archive vanish, literally overnight.

In response to the outcry over the collateral damage caused by Facebook’s action, some speculated that the shuttering of vital public service information may have been the result of algorithmic blocking unable to distinguish news from other content. Facebook, however, acknowledged that its decision to block content using an overbroad definition of news was, in fact, intentional. And it justified the decision by reference to the (yet to be passed) Australian law.

“As the law does not provide clear guidance on the definition of news content, we have taken a broad definition in order to respect the law as drafted,” a company spokesperson explained. “Respect” was an odd word choice in this context. Any company should, of course, respect the law, despite disagreements with it. But there was no obligation for Facebook to “respect” a proposed piece of legislation. It was simply a choice on the company’s part to implement its blackout in this way.

Facebook’s spokesperson added that the company would reverse the blackout of Pages that were “inadvertently impacted,” which suggests that algorithmic error may also have been part of the story. But the company gave no indication of the process by which Pages caught up in the purge would be restored, leaving community groups unclear about whether they had to make individual petitions to the U.S.-based company, or whether this was an issue that Facebook would undertake to correct itself. To date, some non-news Pages have been restored and others have not.

Beyond the immediate havoc the blackout has caused for Australians, Facebook’s decision to protest the proposed law in this manner has ramifications for Facebook users worldwide. First, for Facebook users in a functioning democracy, the Australian blackout shows that Facebook is willing to treat people, including marginalized populations, as pawns to advance Facebook’s own vision of what a law should look like. The company does not appear to respect the right of a country’s citizenry to shape its own laws, even misguided ones, through a democratic process. At a minimum, this gives the lie to the many public speeches by Mark Zuckerberg proclaiming his vision of Facebook as a tool to strengthen democratic ideals.

The lesson for Facebook users who have no say in the way their state regulates online content is even more disquieting. As the Myanmar military systematically pushed anti-Rohingya propaganda out on Facebook’s platform to fuel its genocidal campaign, Rohingya survivors and their allies urged Facebook to remove the inciting content. Facebook has since acknowledged it was “too slow to prevent misinformation and hate” on its platform in Myanmar. But the company’s explanations, focused mainly on its technical and linguistic limitations in distinguishing hateful propaganda from lawful content, are now cast in a different light. As the Australian blackout shows, Facebook is willing and able to react at lightning speed to remove even unquestionably lawful content from its platform when its business model is at stake.

Photo credit: In this photo illustration the 9News Facebook site is seen blank, on February 18, 2021 in Melbourne, Australia (Robert Cianflone/Getty Images)