This year, billions of people will head to the polls in over 50 countries. Online information manipulation was present in previous elections, but will be further enabled by the proliferation of artificial intelligence (AI). Malign actors will continue to exploit new technologies to delegitimize the voting process and erode democratic values. 

Mainstream social media platforms like Facebook, Instagram, YouTube, and X (formerly Twitter) are rife with election mis- and disinformation, yet these platforms are still rolling back safeguards meant to curb harmful online content, including disinformation. Small and emerging alternative platforms, such as Telegram, Mastodon, and Bluesky, are even less equipped to preserve information integrity during elections. Without oversight, these platforms could become the vector for the next generation of information manipulation. However, civil society, academics, and investors can work toward open source governance, monitoring, and accountability to combat information manipulation and protect democracy.

The Specter of Emerging Platforms

Information manipulation online is an increasingly decentralized problem, as users move between mainstream and emerging platforms. This decentralization, while it benefits users by offering a more diverse set of platforms, makes election misinformation more difficult to track and prevent.

Many platforms, including the world's largest, do not have sufficient content moderation. In the lead-up to Kenya's 2022 elections, the Mozilla Foundation found TikTok to be a conduit for fast- and far-spreading political disinformation. Malign actors exploited civilians' fear of a repeat of Kenya's history of post-electoral violence to intimidate voters and exacerbate political tensions. Platforms' content moderation also varies by region, often tracking the size of their user base in a given country.

Other platforms fly even further under the radar. Companies based outside of the United States, like Kakao, Line, and Viber, have large user bases concentrated in specific countries and regions. However, they do not operate under the same content moderation rules as the major mainstream platforms: their policies are developed in an ad hoc manner, reflect more permissive community standards, or are deficient on hate speech, disinformation, and other concerns.

The structural changes at X have created opportunities for other platforms, with Mastodon and Bluesky emerging as alternatives. Social media-adjacent apps like Discord, the audio-based chat app Clubhouse, and Twitch, as well as other social networks like Gab (another X alternative), have also become more prominent and have influenced the flow and spread of information. None of these, however, has reached the critical mass of users that X and other legacy social media companies command, meaning the future online landscape will likely remain distributed across various platforms and online communities.

In the lead-up to Iran's 2021 presidential elections, Clubhouse became a forum for Iranian electoral debates, with presidential candidates leveraging the app's online "rooms" of up to 8,000 participants to discuss their political objectives. Clubhouse's user policies do not protect user privacy: accounts are tied to phone numbers, and therefore, in Iran, to national IDs. Iranian government actors' use of the app showed why social media companies need to be prepared from the beginning, with better coordination among internal teams and ready, synchronized response mechanisms informed by strong data gathering on influence operations, trends, and malign actors.

Telegram became the dominant messaging platform during Brazil's 2022 presidential election. Researchers at the Brazilian Internet Lab found disinformation campaigns on Telegram calling for military intervention and questioning the verified election results. Telegram reacted to these problems only after the Brazilian Supreme Electoral Court ordered the app blocked, a standoff that ended in a memorandum of understanding between Telegram and the government laying out terms for coordination, notification of issues, and responses to requests to remove content under Brazilian law.

But this outcome was unique, the product of a particular confluence of factors in the Brazilian context, and is unlikely to provide a repeatable model for global content moderation. So how can stakeholders engage with these and other, often smaller platforms (many of which resist engagement from civil society, the public sector, or election management bodies) on content moderation, data access, influence operations, and related subjects? Just as the platforms themselves are now diffuse, the responses will likely need to be multifaceted as well.

Helping Platforms and Communities to Make a Positive Impact

Civil society, academics, and investors should work to ensure that social media platforms, including alternative and small-scale ones, have a positive impact on democratic processes rather than serving as vectors for information manipulation.

The digital governance community, from trust and safety specialists to policy leads to company leadership, needs to foster open conversations about platform accountability and provide policy and technical oversight to ensure that new and emerging platforms' policies and products align with democratic values. This work can support platforms even in countries where legal regulation is limited. The following recommendations are interdependent and holistic. They aim to (1) make providing democratic safeguards easier and less costly through open source governance; (2) provide accountability mechanisms through tools for information integrity and human rights impact assessments; and (3) create strong incentives for platforms to uphold democratic norms from the beginning, applied through investors and the services essential to their operation.

Use Open Source Governance to Promote Democratic Norms

Depending on platforms' values, priorities, resources, and business structures, it might be difficult for them to build and sustain effective trust and safety teams. The Internet community, civil society, software developers, and other platforms can create working groups to help construct open source governance. Wikipedia and Reddit offer examples of this type of governance. For many years, Wikipedia has been partially managed through a board, including representatives of its community of moderators, editors, and authors, that helps define its policies. Reddit's community of moderators also played a major role in its development and success, although that model is at risk as the company looks to go public.

To evaluate the success of open source governance, or to act where platforms are not interested in cooperating, those interested in democratic social media can turn to open source impact assessments.

Academic centers such as Indiana University's Observatory on Social Media have created dashboards that track and report on the top disinformation spreaders. Open source human rights impact assessments can be used by civil society and local communities to measure a platform's impact on elections and evaluate its response to mitigate online harms. These assessments can be used to advocate for changes in policies and moderation (removing content or making it more visible), to inform regulatory approaches, and to educate the world about what is happening in elections, political processes, and societies more broadly.
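To make the mechanics concrete, the core of such a dashboard is often a simple tally of which accounts most frequently share links to known low-credibility domains. The sketch below is a minimal, hypothetical illustration: the share records and flagged-domain list are stand-ins for the kind of data a monitoring project would actually collect.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical inputs: (account, url) share records gathered from a platform,
# plus a list of low-credibility domains maintained by fact-checkers.
shares = [
    ("@spreader_one", "https://fakenews.example/voting-machines-hacked"),
    ("@spreader_one", "https://fakenews.example/ballots-burned"),
    ("@casual_user", "https://reputable.example/official-results"),
    ("@spreader_two", "https://hoax.example/polls-closed-early"),
]
low_credibility_domains = {"fakenews.example", "hoax.example"}

def top_spreaders(shares, flagged_domains, n=10):
    """Count how often each account links to a flagged domain."""
    counts = Counter(
        account
        for account, url in shares
        if urlparse(url).netloc in flagged_domains
    )
    return counts.most_common(n)

print(top_spreaders(shares, low_credibility_domains))
# [('@spreader_one', 2), ('@spreader_two', 1)]
```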

Civil Society and Multi-Stakeholder Coalitions: Raise Awareness of the Risks Posed by Under-Scrutinized Platforms

Civil society organizations can conduct systemic risk assessments and act as a network for crisis protocol management, tackling electoral issues reported by local people on the ground, such as false information about polling sites or voting results. The Meta Oversight Board, the Christchurch Call to Action, and the NGO Article 19's concept of a Social Media Council all represent versions of this idea. Each offers a network that can exercise formal or informal multistakeholder oversight of the platforms. That oversight can operate with the cooperation of the technology companies, or as an external public mechanism, a kind of audit, that surfaces what is going wrong on these platforms.

Build an Investor Profile that Aligns with Human Rights Principles

Human rights-minded investors and donors could require start-ups to meet certain criteria based on core information integrity principles, such as committing to transparency around takedowns, building strong content moderation systems to safeguard information integrity, and ensuring user privacy. An established pool of such investors would not only help accomplish this goal but also let those investors diversify their portfolios.

After a recent incident in which a group of users adopted racial slurs as their usernames, the X alternative Bluesky announced it would invest in its trust and safety team, acknowledging that it should have considered stronger policies to curb hate speech from the start. Some reporting indicated the move came after a push from investors.

OpenAI followed the principled investor model with its early board's commitment to human rights, but with that board's dissolution, the new board's commitments are less clear.

Leverage “Democracy Bots” and Encourage Community Participation

Bots are often used as quick and easy tools to spread disinformation, but with the proper programming they can also serve beneficial purposes. Civil society organizations and academia can work with software developers to build "governance bots" or "democracy bots," with democratic principles directly incorporated into the bot design process. Democracy bots can flag inauthentic political accounts and deepfake content, or give users information about moderation activities on a platform.
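As a minimal sketch of the first of those tasks, the toy heuristic below scores an account for bot-like behavior using a few simple signals: account age, posting rate, and follower ratio. The features, thresholds, and account data are illustrative assumptions, not a validated detection method; production tools such as Botometer rely on far richer feature sets.

```python
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    days_old: int
    posts_per_day: float
    followers: int
    following: int

def bot_likelihood(acct: Account) -> float:
    """Toy heuristic combining a few bot-associated signals into a 0-1 score.
    Features and thresholds are illustrative assumptions, not validated values."""
    score = 0.0
    if acct.days_old < 30:                 # very new account
        score += 0.3
    if acct.posts_per_day > 50:            # inhumanly high posting rate
        score += 0.4
    if acct.following > 0 and acct.followers / acct.following < 0.1:
        score += 0.3                       # follows many, followed by few
    return min(score, 1.0)

# Hypothetical account data for illustration.
suspect = Account("@vote_truth_2024", days_old=5, posts_per_day=120,
                  followers=12, following=800)

if bot_likelihood(suspect) >= 0.7:         # assumed review threshold
    print(f"Flagging {suspect.handle} for human review")
```

Consistent with the democratic principles it is meant to embody, a bot like this would flag accounts for human review and publish its criteria, rather than removing anything automatically.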

Platforms can incentivize these developments by rewarding developers with recognition for meeting specific criteria and behaviors that align with agreed democratic principles. Building trust and safety into these programs will require cooperation from all parties, so platforms would benefit from joining a multistakeholder coalition to vet and assess these democracy bots.

For instance, Discord developed its own bot developer platform where users can create new ways for admins and communities to engage on channels. It also created a program called "Linked Roles," through which Discord community members can develop new apps for engaging with the community. Discord could incentivize these developers to uphold democratic principles.
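To illustrate how such an app could build in transparency, the sketch below uses the widely used discord.py library to answer a hypothetical "!modpolicy" command with the channel's moderation criteria. The command name and policy text are assumptions for illustration, not features of Discord's platform.

```python
# Minimal Discord transparency bot sketch using the discord.py library.
# The policy text and the "!modpolicy" command are illustrative assumptions.
import discord

MODERATION_POLICY = (
    "This channel removes verified-false claims about polling places "
    "and voting procedures. Appeals: message the moderators."
)

intents = discord.Intents.default()
intents.message_content = True  # required for the bot to read message text
client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message):
    if message.author == client.user:
        return  # ignore the bot's own messages
    # Any member can ask how this channel is moderated.
    if message.content.strip() == "!modpolicy":
        await message.channel.send(MODERATION_POLICY)

client.run("YOUR_BOT_TOKEN")  # placeholder token
```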

Encourage Coalitions to Join Existing and Develop Future Democratic Alternatives

Civil society and multi-stakeholder coalitions can advocate for alternative platforms that align with democratic values. New social media platforms should also be encouraged to join existing groups that advocate for user rights and for the democratic principles of transparency and accountability, privacy, and good governance.

The Global Network Initiative provides one such example. It is a multistakeholder network of tech and social media companies, telecoms, civil society organizations like the National Democratic Institute, and academics that agree on certain principles on transparency, privacy, and freedom of expression. Members can review and evaluate company policies to ensure compliance and to call out noncompliance. Emerging companies should consider joining these mechanisms, or even forging new ones that reflect the ever-evolving technology landscape, for instance to cover AI.

Set Guidelines Through App Stores' Content Governance

Tech companies that provide "curated" app stores have some power over the very apps they host. Apple's App Store, for example, requires apps to comply with local regulations and obliges them to have content governance mechanisms in place. In 2015, because of requests made by Apple and Google, Telegram added its first terms and conditions about content, requiring: "no calls for violence, no porn and no copyright infringement on public broadcast channels."

However, using app stores as the sole enforcement mechanism for content governance rules and democratic principles has its shortcomings. An app store's removal of an app can be a disproportionate response that cuts off access for all of the app's users. Imposing requirements through app stores could also lead apps to comply passively with a checklist without demonstrating real preparedness. For example, as Telegram's CEO noted, when Apple and Google asked for a content policy, Telegram added the simplest terms of service theoretically possible in an app.

Conclusion

The influence of online platforms on elections and democracy worldwide extends beyond the realm of giant tech companies. The proliferation of small and emerging platforms has made the task of governance even more daunting. As the digital community navigates this intricate landscape, a collaborative effort involving platform operators, civil society, investors, and users will be essential in shaping the future of online democracy while safeguarding the values that underpin it. 

Authoritarian countries are building their own apps, platforms, and online ecosystems that do not respect these rules, and they will shape our online world if we do not respond with alternatives driven by democratic and human rights principles. There are natural allies in this effort, including in the private sector: app stores, payment companies, and others that can help set the rules and online norms. By addressing the unique challenges posed by the diversity of online platforms and advocating for democratic principles, we can strive for a more inclusive and democratic digital ecosystem.