Perspectives

Putting A Thumb On The Market: The Rise Of State-Aligned Platforms From Repressive Contexts


This article initially appeared on the National Endowment for Democracy's Power3.0 blog.

Discontent with Elon Musk’s Twitter has led many to seek out an alternative. Some rushed to Mastodon, while over 500,000 Brazilians flocked to the India-based Koo over two days in November. But even before Twitter was put under new ownership, a larger exodus from U.S.-based platforms had been unfolding.

The growing interest in alternative, emerging platforms reflects people’s interest in—and the very real need for—a more diverse market. It also exemplifies another insidious trend: governments are asserting greater authority over traditional tech giants to coerce them into censorship and surveillance. And when those companies resist, officials prop up domestic alternatives that are more pliant to their whims. Freedom House’s latest publication, Freedom on the Net: Countering an Authoritarian Overhaul of the Internet, outlines how governments are breaking apart the global internet to create siloed information spaces that they can more easily control. Bolstering state-owned or -aligned homegrown platforms is a key pillar in this strategy.

The Risks Of State-Affiliated Platforms

India’s ruling Bharatiya Janata Party (BJP) has embraced Koo. The party’s engagement drove the platform’s popularity in the country after Twitter refused government demands to censor the accounts of journalists and activists during large protests against proposed agricultural reforms in 2021. Koo is privately owned and not affiliated with the state; however, researchers have dubbed it a “nationalist Twitter.” Koo’s co-founder has said that the platform would abide by local law and adhere to government demands. Herein lies a major concern: simply “following local law” can mean silencing political and social speech when a country’s legal framework is built for censorship.

Particularly in authoritarian contexts, companies that are owned by or have close ties to state actors are more willing to comply with censorship and surveillance demands and are more susceptible to becoming vehicles for state disinformation. State-affiliated platforms are often less transparent in their operations and policies, and may be better shielded from civil society advocacy, media investigations, democratic oversight, and other forms of public scrutiny. And if these alternatives grow in popularity, the political cost of blocking international platforms diminishes. Such censorship closes off essential civic spaces that people rely on to connect with loved ones abroad, organize politically, and hold the powerful to account.

A Problem Gone Global?

The Chinese Communist Party (CCP) has been the most successful by far at fostering domestic platforms beholden to it. Systematic censorship of foreign apps, which began over a decade ago, combined with robust investment in the local tech sector, created fertile conditions for the emergence of platforms like WeChat and Douyin (the Chinese version of TikTok). These platforms now dominate not only in China but also among the country’s large diaspora population.

Iran’s government has followed the CCP’s lead, using censorship and arduous regulation of foreign apps to boost its domestic platforms, though with far less success. Recently, one lawmaker admitted that a new draft User Protection Bill is intended to incentivize Iranians to stop using Instagram, one of the only foreign apps that remained accessible prior to the current anti-government protests.

In Russia, following the Kremlin’s brazen invasion of Ukraine, government blocks on Facebook, Twitter, and Instagram drove people to VK and Odnoklassniki, domestic platforms run by a parent company partly owned by Putin allies. The local search engine Yandex, a Google alternative, has allegedly hidden coverage of the invasion from its homepage. Moreover, officials this year reportedly offered to pay YouTube and TikTok influencers to switch to the alternatives RuTube and Yappy and to walk the tightrope of the Kremlin’s editorial line.

Heightened interest in domestic alternatives has emerged partly due to policy decisions by international platforms. Ethiopia’s Information Network Security Agency announced last August that it was developing a new platform to compete with Facebook, Twitter, WhatsApp, and Zoom, after complaining that foreign companies meddle in domestic politics and that Facebook’s content moderation obscures the “true reality.”

Several Turkish state officials moved their public WhatsApp accounts to the messaging app BiP, citing a WhatsApp privacy update under which more user data would be shared with parent company Meta. The app, which is growing in popularity in Bangladesh, Indonesia, Pakistan, and Bahrain, is owned by the mobile operator Turkcell, which is controlled by the state’s sovereign wealth fund. Uzbekistan’s government also announced the creation of a domestic social media service, couched as necessary to “protect privacy” despite the government’s own egregious privacy record.

Fostering A More Diverse Market With Responsible Platforms

Despite the risks these platforms present to human rights, simply blocking them would impose an even more profound restriction on free expression and access to information. Instead, democratic governments should set rights-based standards for all social media services, whether state-owned or privately owned.

Democratic policymakers should pass, and regulators should enforce, comprehensive privacy laws that minimize the data companies collect and limit how those data are shared with both state and private actors. Privacy laws should also regulate what personal data can be fed into platforms’ recommendation systems. Such legislation could limit the extent to which people can be microtargeted with ads or suggested posts based on certain characteristics, which, in turn, could limit the reach of state propaganda campaigns that rely on these systems.

Regulations bolstering transparency would also shed light on company operations. For example, laws can provide opportunities for vetted researchers to access data from large platforms, as the EU has done with its Digital Services Act. Strengthened transparency can provide insights for future policy development, spur legislative scrutiny over company practices, and inform civil society’s research and advocacy.

Democracies should also invest in competition policy. A more diverse market can incentivize a race to the top—and not to the bottom—when it comes to promoting democratic values. Such a market could encourage companies to protect fundamental rights and address challenges like disinformation, rather than running away from those vital efforts or simply doing the bare minimum. For instance, there is growing appetite for better privacy standards: in 2021, 83 percent of American voters wanted a federal data protection law. Competition policy could also benefit smaller or community-oriented platforms that may struggle to compete, such as Ahwaa, which serves the marginalized LGBTQ+ community in the Middle East. Policymakers should consider introducing data-portability and interoperability provisions, the latter of which would allow people to choose the services that best match their needs while still being able to communicate with family and friends across platforms.

New companies should apply lessons learned from the mistakes of today’s giants, including best practices in transparency, content moderation, and trust and safety. Such practices include resisting government demands to censor content or hand over personal data, engaging continuously with civil society in the countries in which they operate, and incorporating the content moderation standards known as the Santa Clara Principles.

There is no silver bullet for reducing the harms perpetuated by social media. But homegrown alternatives from authoritarian contexts will exacerbate the challenge and fuel new risks to human rights.