Over the next few months, voters in countries around the world will head to the polls in elections of enormous consequence. This period, however, faces a potential threat arising from beyond their borders. Microsoft's latest threat intelligence report finds that China has stepped up its use of highly advanced AI-generated content and disinformation to mislead the public about what is actually happening and to meddle in the democratic processes of other countries.

China Ramps Up AI

Election meddling is nothing new, but China's version, which employs methods such as deepfakes enabled by advanced AI, poses a serious challenge that threatens to distort the integrity of electoral processes. In its latest report, Microsoft's Threat Analysis Center found that, beyond the Chinese government itself, its affiliates have also begun deploying AI-generated content to spread their preferred narratives during foreign elections, with India, South Korea and the United States among the prime countries being targeted.

Ahead of the 2024 United States presidential election, remotely operated social media accounts linked to China have been actively polling American voters on divisive domestic issues. By studying responses to these controversial wedge issues, the operators sought to map where society is most divided and how those divisions can be exploited. According to Microsoft, this intelligence gathering is likely intended to tailor disinformation to particular groups of voters and thereby swing the vote in a preferred direction.

More concerning still, around Taiwan’s presidential election in January 2024, Microsoft observed China deploying AI-generated deepfakes and manipulated media for the very first time. Deepfakes use sophisticated neural networks to synthesize fake videos, photos and audio that appear realistic to the human eye and ear. By pumping out deepfakes depicting opposition leaders in compromising or scandalous situations, China likely hoped to de-legitimize their campaigns and influence the outcome of Taiwan’s democratic process.
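To make the underlying technique concrete, the sketch below shows the generator-versus-discriminator training loop behind GAN-style media synthesis, one common family of models used to produce deepfake imagery. It is a toy illustration on random placeholder data, not anything attributed to the campaigns described in Microsoft's report; the model sizes, learning rates and batch shapes are all assumptions.

```python
# Minimal sketch of adversarial (GAN-style) image synthesis in PyTorch.
# Toy sizes and random stand-in data; not a production deepfake system.
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 28 * 28  # assumed toy sizes (MNIST-scale images)

# Generator: maps random noise to a synthetic image vector.
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)

# Discriminator: estimates the probability that an image is real.
D = nn.Sequential(
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

# Stand-in for a batch of real training images, scaled to the Tanh range [-1, 1].
real_images = torch.rand(32, img_dim) * 2 - 1

for step in range(100):
    # 1) Train the discriminator to separate real images from generated ones.
    noise = torch.randn(32, latent_dim)
    fake_images = G(noise).detach()  # detach so this step only updates D
    d_loss = bce(D(real_images), torch.ones(32, 1)) + \
             bce(D(fake_images), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator into labeling fakes as real.
    noise = torch.randn(32, latent_dim)
    g_loss = bce(D(G(noise)), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Production systems use far larger architectures (and increasingly diffusion models rather than simple GANs), but the adversarial principle is the same: the generator keeps improving until its output can fool the discriminator, and, increasingly, the human eye.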

With India also gearing up for nationwide elections in 2024, Microsoft warns that similar tactics could be employed there. By churning out AI-generated memes, videos and audio tailored to hot-button issues in India, China aims to inflame social tensions and divisions among voters. As in the US, these operations would gather intelligence on what divides Indians most in order to target disinformation precisely where it cuts deepest.

While the immediate impact of such campaigns on actual election results remains difficult to determine, Microsoft notes that their effectiveness will likely increase over time as AI capabilities continue advancing at a rapid pace. Deepfakes and AI manipulation are already becoming more sophisticated, deceptive and harder to detect with each passing year. If left unchallenged, there is grave potential for such techniques to undermine the integrity of democratic elections and institutions worldwide.

Beyond elections, Microsoft’s report also sheds light on the expanding use of AI for geopolitical ends across East Asia. North Korea has notably relied on cryptocurrency heists and supply chain attacks to fund and advance its military and intelligence objectives, and has begun experimenting with AI to make those operations more effective. Even Russia, while not mentioned in the report, has emerged as a serial purveyor of disinformation campaigns that often leverage deepfake technology.

As the world’s leading democracies prepare to head to the ballot box amid rising authoritarian influence worldwide, safeguarding the integrity of the democratic process from AI manipulation is a challenge that cannot be ignored. Governments, tech companies and independent fact-checkers must build up defenses, detect deepfakes, trace disinformation back to its source and educate the public. With the presidential race already heating up in the US and elections looming in India, the stakes have never been higher to counter these emerging AI-driven threats to global democracy. The consequences of inaction could shake the very foundations of free and fair elections.
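As one illustration of what "building up defenses" can look like in practice, the hedged sketch below shows a single defensive building block: a binary real-versus-synthetic image classifier fine-tuned from a pretrained ResNet. The dataset, labels and hyperparameters here are placeholders; real detection pipelines combine many such signals with media-provenance checks and human review.

```python
# Hedged sketch of a "real vs. AI-generated" image classifier in PyTorch.
# Placeholder data stands in for a labeled deepfake-detection dataset.
import torch
import torch.nn as nn
from torchvision import models

# Start from a pretrained backbone and replace the final layer
# with a two-class head: 0 = authentic, 1 = AI-generated.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# Placeholder batch standing in for labeled image frames.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

# One fine-tuning step on the placeholder batch.
model.train()
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()

# At inference time, each frame gets a probability of being AI-generated.
model.eval()
with torch.no_grad():
    p_fake = torch.softmax(model(images), dim=1)[:, 1]
```

A classifier like this is only one layer of defense: scores per frame can be aggregated across a video, combined with audio and metadata analysis, and paired with provenance tracing to follow disinformation back to the accounts spreading it.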