Tech Giants Crack Down on Influence Campaigns

In a significant move highlighting the increasing prevalence of manipulated content targeting India, two major tech companies, Meta and OpenAI, have taken substantial actions against accounts engaged in covert influence operations. These operations aimed to influence public debate on critical issues, including the general elections in India and matters concerning the Sikh community.

OpenAI’s Actions Against Israeli Campaign

OpenAI, the US-based artificial intelligence company behind ChatGPT, disclosed in a recent report that it had disrupted a covert influence campaign orchestrated by an Israeli firm. The campaign used OpenAI’s models to create fake social media personas and generate content related to the Indian elections, including material critical of the Bharatiya Janata Party (BJP), which was then disseminated across multiple social media platforms.

The report, titled ‘AI and Covert Influence Operations: Latest Trends,’ is OpenAI’s first on the subject. It identified the operator as STOIC, an Israeli political campaign management firm that was generating content about the Gaza conflict, the Histadrut trade union in Israel, and the Indian elections. Content created by the network was posted on platforms including X (formerly Twitter), Facebook, Instagram, and YouTube.

The operation used AI models to create fictional social media personas and to research individuals in Israel who had commented publicly on the Histadrut. OpenAI emphasized that its models did not supply personal information in response to such prompts. The AI-generated content included profile pictures created with generative adversarial networks (GANs), tools that are readily available online. Despite these techniques, the campaign attracted little engagement.

Meta’s Actions Against Chinese Network

Simultaneously, Meta, which owns Facebook, Instagram, and WhatsApp, reported removing a network of accounts, pages, and groups for violating its policy against “coordinated inauthentic behavior” (CIB). The network originated in China and targeted the global Sikh community, including members in Australia, Canada, India, New Zealand, Pakistan, the UK, and Nigeria.

In total, Meta removed 37 Facebook accounts, 13 pages, five groups, and nine Instagram accounts. The activity spanned multiple social media platforms and involved clusters of fake accounts. One such cluster had links to an unattributed CIB network from China, targeting India and the Tibet region, which Meta had disrupted in early 2023.

Shared Efforts and Industry Cooperation

OpenAI noted that it shared threat indicators with industry peers. Despite their sophisticated use of AI, none of these campaigns achieved significant audience engagement or reach through OpenAI’s services. In May, OpenAI disrupted the activity focused on the Indian elections less than 24 hours after it began, just as the network started generating comments criticizing the ruling BJP and praising the opposition Congress party.

These measures underline OpenAI’s commitment to preventing the misuse of its AI models for covert influence operations. Meta’s swift action against the Chinese network likewise reflects its ongoing efforts to curb coordinated inauthentic behavior; by identifying and removing accounts involved in such activities, the company aims to maintain the integrity of public discourse on its platforms.

These actions by OpenAI and Meta underscore the critical role of tech companies in safeguarding democratic processes and public debate from covert influence operations. By disrupting these networks, the companies have demonstrated their vigilance and commitment to maintaining the authenticity of online content, particularly concerning sensitive political and social issues. The continued cooperation among industry peers is essential in tackling such sophisticated and coordinated threats effectively.