How foreign influence campaigns manipulate your social media feeds
Russians, Chinese, Iranians, and Israelis are trying to change your beliefs.
Foreign influence campaigns, or information operations, have been widespread in the run-up to the 2024 US presidential election. Influence campaigns are large-scale efforts to shift public opinion, push false narratives, or change behaviors among a target population. Russia, China, Iran, Israel, and other nations have run these campaigns by exploiting social bots, influencers, media companies, and generative AI.
At the Indiana University Observatory on Social Media, my colleagues and I study influence campaigns and design technical solutions—algorithms—to detect and counter them. State-of-the-art methods developed in our center use several indicators of this type of online activity, which researchers call coordinated inauthentic behavior. We identify clusters of social media accounts that post in a synchronized fashion, amplify the same groups of users, share identical sets of links, images, or hashtags, or perform suspiciously similar sequences of actions.
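One of the signals above, accounts sharing identical sets of links, can be sketched with a simple set-similarity check. This is a minimal illustration, not the observatory's actual pipeline; the account names, data shape, and threshold are assumptions for the example.

```python
from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two sets (0 = disjoint, 1 = identical)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def suspicious_pairs(shared_links: dict[str, set[str]], threshold: float = 0.9):
    """Return pairs of accounts whose shared-link sets overlap above threshold."""
    return [
        (u, v)
        for u, v in combinations(sorted(shared_links), 2)
        if jaccard(shared_links[u], shared_links[v]) >= threshold
    ]

# Hypothetical accounts: two post nearly identical link sets, one does not.
accounts = {
    "acct_a": {"example.com/1", "example.com/2", "example.com/3"},
    "acct_b": {"example.com/1", "example.com/2", "example.com/3"},
    "acct_c": {"news.example/x"},
}
print(suspicious_pairs(accounts))  # [('acct_a', 'acct_b')]
```

In practice a detector would combine several such signals (timing, retweet targets, hashtags) rather than rely on any single one, since ordinary users occasionally share the same links by coincidence.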
We have uncovered many examples of coordinated inauthentic behavior. For example, we found accounts that flood the network with tens or hundreds of thousands of posts in a single day. The same campaign can post a message with one account and then have other accounts that its organizers also control “like” and “unlike” it hundreds of times in a short time span. Once the campaign achieves its objective, all these messages can be deleted to evade detection. Using these tricks, foreign governments and their agents can manipulate the social media algorithms that decide what is trending and what is engaging, and thus what users see in their feeds.
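The flooding behavior described above, an account posting at a volume no human could sustain, is one of the simpler patterns to flag. Below is a minimal sketch under assumed inputs; the daily threshold and data format are illustrative, not a real detection criterion.

```python
from collections import Counter
from datetime import date

def flag_flooders(posts: list[tuple[str, date]], max_daily: int = 1000):
    """Return accounts that exceed max_daily posts on any single day.

    `posts` is a list of (account, day) pairs, one per post.
    """
    per_day = Counter(posts)  # (account, day) -> number of posts
    return sorted({acct for (acct, day), n in per_day.items() if n > max_daily})

# Hypothetical data: one account posts 1,500 times in a day, another 5 times.
posts = (
    [("bot_1", date(2024, 10, 1))] * 1500
    + [("user_2", date(2024, 10, 1))] * 5
)
print(flag_flooders(posts))  # ['bot_1']
```

Deleted posts complicate this in practice, as the article notes: a campaign that removes its messages after the fact leaves little volume to count, which is why detection also relies on coordination signals captured while the activity is live.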