The bungled vote count at the Iowa caucus last month revealed the blazing incompetence of that state’s Democratic Party and Shadow Inc., the contractor it hired to design a vote-counting app. But it also revealed something far more troubling: deep suspicion and pervasive anger. Almost immediately after the announcement that results would be delayed, unfounded allegations proliferated on Twitter. Even blue-check Twitter users—people with verified identities and, often, affiliations with credible media institutions—quickly resorted to conspiratorial speculation about nefarious plots. Several high-profile Sanders surrogates claimed that the party was stalling because it was unhappy that results showed Bernie Sanders winning; others went a step further, suggesting that local party apparatchiks were outright rigging results for Pete Buttigieg. Some of these insinuations were retweeted by high-profile social-media accounts, including that of a sitting member of Congress.

Iowa wasn’t a one-off: After Joe Biden’s surprisingly strong performance in Tuesday’s primary, the hashtags #RiggedPrimary and #RiggedElection began trending on Twitter.

The key lesson from 2016 isn’t that Russia ran an online manipulation operation; it’s that, on an internet designed for sensationalism and virality, influence itself has evolved. When propaganda is democratized, when publishing costs nothing, when velocity and virality drive the information ecosystem, and when provocateurs face no consequences, literally everyone has the power to promote conspiracy theories and other forms of disinformation. Today, everyone is on alert for outside agitators ginning up unrest. But the most divisive activity in American politics is overwhelmingly homegrown.

I was one of the researchers who investigated the Internet Research Agency’s social-media manipulation tactics from 2014 to 2017; my team and I reviewed 10.4 million tweets from 3,841 Twitter accounts, 1,100 YouTube videos from 17 channels, 116,000 Instagram posts from 133 accounts, and 61,500 unique Facebook posts from 81 pages. Strikingly, only about 10 percent of the content that Russian trolls circulated during the three-year period was overtly political to the point of mentioning specific candidates; the rest was intended to galvanize people around group identities, to exacerbate distrust, and to sow social divisions around fundamental questions of who and what America is for.

Even in the 2016 influence operation, many of the conspiratorial and hyper-partisan tweets and memes that the trolls selected to power their outrage machine were created by Americans. The Internet Research Agency simply amplified them by reposting or rebranding them. Indeed, appropriating real content allowed the Russian meddlers to operate subtly—to the point that the extent of their influence stayed concealed for a full year after the 2016 election. Yet while Moscow’s trolls had convincingly pretended to be something they weren’t, other bad actors—most notably the Islamic State—had already quite visibly demonstrated the power of computational propaganda on social networks. This kind of manipulation was already becoming the new normal, and no one had any idea what to do about it.
