Tracking Shows Russian Meddling Efforts Evolving Ahead Of 2018 Midterms

Russian President Vladimir Putin delivers a speech in Moscow's Red Square last May. Russian-backed efforts attempting to interfere in U.S. politics appear to be evolving.
Yuri Kochetkov / AFP/Getty Images

Secretary of State Rex Tillerson sounded an alarm this week: The Russians are already meddling in the 2018 midterm elections.

"The point is that if their intention is to interfere, they're going to find ways to do that," Tillerson told Fox News. "I think it's important we just continue to say to Russia, look, you think we don't see what you're doing. We do see it, and you need to stop."

A new poll shows that a clear majority of Americans believe Russia will try to meddle in the next U.S. election. But Tillerson also noted that Russia's tactics for interfering in U.S. politics are constantly changing. A bipartisan effort is shedding new light on how Russian methods evolve.

Hamilton 68 is a project that has been tracking Russian influence networks on Twitter for more than six months, watching as Kremlin-linked bots and social media amplifiers seek to inject themselves into national controversies, like #ReleaseTheMemo, the Charlottesville white nationalist rally and the NFL kneeling controversy.

Bots are automated accounts that repeatedly churn out tweets. They can be used to support a narrative or retweet other messages to falsely amplify a message.

The project has identified 600 Twitter accounts linked to Russian influence operations, monitors them continuously and updates the results through an online dashboard. The tracking effort is part of the Alliance for Securing Democracy, a bipartisan initiative hosted by the German Marshall Fund of the United States.

After more than half a year of tracking Russian influence on Twitter, the project's staff members have seen deeper trends that reveal how the Russian government tries to get its own message out to more Americans. In addition to fanning controversies, Hamilton 68 noticed that the bots are trying to expand American exposure to Russian foreign policy propaganda, in part by slipping it into innocuous conversations.

Amplifying #ReleaseTheMemo and #TakeAKnee

The bread and butter of Russian information operations remains an effort to divide Americans. Russian bots weren't commenting on the NFL, Charlottesville or DACA because they had a stake in the outcome of the political debate, according to those tracking the activity.

"These are not networks that are necessarily always traditional propaganda. ... A lot of it is just trying to rip apart Americans, to sow chaos within our political system, to pit Americans of both parties against each other," said Jamie Fly, a director for the Alliance for Securing Democracy and a former foreign policy aide for Sen. Marco Rubio, R-Fla. "And so a lot of the issues that we see these networks push are actually part of our day-to-day domestic political debates and often have nothing to do with foreign policy."

There has also been a massive increase in the amount of chatter that promotes mistrust of American institutions, especially the idea of an American "deep state," a supposed conspiracy of government officials working to undercut the president.

The deep state narrative was the focus of between 5 and 10 percent of the weekly content on the Russia-linked influence network in October, when Hamilton 68 began its tracking. Last week, it represented 38 percent of the articles linked to by those accounts.

"That's gone from being a sort-of ripple beneath the surface to now that's risen up to where it's typically the top thing we're seeing week to week," said Bret Schafer, the analyst for the project who updates the dashboard each day.

Recruiting eyeballs for Russian propaganda

A closer look shows that the Russian influence networks are using intriguing techniques to amplify their reach and change the types of content that Americans read.

"If you look at the dashboard day to day, it tends to follow the topic du jour that's in the general news cycle. So you'll see conversations about DACA or 'release the memo' or NFL protests, so you get caught at looking at the granular level and lose focus of the big-picture narratives," Schafer said. "But when you zoom out and look at what's happened over the past six months, you start seeing some patterns start to emerge."

Oftentimes bots will begin tweeting about totally innocent topics, such as a sporting event or a trending Twitter topic, as a way of expanding their audience.

Schafer said he noticed the trend when he saw a number of the monitored accounts start tweeting about #MondayMotivation and #WednesdayWisdom, popular hashtags used by many American Twitter users, mixed in with the content about Syria and Ukraine that Russian bots frequently push in defense of Russia's interventions in those countries.

"We look at that as being the recruitment topic," Schafer told NPR. "If they started shouting about Syria and Ukraine ... less people would see it, and it would be less effective."

Still searching for solutions

Still, the bots don't have a unified partisan slant, even though U.S. intelligence agencies concluded that Russian influence operations were aimed at supporting President Trump in the 2016 campaign.

"The reality is that they also turn against Republicans," Fly said. "They have gone after administration officials like [national security adviser H.R.] McMaster. They've attacked [House] Speaker [Paul] Ryan, [Senate Majority] Leader [Mitch] McConnell. They've attacked Republican Sens. Marco Rubio, [Jeff] Flake, Sen. [John] McCain on a regular basis. No one in American politics is immune from the reach of these networks."

That is why Fly has teamed up with Laura Rosenberger, who was Hillary Clinton's foreign policy adviser during the 2016 presidential campaign. Together, they're co-directors of the Alliance for Securing Democracy's efforts to track and push back against Russian meddling in the American political system.

"The bipartisan aspect of what we're doing is actually strategically important," Rosenberger said, pointing to the apparent Russian goal of increasing political division.

Even on Capitol Hill, where bipartisanship is the exception rather than the rule, there's a bipartisan understanding that Russia meddled in the American political process during the 2016 presidential election.

Rosenberger pointed to the "nearly unanimous" vote in Congress to impose sanctions on Russia as evidence, although the White House recently said it will not implement the Russia-related sanctions Congress required, arguing they are not necessary at this point.

Still, those on both sides of the aisle understand that there has been no letup.

"The Russian effort to undermine our democracy did not end with Election Day in 2016 — these tools, whether creation of fake accounts, or the use of automated bots, continues on an ongoing basis," Virginia Sen. Mark Warner, the top Democrat on the Senate intelligence committee, told NPR.

"This is not just an elections issue," added Fly, the former Rubio aide. "The reality is that Russian interference never stopped."

But even Warner, who has a front row seat at briefings by the intelligence community, acknowledges that there are as yet no clear legislative answers for how to address Russian influence operations on social media networks.

He has proposed legislation that would require social media networks to disclose the source of paid political ads. But he admits that addresses only a small part of how Russia influences the American political debate, and it puts the onus on tech companies to disrupt automated bots.

"This is really an ongoing national security issue, and I don't think we have come up with a legislative or policy solution yet that fully gets it right," Warner said. "The social media companies really need to work in partnership with us in a way that we get this right, that people can know if an account is being manipulated by a foreign entity. And if it's being pushed by automated bots, I think a user has the right to know that information."


Tim Mak is NPR's Washington Investigative Correspondent, focused on political enterprise journalism.