COLUMBUS, Ohio — Tammy Texoma’s Twitter feed shows lots of love for President Donald Trump, and lots of contempt for the president’s detractors.
Her highly partisan online missives even caught the attention of Newsweek last fall, which merely identified her as “a Twitter user.”
But a deeper dive reveals much more about this social media account — and provides a glimpse into a dark cyberworld of shadowy figures trying to influence American voters during another presidential campaign.
The same photo of the smiling, short-haired blonde that appears in “Tammy’s” Twitter profile also has been used for a porn actress named Kelsey Hess, a Columbus Dispatch reverse-image search shows. And she shows up again as Kiara Owens on a Russian website that advertises for dates with Russian speakers.
In recent years, she also has been Dazi, ReginaLover120, Rose, Candy Lucas, Sandra Smith or Heather Benard, depending on the web link.
Many of those connections are no longer active, but one that remains live takes users to explicit poses of the same woman — except that in this account, she is “Patience” with a Las Vegas escort service.
Though “Tammy” seems unlikely to influence the 2020 U.S. election, the ease with which such disinformation can be disseminated – and even fool credible news organizations – shows how little has changed since Russia tried to influence the 2016 presidential election.
Despite improvements during the past three years, we’re not ready for what’s coming, say authors of a November report assessing Russia’s online tactics.
“It is quite possible that these exact techniques will be used again. And why shouldn’t they? We’ve done almost nothing to counter the threat,” the Stanford University study concluded.
Dariya Tsyrenzhapova, a data journalist and University of Texas doctoral student who grew up in Kazakhstan, has an even gloomier forecast for this year’s elections: “I think the disinformation campaigns are likely to become more sophisticated.”
Politics professor Joshua Tucker, director of New York University’s Jordan Center for the Advanced Study of Russia, worries that, as America works to prevent “another 2016,” Russia will stay a step ahead by moving to other platforms this year.
He warns of an even greater threat: Now that U.S. political operatives have seen the Russian toolkit firsthand, domestic actors could become the biggest source of disinformation in what is likely to be a bitter presidential campaign.
Fakes and deep fakes
Disinformation takes many forms, as events in recent weeks underscore. Some exhibit coordinated, malicious intent; others seem more a consequence of everyone having an online megaphone.
As if the Feb. 3 Iowa caucuses needed disinformation on top of the state’s vote-counting woes, a conservative group conflated data and raised the possibility of voter fraud – a charge that was retweeted by Fox celebrity Sean Hannity.
The ensuing furor caused Iowa’s GOP secretary of state to demand that the group stop spreading false claims as people were preparing to vote.
With critics calling reforms by social media companies inadequate, Twitter suspended the account of Andrew Ogg (@MuchoGucci) after he tweeted a lie on the afternoon of the caucus saying “Confirmed Reports About @PeteButtigieg Officially Suspending his Campaign Hours Before Iowa.”
Not all disinformation takes place on social media, of course.
This month, the Post and Courier in Charleston, South Carolina, uncovered a plot by the state’s GOP to aggressively boost Vermont Sen. Bernie Sanders in the Feb. 29 Democratic primary because he is viewed as the weakest potential challenger to President Trump.
One of the most egregious displays of disinformation in the past year was a video of a May speech by House Speaker Nancy Pelosi, altered and slowed down to create the illusion that she was intoxicated.
The Washington Post reported that one version, posted by the conservative Facebook page Politics WatchDog, was viewed more than 2 million times within a couple of days of its posting. It was shared more than 45,000 times and drew some 23,000 comments. Experts said the changes to the video showed surprising sophistication – the ability not only to reduce the speed of the video by 25% but also to keep her speech understandable.
Trump attorney Rudolph Giuliani linked to the fake video through a tweet, though that’s since been deleted. Trump tweeted a separate altered Pelosi video from Fox Business, which was edited to combine about 30 seconds of moments when she paused or stumbled over her words during a 20-minute talk. The president falsely said it showed her stammering through a news conference.
The 2016 campaign reinforced what propagandists know: Lies gain strength when they are frequently repeated.
“In studies where participants rated statements on their truthfulness, results showed that repetition increased participants’ perceptions of the truthfulness of false statements, even when participants knew these statements were false,” said a 2019 State Department-sponsored study.
But the nightmare of many is the “deep fake” video, in which specialized software enables manipulators to make a real person appear to say whatever they want him or her to say.
The technique was vividly demonstrated by Claire Wardle, executive director of the group First Draft, in a video for the New York Times in which she initially “appears” as English singer Adele. Wardle said the danger from such fakes might be overblown, but the technology contributes to a climate in which the public cannot trust even the veracity of a video.
First Draft is “a global nonprofit that supports journalists, academics and technologists working to address challenges relating to trust and truth in the digital age.”
As a key part of its battle against disinformation, First Draft trains reporters to ferret out the major falsehoods on the internet, and researching the Twitter account of Tammy Texoma was one of the training exercises.
Many eyes were opened a couple of years ago by a video showing former President Barack Obama “speaking” — except that it was really comedian Jordan Peele.
The messengers with ill intent have come a long way since the faked photo 16 years ago purporting to show Jane Fonda with 2004 Democratic presidential nominee John Kerry at a 1970 anti-war rally.
Ohio hit hard
During the 2016 campaign, Ohio and other traditional battleground states were especially targeted.
More than 6,500 tweets identified by Twitter as coming from Russia contained the word “Ohio,” according to a Dispatch analysis of a compilation of data released by the U.S. Senate Intelligence Committee in 2018.
One from “Finley1589” said, “i love working at the post office in Columbus, Ohio and ripping up absentee ballots that vote for trump,” during the early voting period before the Nov. 8 election.
Another, from ameliebaldwin, said, “HEY FAKE NEWS MEDIA – VOTER FRAUD! Over 800 Non-Citizens In Ohio Were Registered To Vote.”
Perhaps the most visible disinformation in Ohio during the 2016 campaign came from a tweet claiming that thousands of votes for Hillary Clinton were stockpiled in a Columbus warehouse. The Franklin County Board of Elections investigated the claim before declaring it fake. The photo accompanying the story actually was taken in 2015 in Birmingham, England.
Still, the story circulated so widely that Secretary of State Jon Husted, now Ohio’s lieutenant governor, was compelled to issue an official statement knocking down the false claim by “Christian Times Newspaper.”
A sub-genre of 2016 disinformation targeted conservative Christians. One of the most enduring images of the campaign was spread by the Russian-created Army of Jesus. It portrayed Jesus and Satan arm-wrestling, with the devil saying, “If I win Clinton wins.”
Does anybody really believe these messages?
Though the falsehoods stand out now, the Russians were clever in the months before the election, mixing in numerous stories from legitimate Ohio news sources, says New York University’s Tucker. They apparently were sophisticated enough to know that local news sources are regarded by Americans as the most credible, which in turn would make their tweets more believable, he said.
During a presentation at Ohio State University, Tucker said that many of the stories shared were chosen to increase divisiveness among Americans.
“Part of what the Russians were trying to do was to stir up polarization in the country,” Tucker said.
Multiple studies show that the Russians targeted groups that feel disenfranchised, and they centered on wedge issues, such as race, immigration, the influence of Muslims, and guns.
For example, the Russians shared several cleveland.com stories on the controversial shooting of 12-year-old Tamir Rice by a Cleveland police officer, as well as others about violence on and by law enforcement.
Some in the disinformation operation based in St. Petersburg, Russia, posed as black women to gain entrée into African American social networks, Tucker said. He found they generally limited their discussion to such issues as Black Lives Matter until just before the election – when they suddenly began urging blacks not to waste time voting for Clinton.
Sorting it out
Sergey Sanovich, a Russia native who is with Princeton University’s Center for Information Technology Policy, said the KGB’s playbook extends far beyond an election or two. Its intent is to sow long-term distrust, destroying the public’s belief that it can hold its leaders accountable.
After all, he said, it’s worked in the motherland: “Everything Russia did to the (U.S.) in 2016, it did in Russia 10 years before.”
Russian Garry Kasparov knows that well. The former world chess champion, now chairman of the Human Rights Foundation, said during the 2016 campaign: “The point of modern propaganda isn’t only to misinform or push an agenda. It is to exhaust your critical thinking, to annihilate truth.”
Tucker said the increasing belief that fake news is everywhere represents a larger danger than fake news itself.
Americans are especially susceptible right now, said Erik Nisbet, an Ohio State communication professor who recently organized an event with scholars such as Sanovich and Tucker who have studied international disinformation efforts.
“Anxiety and anger increase the likelihood of endorsing false information,” Nisbet said, possibly leading to a condition that psychologists dub informational “learned helplessness.”
He cited a multi-nation study by the University of Cambridge’s Centre for the Future of Democracy showing growing dissatisfaction with democracy itself, especially in older democracies. In the United States, the dissatisfaction was less than 25% in 1995, the first year of the Cambridge survey. Now it tops 50%.
The rich environment for disinformation is a big reason he is taking part in a new cross-disciplinary effort at OSU, which will include the Mershon Center for International Security Studies, to build the resiliency needed to overcome the purveyors of falsehoods.
Another member of that team, communication professor Kelly Garrett, said the worst thing Americans can do is give up.
“Don’t throw up your hands and say, ‘Oh, there’s just so much noise out there we can’t tell what is true,’ because that is self-defeating. That is actually more problematic,” he said.
“We want to be very careful not to become discouraged about our ability to distinguish truth from falsehoods. We actually have a lot of resources for doing it. People are actually pretty good at it most of the time.”