October 9, 2020
(Bloomberg) — Facebook Inc.’s announcement Thursday that it had shut down a network of phony accounts attempting to influence the November elections reinforced fears that people are working to use social media to undermine U.S. democracy. But unlike 2016, when most attention focused on campaigns associated with the Russian government, this year’s wave of disinformation is coming largely from President Donald Trump and his American supporters, a growing body of research shows, raising new challenges for social media companies.
Facebook tied the campaign it exposed Thursday to Rally Forge, a U.S. marketing firm hired by Turning Point USA, a conservative youth organization that has already been linked to other attempts to manipulate online political debate, and an advocacy organization called the Inclusive Conservation Group. The social network removed 200 Facebook accounts, 55 pages, and 76 Instagram accounts. It also banned Rally Forge.
The fake accounts, created to look like real Facebook users, posted commentary parroting Trump administration talking points on the pages of news organizations. One of the campaign’s inauthentic commenters posted in September, “Mail-in ballots are such a horrible idea. A dangerous amount of ballots will be lost or won’t arrive in time. The smartest thing to do is to vote in person.” The group also spent about $973,000 on Facebook ads.
“Deceptive campaigns like these raise a particularly complex challenge by blurring the line between healthy public debate and manipulation,” Facebook said in a statement. “We know these threats extend beyond our platform and no single organization can tackle them alone.”
Many of the messages from the phony users expressed arguments that Trump himself has been making through his Twitter Inc. account, in public comments and in advertisements on various platforms. During last week’s presidential debate, Trump called mail-in ballots a “disaster,” despite a long history of voting by mail that has never been credibly tied to a heightened risk of fraud. Both Trump and Vice President Mike Pence have refused to commit to the peaceful transfer of power if they lose the election.
A study published last week by Harvard’s Berkman Klein Center showed how the president promoted disinformation largely by manipulating the mass media into repeating his claims. “There’s no question that the leading force of this disinformation campaign has been President Trump himself,” said Yochai Benkler, the study’s main author, a stance that has been echoed by numerous other experts in the field.
Benkler thinks the fixation on social media has overstated its relative importance compared to traditional news media in spreading disinformation. “We always focus on Twitter because that’s the new shiny object in the media ecosystem, but when we actually looked at the last six months, Trump uses press briefings and news releases every bit as much as he uses Twitter,” he said.
Samantha Zager, a spokeswoman for the Trump campaign, said liberals have been the ones spreading disinformation. “For months, Democrats have attempted to upend the way Americans vote just before the election, and together with the mainstream media have tried to sweep those changes under the rug while pointing blame at President Trump and Republicans who have been fighting back against chaos for a free and fair election,” she said.
Facebook, as well as Twitter and Alphabet Inc.-owned YouTube, has spent significant energy attempting to harden itself against foreign manipulation. The companies have rooted out botnets tied to Russia, Iran, North Korea and China, and shut down thousands of inauthentic accounts tied to those governments. There is evidence of continuing state-sponsored attempts to interfere in the election through social media.
The action against Rally Forge fits into a similar mode, targeting not what users say but the ways they manipulate social media platforms to spread their messages. Cutting down on professional attempts to undermine the election through social media should be the top priority of companies like Facebook, said John Sebes, the chief technology officer at the Trust the Vote Project. “Absent professional manipulation you have a ceiling of natural organic effect,” he said. Much of the most sophisticated manipulation still comes from foreign entities, according to Sebes.
Other experts are increasingly worried about the risks posed by information spread not by paid trolls but by actual people. This poses its own challenges. Taking action against an elected official who is making false claims, for instance, requires Facebook to weigh in on the content of the message itself, rather than the manner in which it is spread. The company has faced steady criticism over its reluctance to take down posts from the president that seem to violate its policies. This week, Facebook removed a post in which Trump falsely claimed the flu was more deadly than the coronavirus, but only after the message had accumulated hundreds of thousands of likes and been shared more than 50,000 times. Twitter kept it online but included additional information.
Facebook also recently announced it will stop running political advertisements indefinitely after the Nov. 3 election, in an effort to prevent politicians from claiming victory before the race has been called. The company added that it would make clear on its website and apps that no winner has been declared until news outlets like the Associated Press and Reuters determine an outcome.
Another challenge is that tech platforms are being used by domestic extremists to recruit new members and plan violent events. On Thursday, the Federal Bureau of Investigation arrested a group of men who were allegedly plotting to kidnap the Democratic governor of Michigan. The men are suspected of planning the kidnapping on Facebook groups, and also posting anti-government messages on YouTube.
Troll farms tied to the Russian government have little official support within the U.S., said Joshua Tucker, a professor at New York University, allowing Facebook to move against them with little risk of public pushback. But the shape of disinformation heading into next month’s election has become far thornier.
“If you are pretending to be a Navy SEAL in Tennessee putting all sorts of misinformation out onto the internet but it turns out you’re actually a Russian troll, it’s easy,” said Tucker. “If you’re actually a Navy SEAL sitting in Tennessee, then it’s a lot harder for Facebook to justify that.”
©2020 Bloomberg L.P.