Twitter’s and Google’s moves to limit election-related disinformation on their platforms have drawn cautiously positive reviews from industry professionals, who warn that campaigns will still need to be hyper-vigilant in the run-up to Election Day.
“It’s a positive development for platforms to take disinformation about the democratic process seriously and I hope their intention matches up with their ability to enforce policies that prove effective,” said Jiore Craig, a VP at GQR who specializes in combating disinformation. “We won’t be out of the woods with these changes alone and everyone should be prepared to vet information before sharing and direct friends and family to trusted sources.
“Platforms should hold actors that break the rules accountable with meaningful restrictions or removal, so they aren’t in a position to spread false or confusing information on or after Election Day.”
Late last week, Twitter said it would “label or remove false or misleading information” that undermines the process or disputes the outcome, including “unverified information about election rigging, ballot tampering, vote tallying, or certification of election results.”
But even before the announcement, practitioners noted that Twitter was already struggling to moderate an unending flow of organic posts containing misinformation, a criticism also leveled at Facebook when, earlier this month, it announced restrictions on ad placement in the week before Election Day.
Democratic digital consultant Jenna Lowenstein pointed to Rep. Steve Scalise (R-La.) in August sharing an altered video of activist Ady Barkan on Twitter. The company labeled the tweet as “manipulated,” and it’s since been taken down.
But having misinformation or disinformation spread by an official as high-ranking as the House GOP minority whip shows how difficult it is to combat, Lowenstein said Wednesday during a virtual panel on disinformation hosted by Georgetown University.
During the 2020 primary, Lowenstein managed Sen. Cory Booker’s (D-N.J.) campaign and previously worked as the digital director for Hillary Clinton’s 2016 run. She noted that candidates of color and female candidates — both of whom are running in record numbers this cycle — face the greatest challenges when it comes to combating disinformation.
“There’s something to tease out about who do we believe false information about and where society is already willing to believe in half-truths,” said Lowenstein. “I think candidates of color and women candidates are more susceptible to that.”
Google, meanwhile, said Thursday that it was taking steps to limit the spread of disinformation by changing how its Autocomplete feature handles election-related searches.
“[W]e will remove predictions that could be interpreted as claims for or against any candidate or political party,” Google said in a blog post. “We will also remove predictions that could be interpreted as a claim about participation in the election—like statements about voting methods, requirements, or the status of voting locations—or the integrity or legitimacy of electoral processes, such as the security of the election.”
But here Lowenstein noted another weakness in Google’s approach: it doesn’t address how organic search results are displayed. She pointed to Google searches asking whether Democratic vice presidential nominee Kamala Harris is disgraced actor Jussie Smollett’s aunt. (She is not.)
The misinformation emerged after Booker and Harris, together with Sen. Tim Scott (R-S.C.), co-sponsored an anti-lynching bill that the Senate passed by voice vote around the same time the actor reported he was attacked in Chicago by two men who yelled racial slurs. (Smollett faces felony charges related to the January 2019 incident.)
The connection made online between Harris and Smollett was something that “just didn’t happen with Cory Booker,” said Lowenstein.
“I do think there’s a willingness to believe misinformation or disinformation about female candidates,” she said.
As for working with the platforms to stem the tide of disinformation, Lowenstein said that based on her experience in 2016 with the Clinton campaign, candidates should be skeptical of the companies’ willingness to police disinformation.
“To our error, we misattributed a lot of what was happening to the reality of being a high-profile woman, being a first,” she said.
But as the campaign moved into the fall, the team “started to see really pervasive disinformation about how and when to vote … content pushed to Clinton supporters about voting on Monday and voting by text message.”
In that context, Lowenstein said they sought help from the companies but realized they “didn’t have reliable partners on the platform.”