MEXICO CITY/SAN FRANCISCO (Reuters) - Ahead of Mexico’s presidential election on Sunday, Facebook pages criticizing the leftist frontrunner feature posts with thousands of “likes” and no other reactions or comments, suggesting automation, a report on Thursday from the Atlantic Council said.
Many “likes” on the pages attacking Andres Manuel Lopez Obrador, a former Mexico City mayor mounting his third campaign for the presidency, came from Brazil, the Washington-based think tank said. One human “like” came from a user claiming to run a group of social media specialists for hire.
The flurry of social media manipulation as Mexicans prepare to vote highlights how the playbook for information warfare has evolved since the 2016 U.S. presidential election.
Even with ample advance warning, new hires and partnerships, the social networks are failing to thwart more advanced techniques - boding poorly for attempts to keep deliberate misinformation at bay in coming elections around the world.
Propaganda is reaching Mexican voters through several new avenues. Shared fabrications are more often taking the form of videos, images and memes that multiply faster before detection, researchers say. Pages mix real news with fake to buttress credibility.
“You can’t always talk about aliens if you want people to believe aliens exist,” said former FBI agent Clint Watts, author of a book on social media manipulation.
On both Facebook and Twitter, large, easier-to-spot automated networks are giving way to loosely coordinated, smaller networks, said the Atlantic Council’s Ben Nimmo. And the WhatsApp messaging service owned by Facebook has become a prime channel for spreading falsehoods in closed groups that leave the company and authorities in the dark, researchers say.
“The people who make and build computational propaganda have been and probably always will be one step ahead,” said Samuel Woolley, director of the Digital Intelligence Lab at the Institute for the Future.
With Lopez Obrador leading most polls by double digits, the Mexican election is unlikely to turn on social media manipulation, experts say.
But the vote is one of the first big tests since Facebook acknowledged that information from up to 87 million users could have gone to political consultancy Cambridge Analytica, which worked on U.S. President Donald Trump’s campaign.
Facebook has hired more content screeners, funded outside research, and struck deals with fact-checkers from outlets including Agence France-Presse and Verificado, a group backed by Al Jazeera and Mexican publication Animal Politico.
Articles deemed false appear lower in users’ feeds, Facebook says, though it will not provide detail on how effective that is in Mexico. Users are warned before they share the content.
A team of Facebook employees began gearing up for the Mexican election last fall, meeting weekly, said Diego Bassante, who manages the politics and government team in Latin America. “Can someone create a fake account or a troll? Yeah, probably,” he said. “But eventually, they are going to be found and taken down.”
Nonetheless, several pages denounced for spreading fake news remain on Facebook with large followings.
For example, Nacion Unida, which Verificado dubbed one of the most prolific fake news distributors, had more than 895,000 followers this week. Another, Zocalo Virtual, which has spread false claims about a disgraced ex-governor and billionaire Carlos Slim, had more than 1.86 million followers.
Researchers agree it is difficult to determine who is behind the pages. Many describe themselves as concerned citizens. Others imitate news outlets.
Facebook says it has escalating consequences for pages sharing fake news. Repeat offenders lose their ability to make money on the site.
“It can be a challenge to differentiate between legitimate political discussion and inauthentic coordinated dissemination of misinformation,” said Facebook spokesman Tom Reynolds. “We are firmly committed to fighting false news.”
Mexican businessman Carlos Merlo has bragged in interviews of creating fake news sites and managing networks of fake user accounts to promote them, according to the Atlantic Council’s report Thursday. Merlo’s Victory Lab was cited as a possible threat to authentic debate in an internal Facebook threat assessment months ago.
Yet Victory Lab’s own Facebook page, which had attracted thousands of “likes” from users in non-Spanish-speaking countries, remained up until Thursday’s report was published; Facebook said it would remove the page later in the day.
Merlo could not immediately be reached for comment.
Facebook’s WhatsApp has also emerged as a favorite destination for those seeking to shape the political conversation. It is popular even in rural areas and uses little mobile data, said Monika Glowacki, a student at the UK-based Oxford Internet Institute who is researching disinformation in Mexico.
Many rumors spread on WhatsApp contain an element of truth, such as a statistic, but will present it in a misleading context to spark fear or emotion, Glowacki said. Some messages appear aimed at reducing voter turnout, and they come with greater credibility than content viewed on social media, Glowacki said.
“You’re getting it usually from someone you trust,” she said. “It’s something you’re almost more primed to believe in.”
As bot detection improves, operators are increasingly recruiting real users to be their messengers, said Kris Shaffer, senior analyst with research firm New Knowledge. “It’s this phenomenon of information laundering. It starts out as purposeful,” he said, but “becomes people inadvertently sharing things that are false.”
Some users are getting paid for liking, sharing or commenting on posts, according to outside researchers and a Facebook employee who requested anonymity. Paid activity is hard to spot, in part because it can be a legitimate job or align with sincere beliefs.
Facebook declined to comment on its efforts to identify paid activity.
Reporting by Julia Love in Mexico City and Joseph Menn and David Ingram in San Francisco; Editing by Greg Mitchell and Alistair Bell