3 results tagged Media Manipulation

How Russia influences elections in France with an army of bots
https://www.futura-sciences.com/tech/actualites/guerre-futur-russie-influence-elections-france-grace-armee-bots-114424/

  • War
  • Disinformation War
  • Media Manipulation
  • Social Network
  • PsyOps

How Russia influences elections in France with an army of bots

Dr. Chavalarias's study shows that influence campaigns on X seek to weaken the republican front to the benefit of the far right. © Login, Adobe Stock (AI-generated image)

A study by a CNRS researcher reveals the destabilization maneuvers conducted by the Kremlin on social networks to boost the far right in France. Are the narratives pushed by Russia enough to manipulate voters' opinions?

By Sylvain Biget, July 5, 2024, for Futura-Sciences

"The drop of water hollows out the stone not by force, but by falling often." This old KGB motto remains relevant under a President Putin effectively elected for life, who is betting on the long game to nudge elections in Western democracies step by step. The battlefield for Russian interference operations is social media, and this is nothing new. The driving idea is to weaken the European Union and NATO by manipulating public opinion in member countries.

To achieve this, the Kremlin seeks to help elect leaders less hostile to the Russian regime. This is the case of the French far right, and notably the RN, which benefited from a Russian loan for its campaigns and has systematically opposed or abstained whenever it came to supporting the Ukrainian resistance or imposing sanctions on Russia following the invasion of Ukraine.

The tactic has long been known: as early as the 2017 presidential election, hacking operations targeting the email accounts of Emmanuel Macron's campaign team were carried out in an attempt to discredit the candidate. The maneuver was backed by armies of bots running intensive astroturfing campaigns to amplify a narrative favorable to the election of Marine Le Pen. Today, a recent study by CNRS researcher David Chavalarias analyzes the destabilization techniques the Kremlin employs to push voters toward the RN in the snap legislative elections.

The author explains that as early as 2023, during the campaigns for the European elections, targeted ads were bought on Facebook to push messages attacking governments or Western countries' support for the Ukrainian resistance. These campaigns intensified as the elections approached. For example, fake ads recruiting soldiers for the French army to go fight in Ukraine circulated widely. They aimed to amplify Emmanuel Macron's remarks on sending troops to Ukraine in order to make him look like a warmonger.

A graphical representation of the political communities. Left-wing parties appear in shades of red; right-wing parties in blue. The filaments represent flows of exchanges and shared posts between X accounts. © CNRS


An overestimated capacity for harm?

But it is on X that the maneuvers of Russian bots and propagators of pro-Russian narratives are most abundant. Since 2016, an army of fake accounts has regularly pushed concepts built on divisive keywords. This is how the term "islamo-gauchiste" ("islamo-leftist"), which came out of nowhere, was propelled, picked up by government ministers, and made the subject of public debate.

More broadly, according to the researcher, the Kremlin has employed three strategies: pushing the normalization of the far right, ensuring that the parties of the republican front can no longer cooperate, and above all fostering voters' rejection of moderate parties so that they vote for the far right. To achieve this, Russian bots also ride the news cycle, notably the war in Gaza, by circulating horrific images of the Hamas attack of October 7.

This serves to stoke Islamophobia, to point a finger at the antisemitism of certain parties, and to intensify radical rhetoric between the far right and the far left. To heighten the effect, bots promoting political Islamism were created. These maneuvers, which are still ongoing, aim to maximize voter support for the RN in the second round of the legislative elections this Sunday.

While the researcher's analysis is solid, it remains to be seen whether this Kremlin strategy actually weighs on voters' choices. There is as yet no conclusive study of the concrete effects of this kind of manipulation on the public. Without underestimating the power of these operations, the rise of the Bolloré group's network of conservative media could, on its own, explain a great deal.

July 7, 2024 at 4:05:53 PM GMT+2

Pentagon ran secret anti-vax campaign to incite fear of China vaccines
https://www.reuters.com/investigates/special-report/usa-covid-propaganda/

  • Disinformation War
  • PsyOps
  • Social Network
  • Media Manipulation

Pentagon ran secret anti-vax campaign to undermine China during pandemic

The U.S. military launched a clandestine program amid the COVID crisis to discredit China’s Sinovac inoculation – payback for Beijing’s efforts to blame Washington for the pandemic. One target: the Filipino public. Health experts say the gambit was indefensible and put innocent lives at risk.

By CHRIS BING and JOEL SCHECTMAN Filed June 14, 2024, 9:45 a.m. GMT

At the height of the COVID-19 pandemic, the U.S. military launched a secret campaign to counter what it perceived as China’s growing influence in the Philippines, a nation hit especially hard by the deadly virus.

The clandestine operation has not been previously reported. It aimed to sow doubt about the safety and efficacy of vaccines and other life-saving aid that was being supplied by China, a Reuters investigation found. Through phony internet accounts meant to impersonate Filipinos, the military’s propaganda efforts morphed into an anti-vax campaign. Social media posts decried the quality of face masks, test kits and the first vaccine that would become available in the Philippines – China’s Sinovac inoculation.

Reuters identified at least 300 accounts on X, formerly Twitter, that matched descriptions shared by former U.S. military officials familiar with the Philippines operation. Almost all were created in the summer of 2020 and centered on the slogan #Chinaangvirus – Tagalog for China is the virus.

“COVID came from China and the VACCINE also came from China, don’t trust China!” one typical tweet from July 2020 read in Tagalog. The words were next to a photo of a syringe beside a Chinese flag and a soaring chart of infections. Another post read: “From China – PPE, Face Mask, Vaccine: FAKE. But the Coronavirus is real.”

After Reuters asked X about the accounts, the social media company removed the profiles, determining they were part of a coordinated bot campaign based on activity patterns and internal data.

The U.S. military’s anti-vax effort began in the spring of 2020 and expanded beyond Southeast Asia before it was terminated in mid-2021, Reuters determined. Tailoring the propaganda campaign to local audiences across Central Asia and the Middle East, the Pentagon used a combination of fake social media accounts on multiple platforms to spread fear of China’s vaccines among Muslims at a time when the virus was killing tens of thousands of people each day. A key part of the strategy: amplify the disputed contention that, because vaccines sometimes contain pork gelatin, China’s shots could be considered forbidden under Islamic law.

The military program started under former President Donald Trump and continued months into Joe Biden’s presidency, Reuters found – even after alarmed social media executives warned the new administration that the Pentagon had been trafficking in COVID misinformation. The Biden White House issued an edict in spring 2021 banning the anti-vax effort, which also disparaged vaccines produced by other rivals, and the Pentagon initiated an internal review, Reuters found.

“I don’t think it’s defensible. I’m extremely dismayed, disappointed and disillusioned to hear that the U.S. government would do that.”

Daniel Lucey, infectious disease specialist at Dartmouth’s Geisel School of Medicine.

The U.S. military is prohibited from targeting Americans with propaganda, and Reuters found no evidence the Pentagon’s influence operation did so.

Spokespeople for Trump and Biden did not respond to requests for comment about the clandestine program.

A senior Defense Department official acknowledged the U.S. military engaged in secret propaganda to disparage China’s vaccine in the developing world, but the official declined to provide details.

A Pentagon spokeswoman said the U.S. military “uses a variety of platforms, including social media, to counter those malign influence attacks aimed at the U.S., allies, and partners.” She also noted that China had started a “disinformation campaign to falsely blame the United States for the spread of COVID-19.”

In an email, the Chinese Ministry of Foreign Affairs said that it has long maintained the U.S. government manipulates social media and spreads misinformation.

Manila’s embassy in Washington did not respond to Reuters inquiries, including whether it had been aware of the Pentagon operation. A spokesperson for the Philippines Department of Health, however, said the “findings by Reuters deserve to be investigated and heard by the appropriate authorities of the involved countries.” Some aid workers in the Philippines, when told of the U.S. military propaganda effort by Reuters, expressed outrage.

Briefed on the Pentagon’s secret anti-vax campaign by Reuters, some American public health experts also condemned the program, saying it put civilians in jeopardy for potential geopolitical gain. An operation meant to win hearts and minds endangered lives, they said.

“I don’t think it’s defensible,” said Daniel Lucey, an infectious disease specialist at Dartmouth’s Geisel School of Medicine. “I’m extremely dismayed, disappointed and disillusioned to hear that the U.S. government would do that,” said Lucey, a former military physician who assisted in the response to the 2001 anthrax attacks.

The effort to stoke fear about Chinese inoculations risked undermining overall public trust in government health initiatives, including U.S.-made vaccines that became available later, Lucey and others said. Although the Chinese vaccines were found to be less effective than the American-led shots by Pfizer and Moderna, all were approved by the World Health Organization. Sinovac did not respond to a Reuters request for comment.

Academic research published recently has shown that, when individuals develop skepticism toward a single vaccine, those doubts often lead to uncertainty about other inoculations. Lucey and other health experts say they saw such a scenario play out in Pakistan, where the Central Intelligence Agency used a fake hepatitis vaccination program in Abbottabad as cover to hunt for Osama bin Laden, the terrorist mastermind behind the attacks of September 11, 2001. Discovery of the ruse led to a backlash against an unrelated polio vaccination campaign, including attacks on healthcare workers, contributing to the reemergence of the deadly disease in the country.

“It should have been in our interest to get as much vaccine in people’s arms as possible,” said Greg Treverton, former chairman of the U.S. National Intelligence Council, which coordinates the analysis and strategy of Washington’s many spy agencies. What the Pentagon did, Treverton said, “crosses a line.”

‘We were desperate’

Together, the phony accounts used by the military had tens of thousands of followers during the program. Reuters could not determine how widely the anti-vax material and other Pentagon-planted disinformation was viewed, or to what extent the posts may have caused COVID deaths by dissuading people from getting vaccinated.

In the wake of the U.S. propaganda efforts, however, then-Philippines President Rodrigo Duterte had grown so dismayed by how few Filipinos were willing to be inoculated that he threatened to arrest people who refused vaccinations.

“You choose, vaccine or I will have you jailed,” a masked Duterte said in a televised address in June 2021. “There is a crisis in this country … I’m just exasperated by Filipinos not heeding the government.”

When he addressed the vaccination issue, the Philippines had among the worst inoculation rates in Southeast Asia. Only 2.1 million of its 114 million citizens were fully vaccinated – far short of the government’s target of 70 million. By the time Duterte spoke, COVID cases exceeded 1.3 million, and almost 24,000 Filipinos had died from the virus. The difficulty in vaccinating the population contributed to the worst death rate in the region.

A spokesperson for Duterte did not make the former president available for an interview.

Some Filipino healthcare professionals and former officials contacted by Reuters were shocked by the U.S. anti-vax effort, which they say exploited an already vulnerable citizenry. Public concerns about a Dengue fever vaccine, rolled out in the Philippines in 2016, had led to broad skepticism toward inoculations overall, said Lulu Bravo, executive director of the Philippine Foundation for Vaccination. The Pentagon campaign preyed on those fears.

“Why did you do it when people were dying? We were desperate,” said Dr. Nina Castillo-Carandang, a former adviser to the World Health Organization and Philippines government during the pandemic. “We don’t have our own vaccine capacity,” she noted, and the U.S. propaganda effort “contributed even more salt into the wound.”

The campaign also reinforced what one former health secretary called a longstanding suspicion of China, most recently because of aggressive behavior by Beijing in disputed areas of the South China Sea. Filipinos were unwilling to trust China’s Sinovac, which first became available in the country in March 2021, said Esperanza Cabral, who served as health secretary under President Gloria Macapagal Arroyo. Cabral said she had been unaware of the U.S. military’s secret operation.

“I’m sure that there are lots of people who died from COVID who did not need to die from COVID,” she said.

To implement the anti-vax campaign, the Defense Department overrode strong objections from top U.S. diplomats in Southeast Asia at the time, Reuters found. Sources involved in its planning and execution say the Pentagon, which ran the program through the military’s psychological operations center in Tampa, Florida, disregarded the collateral impact that such propaganda may have on innocent Filipinos.

“We weren’t looking at this from a public health perspective,” said a senior military officer involved in the program. “We were looking at how we could drag China through the mud.”

A new disinformation war

In uncovering the secret U.S. military operation, Reuters interviewed more than two dozen current and former U.S officials, military contractors, social media analysts and academic researchers. Reporters also reviewed Facebook, X and Instagram posts, technical data and documents about a set of fake social media accounts used by the U.S. military. Some were active for more than five years.

Clandestine psychological operations are among the government’s most highly sensitive programs. Knowledge of their existence is limited to a small group of people within U.S. intelligence and military agencies. Such programs are treated with special caution because their exposure could damage foreign alliances or escalate conflict with rivals.

Over the last decade, some U.S. national security officials have pushed for a return to the kind of aggressive clandestine propaganda operations against rivals that the United States wielded during the Cold War. Following the 2016 U.S. presidential election, in which Russia used a combination of hacks and leaks to influence voters, the calls to fight back grew louder inside Washington.

In 2019, Trump authorized the Central Intelligence Agency to launch a clandestine campaign on Chinese social media aimed at turning public opinion in China against its government, Reuters reported in March. As part of that effort, a small group of operatives used bogus online identities to spread disparaging narratives about Xi Jinping’s government.

COVID-19 galvanized the drive to wage psychological operations against China. One former senior Pentagon leader described the pandemic as a “bolt of energy” that finally ignited the long delayed counteroffensive against China’s influence war.

The Pentagon’s anti-vax propaganda came in response to China’s own efforts to spread false information about the origins of COVID. The virus first emerged in China in late 2019. But in March 2020, Chinese government officials claimed without evidence that the virus may have been first brought to China by an American service member who participated in an international military sports competition in Wuhan the previous year. Chinese officials also suggested that the virus may have originated in a U.S. Army research facility at Fort Detrick, Maryland. There’s no evidence for that assertion.

Mirroring Beijing’s public statements, Chinese intelligence operatives set up networks of fake social media accounts to promote the Fort Detrick conspiracy, according to a U.S. Justice Department complaint.

China’s messaging got Washington’s attention. Trump subsequently coined the term “China virus” as a response to Beijing’s accusation that the U.S. military exported COVID to Wuhan.

“That was false. And rather than having an argument, I said, ‘I have to call it where it came from,’” Trump said in a March 2020 news conference. “It did come from China.”

China’s Foreign Ministry said in an email that it opposed “actions to politicize the origins question and stigmatize China.” The ministry had no comment about the Justice Department’s complaint.

Beijing didn’t limit its global influence efforts to propaganda. It announced an ambitious COVID assistance program, which included sending masks, ventilators and its own vaccines – still being tested at the time – to struggling countries. In May 2020, Xi announced that the vaccine China was developing would be made available as a “global public good,” and would ensure “vaccine accessibility and affordability in developing countries.” Sinovac was the primary vaccine available in the Philippines for about a year until U.S.-made vaccines became more widely available there in early 2022.

Washington’s plan, called Operation Warp Speed, was different. It favored inoculating Americans first, and it placed no restrictions on what pharmaceutical companies could charge developing countries for the remaining vaccines not used by the United States. The deal allowed the companies to “play hardball” with developing countries, forcing them to accept high prices, said Lawrence Gostin, a professor of medicine at Georgetown University who has worked with the World Health Organization.

The deal “sucked most of the supply out of the global market,” Gostin said. “The United States took a very determined America First approach.”

To Washington’s alarm, China’s offers of assistance were tilting the geopolitical playing field across the developing world, including in the Philippines, where the government faced upwards of 100,000 infections in the early months of the pandemic.

The U.S. relationship with Manila had grown tense after the 2016 election of the bombastic Duterte. A staunch critic of the United States, he had threatened to cancel a key pact that allows the U.S. military to maintain legal jurisdiction over American troops stationed in the country.

Duterte said in a July 2020 speech he had made “a plea” to Xi that the Philippines be at the front of the line as China rolled out vaccines. He vowed in the same speech that the Philippines would no longer challenge Beijing’s aggressive expansion in the South China Sea, upending a key security understanding Manila had long held with Washington.

“China is claiming it. We are claiming it. China has the arms, we do not have it.” Duterte said. “So, it is simple as that.”

Days later, China’s foreign minister announced Beijing would grant Duterte’s plea for priority access to the vaccine, as part of a “new highlight in bilateral relations.”

China’s growing influence fueled efforts by U.S. military leaders to launch the secret propaganda operation Reuters uncovered.

“We didn’t do a good job sharing vaccines with partners,” a senior U.S. military officer directly involved in the campaign in Southeast Asia told Reuters. “So what was left to us was to throw shade on China’s.”

Military trumped diplomats

U.S. military leaders feared that China’s COVID diplomacy and propaganda could draw other Southeast Asian countries, such as Cambodia and Malaysia, closer to Beijing, furthering its regional ambitions.

A senior U.S. military commander responsible for Southeast Asia, Special Operations Command Pacific General Jonathan Braga, pressed his bosses in Washington to fight back in the so-called information space, according to three former Pentagon officials.

The commander initially wanted to punch back at Beijing in Southeast Asia. The goal: to ensure the region understood the origin of COVID while promoting skepticism toward what were then still-untested vaccines offered by a country that they said had lied continually since the start of the pandemic.

A spokesperson for Special Operations Command declined to comment.

At least six senior State Department officials responsible for the region objected to this approach. A health crisis was the wrong time to instill fear or anger through a psychological operation, or psyop, they argued during Zoom calls with the Pentagon.

“We’re stooping lower than the Chinese and we should not be doing that,” said a former senior State Department official for the region who fought against the military operation.

While the Pentagon saw Washington’s rapidly diminishing influence in the Philippines as a call to action, the withering partnership led American diplomats to plead for caution.

“The relationship is hanging from a thread,” another former senior U.S. diplomat recounted. “Is this the moment you want to do a psyop in the Philippines? Is it worth the risk?”

In the past, such opposition from the State Department might have proved fatal to the program. Previously in peacetime, the Pentagon needed approval of embassy officials before conducting psychological operations in a country, often hamstringing commanders seeking to quickly respond to Beijing’s messaging, three former Pentagon officials told Reuters.

But in 2019, before COVID surfaced in full force, then-Secretary of Defense Mark Esper signed a secret order that later paved the way for the launch of the U.S. military propaganda campaign. The order elevated the Pentagon’s competition with China and Russia to the priority of active combat, enabling commanders to sidestep the State Department when conducting psyops against those adversaries. The Pentagon spending bill passed by Congress that year also explicitly authorized the military to conduct clandestine influence operations against other countries, even “outside of areas of active hostilities.”

Esper, through a spokesperson, declined to comment. A State Department spokesperson referred questions to the Pentagon.

U.S. propaganda machine

In spring 2020, special-ops commander Braga turned to a cadre of psychological-warfare soldiers and contractors in Tampa to counter Beijing’s COVID efforts. Colleagues say Braga was a longtime advocate of increasing the use of propaganda operations in global competition. In trailers and squat buildings at a facility on Tampa’s MacDill Air Force Base, U.S. military personnel and contractors would use anonymous accounts on X, Facebook and other social media to spread what became an anti-vax message. The facility remains the Pentagon’s clandestine propaganda factory.

Psychological warfare has played a role in U.S. military operations for more than a hundred years, although it has changed in style and substance over time. So-called psyopers were best known following World War II for their supporting role in combat missions across Vietnam, Korea and Kuwait, often dropping leaflets to confuse the enemy or encourage their surrender.

After the al Qaeda attacks of 2001, the United States was fighting a borderless, shadowy enemy, and the Pentagon began to wage a more ambitious kind of psychological combat previously associated only with the CIA. The Pentagon set up front news outlets, paid off prominent local figures, and sometimes funded television soap operas in order to turn local populations against militant groups or Iranian-backed militias, former national security officials told Reuters.

Unlike earlier psyop missions, which sought specific tactical advantage on the battlefield, the post-9/11 operations hoped to create broader change in public opinion across entire regions.

By 2010, the military began using social media tools, leveraging phony accounts to spread messages of sympathetic local voices – themselves often secretly paid by the United States government. As time passed, a growing web of military and intelligence contractors built online news websites to pump U.S.-approved narratives into foreign countries. Today, the military employs a sprawling ecosystem of social media influencers, front groups and covertly placed digital advertisements to influence overseas audiences, according to current and former military officials.

China’s efforts to gain geopolitical clout from the pandemic gave Braga justification to launch the propaganda campaign that Reuters uncovered, sources said.

Pork in the vaccine?

By summer 2020, the military’s propaganda campaign moved into new territory and darker messaging, ultimately drawing the attention of social media executives.

In regions beyond Southeast Asia, senior officers in the U.S. Central Command, which oversees military operations across the Middle East and Central Asia, launched their own version of the COVID psyop, three former military officials told Reuters.

Although the Chinese vaccines were still months from release, controversy roiled the Muslim world over whether the vaccines contained pork gelatin and could be considered “haram,” or forbidden under Islamic law. Sinovac has said that the vaccine was “manufactured free of porcine materials.” Many Islamic religious authorities maintained that even if the vaccines did contain pork gelatin, they were still permissible since the treatments were being used to save human life.

The Pentagon campaign sought to intensify fears about injecting a pig derivative. As part of an internal investigation at X, the social media company used IP addresses and browser data to identify more than 150 phony accounts that were operated from Tampa by U.S. Central Command and its contractors, according to an internal X document reviewed by Reuters.

“Can you trust China, which tries to hide that its vaccine contains pork gelatin and distributes it in Central Asia and other Muslim countries where many people consider such a drug haram?” read an April 2021 tweet sent from a military-controlled account identified by X.

The Pentagon also covertly spread its messages on Facebook and Instagram, alarming executives at parent company Meta who had long been tracking the military accounts, according to former military officials.

One military-created meme targeting Central Asia showed a pig made out of syringes, according to two people who viewed the image. Reuters found similar posts that traced back to U.S. Central Command. One shows a Chinese flag as a curtain separating Muslim women in hijabs and pigs stuck with vaccine syringes. In the center is a man with syringes; on his back is the word “China.” It targeted Central Asia, including Kazakhstan, Kyrgyzstan and Uzbekistan, a country that distributed tens of millions of doses of China’s vaccines and participated in human trials. Translated into English, the X post reads: “China distributes a vaccine made of pork gelatin.”

Facebook executives had first approached the Pentagon in the summer of 2020, warning the military that Facebook workers had easily identified the military’s phony accounts, according to three former U.S. officials and another person familiar with the matter. The government, Facebook argued, was violating Facebook’s policies by operating the bogus accounts and by spreading COVID misinformation.

The military argued that many of its fake accounts were being used for counterterrorism and asked Facebook not to take down the content, according to two people familiar with the exchange. The Pentagon pledged to stop spreading COVID-related propaganda, and some of the accounts continued to remain active on Facebook.

Nonetheless, the anti-vax campaign continued into 2021 as Biden took office.

Angered that military officials had ignored their warning, Facebook officials arranged a Zoom meeting with Biden’s new National Security Council shortly after the inauguration, Reuters learned. The discussion quickly became tense.

“It was terrible,” said a senior administration official describing the reaction after learning of the campaign’s pig-related posts. “I was shocked. The administration was pro-vaccine and our concern was this could affect vaccine hesitancy, especially in developing countries.”

By spring 2021, the National Security Council ordered the military to stop all anti-vaccine messaging. “We were told we needed to be pro-vaccine, pro all vaccines,” said a former senior military officer who helped oversee the program. Even so, Reuters found some anti-vax posts that continued through April and other deceptive COVID-related messaging that extended into that summer. Reuters could not determine why the campaign didn’t end immediately with the NSC’s order. In response to questions from Reuters, the NSC declined to comment.

The senior Defense Department official said that those complaints led to an internal review in late 2021, which uncovered the anti-vaccine operation. The probe also turned up other social and political messaging that was “many, many leagues away” from any acceptable military objective. The official would not elaborate.

The review intensified the following year, the official said, after a group of academic researchers at Stanford University flagged some of the same accounts as pro-Western bots in a public report. The high-level Pentagon review was first reported by the Washington Post, which also reported that the military used fake social media accounts to counter China’s message that COVID came from the United States. But the Post report did not reveal that the program evolved into the anti-vax propaganda campaign uncovered by Reuters.

The senior defense official said the Pentagon has rescinded parts of Esper’s 2019 order that allowed military commanders to bypass the approval of U.S. ambassadors when waging psychological operations. The rules now mandate that military commanders work closely with U.S. diplomats in the country where they seek to have an impact. The policy also restricts psychological operations aimed at “broad population messaging,” such as those used to promote vaccine hesitancy during COVID.

The Pentagon’s audit concluded that the military’s primary contractor handling the campaign, General Dynamics IT, had employed sloppy tradecraft, taking inadequate steps to hide the origin of the fake accounts, said a person with direct knowledge of the review. The review also found that military leaders didn’t maintain enough control over their psyop contractors, the person said.

A spokesperson for General Dynamics IT declined to comment.

Nevertheless, the Pentagon’s clandestine propaganda efforts are set to continue. In an unclassified strategy document last year, top Pentagon generals wrote that the U.S. military could undermine adversaries such as China and Russia using “disinformation spread across social media, false narratives disguised as news, and similar subversive activities [to] weaken societal trust by undermining the foundations of government.”

And in February, the contractor that worked on the anti-vax campaign – General Dynamics IT – won a $493 million contract. Its mission: to continue providing clandestine influence services for the military.

Permalink
June 15, 2024 at 1:13:18 PM GMT+2

How French political movements play with information-manipulation techniques on social networks
https://www.nextinpact.com/article/70132/comment-mouvements-politiques-francais-jouent-techniques-manipulation-information-sur-reseaux-sociaux

  • Politics
  • Societal Collapse
  • Media Manipulation
  • Big Data

How French political movements play with information-manipulation techniques on social networks

Without restraint

By Mathilde Saliou, Friday, October 14, 2022 at 4:07 PM

During the 2022 election campaign, Éric Zemmour's team stood out for its industrial-scale use of information-manipulation techniques on social networks. But a look back over the past few years shows that virtually every French political force uses, or has used, tactics to distort online discourse.

On April 10, 2022, at 8 p.m., Emmanuel Macron and Marine Le Pen emerged as the winners of the first round of the French presidential election, with 27.85% and 23.15% of the vote respectively. Behind them came Jean-Luc Mélenchon with 21.95% and Éric Zemmour with 7.07%. That last score contrasts sharply with the far-right candidate's overrepresentation on social networks during the campaign.

One possible explanation for the polemicist's failure on election day began to be documented as early as February 2022, when Le Monde revealed how the Reconquête team was using techniques to distort discourse on social networks: online, much of the enthusiasm for the former Figaro columnist was faked.

A report by the Institute for Strategic Dialogue (ISD) details the mechanics of the phenomenon: starting in January 2021, 18 months before the vote, the group Les Amis d'Éric Zemmour published petitions that were then used on Twitter and Facebook to try to influence media coverage in the candidate's favor. Known as astroturfing, the practice violates the terms of service of the major social networks, since it constitutes so-called "inauthentic" activity.

While Reconquête stood out for industrializing the practice, it is far from the only movement to resort to these manipulation tactics, which are hard for ordinary internet users to spot.

Astroturfing, an old technique

Just as politicians' online presence is nothing new (the Front National was the first French party to go online, launching its website in 1996), astroturfing long predates the advent of social networks. The term comes from an English play on words, contrasting AstroTurf, a brand of artificial grass, with "grassroots," which literally refers to the roots of a lawn.

In communications, it consists of fabricating a movement from scratch to make outside observers believe in genuine civic enthusiasm. "It existed with flyposting and mass mailings to voters," notes researcher Nicolas Vanderbiest, a specialist in online influence phenomena. "Digital technology and social networks have simply made these operations much easier to carry out."

Indeed, the researcher has spent the past decade dissecting online noise to assess how much of it is genuine. "Around 2016-2017, I saw my field of study, Twitter, mutate toward an extremely polemical and activist tenor."

In France, the first signs were the explosion of the hashtag #TelAvivSurSeine, which quickly prompted articles in Le Monde, 20 Minutes and L'Express even though the topic was being pushed by only a very small number of pro-Palestinian activists.

A year later, it was the burkini issue that the far right pushed until it was being debated on television sets. Each time, the logic is the same: a handful of accounts tweet heavily about a specific topic, one or more newspapers pick it up and lend it a form of credibility, and the controversy swells until it fills the entire conversation.

A common online disinformation tactic…

Since then, the practice has spread. Around the time of Donald Trump's election and the Cambridge Analytica scandal, cases of foreign interference raised alarm. Two days before the first round of the 2017 presidential election, the publication on WikiLeaks of 9 GB of data from the hack of En Marche! raised fears of Russian manipulation; during that campaign, the American alt-right also tried to sway the debate in favor of Marine Le Pen.

But automatically blaming foreign groups for these distortions would be deluding ourselves. The mathematician David Chavalarias, author of Toxic Data and creator of the Politoscope, which analyzes French political activity on Twitter, observed from the very start of the 2017 campaign an amplification of anti-Macron and anti-Mélenchon discourse orchestrated by French accounts to promote far-right themes. The phenomenon even reached the UMP: the sudden appearance of the hashtag #AliJuppé, tweeted overwhelmingly by the right and far right, served to destabilize the UMP primary and boost François Fillon's candidacy.

Distorting the conversation, whether during election periods or outside them, "everyone does it," sighs Nicolas Vanderbiest. "And it's because everyone does it that each side can defend itself by saying, 'if I don't do it, I won't survive.'"

Indeed, in 2018 the hashtag #BenallaGate generated more tweets in a few days than #BalanceTonPorc had on its own, activity Nicolas Vanderbiest identified as artificially inflated: some accounts close to the Front National were tweeting up to 1,000 messages per hour, which suggests automated practices.
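The tweet-rate heuristic mentioned here (an account posting around 1,000 messages an hour is unlikely to be human) can be sketched in a few lines. The account names, toy data, and threshold below are illustrative assumptions, not Vanderbiest's actual methodology:

```python
from collections import Counter
from datetime import datetime

# Hourly post rate beyond which an account is unlikely to be human.
# The 1,000/hour figure echoes the article; real analyses combine more signals.
AUTOMATION_THRESHOLD = 1000

def flag_high_frequency_accounts(posts, threshold=AUTOMATION_THRESHOLD):
    """Return accounts whose post count in any single hour exceeds `threshold`.

    `posts` is an iterable of (account_name, datetime) pairs.
    """
    per_hour = Counter()
    for account, ts in posts:
        # Bucket each post by (account, hour-of-posting).
        hour = ts.replace(minute=0, second=0, microsecond=0)
        per_hour[(account, hour)] += 1
    return sorted({acct for (acct, _), count in per_hour.items()
                   if count > threshold})

# Toy data (invented): one account posting 1,200 times within a single hour,
# another posting only a handful of times.
suspect = [("bot_account", datetime(2018, 7, 20, 14, i % 60, (i // 60) % 60))
           for i in range(1200)]
normal = [("human_account", datetime(2018, 7, 20, 14, 5, 0))] * 5
print(flag_high_frequency_accounts(suspect + normal))  # → ['bot_account']
```

A raw rate threshold alone yields both false negatives (slow, coordinated bots) and false positives (shared or scheduled accounts), which is why researchers pair it with network and content signals.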

In 2019, Le Monde and Mediapart showed how En Marche activists multiplied fake accounts to boost the visibility of information that suited them or to harass "targets." In 2021, Slate examined the aggressive practices of the Printemps Républicain. In 2022, La France Insoumise activists organized raids to push their candidate up the trending topics…

… whose boundaries are debated

So much so that, for social media specialist Fabrice Epelboin, when a small number of activists organize to make a topic that serves them visible, it is no longer even astroturfing: "it has become a classic activist tactic."

In his view, practices that consist of picking up and amplifying a message, as long as they are not assisted by bots, fake accounts and/or people paid to amplify the noise, as in the Avisa Partners affair, are a new mode of political action rather than a perversion of how social networks are used. And the two masters of the craft, because they are so skilled at "using divisive rhetoric to propel themselves into media conversations, are Éric Zemmour and Sandrine Rousseau," the entrepreneur says.

Except that the propensity to divide does not come from political forces alone; it is baked into the architecture of social platforms. "They are built to favor sensational, promotional, divisive content," David Chavalarias points out. Over five years, this has produced a "clear polarization of exchanges, to the point of observing, in 2022, one far-right pole and another around the radical left," whereas all political tendencies were represented in relative balance in 2017.

Moreover, the platforms' terms of service are clear: Twitter prohibits using the network "in a manner intended to (…) artificially amplify information, and engaging in behavior that manipulates or disrupts users' experience." At Meta, authenticity is declared "the cornerstone of our community," and users are forbidden to "misrepresent their identity on Facebook, use fake accounts, or artificially boost the popularity of their content."

Digital architecture as a source of division

And yet Zoé Fourel, research coordinator and co-author of the ISD report, notes that the platforms did not react at all to these rule violations by activists close to Éric Zemmour. The vast majority of the tweets and posts that propelled the candidate into Twitter's trending topics came not from a crowd of engaged citizens but from a tiny number of profiles on the network; in one case out of ten, it was the campaign's digital strategy chief, Samuel Lafont, who himself tweeted the content designed to attract the attention of the public and the media.

And for a while it worked: in September 2021, by Acrimed's count, the editorialist, repeatedly convicted of incitement to racial hatred, was cited 4,167 times in the French press, or 139 times a day. In January 2022, polls were crediting him with 14% of the vote.

What the media and internet users need to understand, David Chavalarias argues, is the extent to which "social platforms have a structural effect on social interactions themselves: not only do they take your data, they also shape the discussion and the interactions."

This ends up creating full-fledged influence strategies, the researcher explains: "promoting ideas as divisive as the great replacement theory or the existence of islamo-leftism forces users to position themselves in a camp: for or against." Researchers such as Jen Schradie have also shown that platforms tend to favor conservative ideas, which an internal Twitter report confirmed in late 2021. The architecture of our digital arenas, David Chavalarias concludes, "has the effect of simplifying the political business, making it bipolar."

What can be done about phenomena whose scale we are only beginning to grasp? In political discourse, one response could come from parties and activists themselves. Joe Biden's team in the United States, and then Emmanuel Macron's in France, adopted new strategies in recent presidential campaigns: communicating on Twitter only about positive (that is, barely or non-divisive) elements and their candidates' actions. In doing so, they step away from the clash machine the social network has institutionalized.

"The platforms should start by enforcing their own rules," adds Zoé Fourel, who also argues for opening up their data to make the work of researchers and outside audits easier.

"Adding labels to activity suspected of being illegitimate could also help users find their bearings. Not to mention the need for cooperation between platforms: when a campaign is run on Twitter, it also echoes on Facebook and elsewhere online." The researcher suggests replicating the partnerships that already exist to combat certain extremist content.

Permalink
June 24, 2023 at 2:40:39 PM GMT+2