4 results tagged Disinformation War

How Russia influences elections in France with an army of bots
https://www.futura-sciences.com/tech/actualites/guerre-futur-russie-influence-elections-france-grace-armee-bots-114424/

  • War
  • Disinformation War
  • Media Manipulation
  • Social Network
  • PsyOps

How Russia influences elections in France with an army of bots

Dr. Chavalarias's study shows that influence campaigns on X seek to weaken the republican front to the benefit of the far right. © Login, Adobe Stock (AI-generated image)

A study by a CNRS researcher reveals the destabilization maneuvers carried out by the Kremlin on social networks to boost the far right in France. Are the narratives pushed by Russia enough to manipulate voters' opinions?

By Sylvain Biget, July 5, 2024, for Futura-science

"The drop of water hollows the stone not by force, but by falling often." This old KGB maxim remains relevant under a president Putin elected for life, who is betting on the long term to influence, step by step, elections in Western democracies. The battlefield for Russian interference maneuvers is social networks, and this is nothing new. The driving idea is to weaken the European Union and NATO by manipulating public opinion in member countries.

To achieve this, the Kremlin seeks to help elect leaders who are less hostile to the Russian regime. This is the case of the French far right, and notably of the RN, which benefited from a Russian loan for its campaigns and has systematically opposed or abstained whenever it came to supporting the Ukrainian resistance or imposing sanctions on Russia following the invasion of Ukraine.

The tactic has long been known: as early as the 2017 presidential elections, hacking operations targeting the email accounts of Emmanuel Macron's campaign team were carried out in an attempt to discredit the candidate. The maneuver was backed by armies of bots running intensive astroturfing campaigns to amplify a narrative favorable to Marine Le Pen's election. Today, a recent study by CNRS researcher David Chavalarias analyzes the destabilization techniques used by the Kremlin to manipulate voters' opinions so that they vote for the RN in the snap legislative elections.

The author explains that as early as 2023, during the campaigns for the European elections, targeted ads were purchased on Facebook to push messages attacking governments or Western countries' support for the Ukrainian resistance. These campaigns intensified as the elections approached. For example, fake ads recruiting soldiers for the French army to go fight in Ukraine circulated widely. They were designed to play on Emmanuel Macron's remarks about sending troops to Ukraine in order to make him look like a warmonger.

A graphical representation of the political communities. Shades of red indicate parties of the left; parties of the right are shown in blue. The filaments represent flows of exchanges and shares of posts between X accounts. © CNRS


An overestimated power to do harm?

But it is on the X network that the maneuvers of Russian bots and propagators of pro-Russian narratives are most abundant. Since 2016, an army of fake accounts has been regularly pushing concepts built on divisive keywords. This is how the term "islamo-gauchiste" ("islamo-leftist"), which came out of nowhere, was propelled into prominence, picked up by government ministers, and became the subject of debate.

More broadly, according to the researcher, the Kremlin has employed three strategies: pushing the normalization of the far right, ensuring that the parties of the republican front can no longer get along, and above all fostering voters' rejection of moderate parties so that they vote for the far right. To achieve this, Russian bots also ride the news cycle, notably the war in Gaza, by circulating horrific images of the October 7 Hamas attack.

This is a way of stoking Islamophobia, pointing the finger at the antisemitism of certain parties, and escalating radical rhetoric between the far right and the far left. To heighten the effect, bots promoting political Islamism were created. These maneuvers, which are still ongoing, continue to aim at maximizing voter support for the RN in the second round of this Sunday's legislative elections.

While the researcher's analysis is solid, it remains to be seen whether this Kremlin strategy really weighs on voters' choices. For now, there is no conclusive study of the concrete effects of this kind of manipulation on the public. Without underestimating the power of these operations, the rise of the Bolloré group's network of conservative media can, on its own, explain a great deal.

Permalink
July 7, 2024 at 4:05:53 PM GMT+2

Deluge of ‘pink slime’ websites threaten to drown out truth with fake news in US election | The Guardian
https://www.theguardian.com/us-news/article/2024/jun/20/fake-news-websites-us-election

  • Politics
  • Artificial intelligence
  • Disinformation War
  • Fakeness

Deluge of ‘pink slime’ websites threaten to drown out truth with fake news in US election

US sites pushing misinformation are proliferating, aiming to look like reliable sources as local newspapers close down

Eric Berger Thu 20 Jun 2024 12.00 CEST

Political groups on the right and left are using fake news websites designed to look like reliable sources of information to fill the void left by the demise of local newspapers, raising fears of the impact that they might have during the United States’ bitterly fought 2024 election.

Some media experts are concerned that the so-called pink slime websites, often funded domestically, could prove at least as harmful to political discourse and voters’ faith in media and democracy as foreign disinformation efforts in the 2016 and 2020 presidential elections.

According to a recent report from NewsGuard, a company that aims to counter misinformation by studying and rating news websites, the websites are so prolific that “the odds are now better than 50-50 that if you see a news website purporting to cover local news, it’s fake.”

NewsGuard estimates that there are a staggering 1,265 such fake local news websites in the US – 4% more than the 1,213 daily newspapers still operating in the country.

“Actors on both sides of the political spectrum” feel “that what they are doing isn’t bad because all media is really biased against their side or that they know actors on the other side are using these tactics and so they feel they need to,” said Matt Skibinski, general manager of NewsGuard, which determined that such sites now outnumber legitimate local news organizations. “It’s definitely contributed to partisanship and the erosion of trust in media; it’s also a symptom of those things.”

Pink slime websites, named after a meat byproduct, started at least as early as 2004 when Brian Timpone, a former television reporter who described himself as a “biased guy” and a Republican, started funding websites featuring names of cities, towns and regions like the Philly Leader and the South Alabama Times.

Timpone’s company, Metric Media, now operates more than 1,000 such websites and his private equity company receives funding from conservative political action committees, according to NewsGuard.

The Leader recently ran a story with the headline, “Rep Evans votes to count illegal aliens towards seats in Congress.”

In actuality, Representative Dwight Evans, a Democrat, did not vote to start counting undocumented immigrants in the 2030 census but rather against legislation that would have changed the way the country has conducted apportionment since 1790.

That sort of story is “standard practice for these outlets”, according to Tim Franklin, who leads Northwestern University’s Local News Initiative, which researches the industry.

“They will take something that maybe has just a morsel of truth to it and then twist it with their own partisan or ideological spin,” Franklin said. “They also tend to do it on issues like immigration or hot-button topics that they think will elicit an emotional response.”

A story published this month on the NW Arkansas News site had a headline on the front page that reported that the unemployment rate in 2021 in Madison county was 5.1% – even though there is much more recent data available. In April 2024, the local unemployment rate was 2.5%.

“Another tactic that we have seen across many of this category of sites is taking a news story that happened at some point and presenting it as if it just happened now, in a way that is misleading,” Skibinski said.

The left has also created websites designed to look like legitimate news organizations but actually shaped by Democratic supporters.

The liberal Courier Newsroom network operates websites in Arizona, Florida, Iowa, Michigan and Nevada, among other states, that – like the conservative pink slime sites – have innocuous-sounding names like the Copper Courier and Up North News. The Courier has run stories like “Gov Ducey Is Now the Most Unpopular Governor in America,” referring to Doug Ducey, the former Republican Arizona governor.

“In contrast, coverage of Democrats, including US President Joe Biden, Democratic Arizona Gov Katie Hobbs, and US Sen Mark Kelly of Arizona, is nearly always laudatory,” NewsGuard stated in a report about Courier coverage.

Tara McGowan, the Democratic strategist who founded the Courier Newsroom, has received funding from liberal donors like Reid Hoffman and George Soros, as well as groups associated with political action committees, according to NewsGuard.

“There are pink slime operations on both the right and the left. To me, the key is disclosure and transparency about ownership,” said Franklin.

In a statement, a spokesperson for the Courier said comparisons between its operations and rightwing pink slime groups were unfair and criticized NewsGuard’s methodology in comparing the two.

“Courier publishes award-winning, factual local news by talented journalists who live in the communities we cover, and our reporting is often cited by legacy media outlets. This is in stark contrast to the pink slime networks that pretend to have a local presence but crank out low-quality fake news with no bylines and no accountability. Courier is proudly transparent about our pro-democracy values, and we carry on the respected American tradition of advocacy journalism,” the spokesperson said.

While both the left and the right have invested in the pink slime websites, there are differences in the owners’ approaches, according to Skibinski.

The right-wing networks have created more sites “that are probably getting less attention per site, and on the left, there is a smaller number of sites, but they are more strategic about getting attention to those sites on Facebook and elsewhere”, Skibinski said. “I don’t know that we can quantify whether one is more impactful than the other.”

Artificial intelligence could also help site operators quickly generate stories and create fake images.

“The technology underlying artificial intelligence is now becoming more accessible to malign actors,” said Kathleen Hall Jamieson, a University of Pennsylvania communications professor and director of the Annenberg Public Policy Center, which publishes Factcheck.org. “The capacity to create false images is very high, but also there is a capacity to detect the images that is emerging very rapidly. The question is, will it emerge rapidly with enough capacity?”

Still, it’s not clear whether these websites are effective. Stanford University reported in a 2023 study that engagement with pink slime websites was “relatively low” and found little evidence that living “in a news desert made people more likely to consume pink slime”.

The Philly Leader and the NW Arkansas News each link only to Facebook accounts on their websites and have fewer than 450 followers apiece. Meanwhile, the Copper Courier and Up North News have accounts on all the major platforms and a total of about 150,000 followers on Facebook.

Franklin said he thinks that a lot of people don’t actually click links on social media posts to visit the website.

“The goal of some of these operators is not to get traffic directly to their site, but it’s to go viral on social media,” he said.

Republican lawmakers and leaders of the conservative news sites the Daily Wire and the Federalist have also filed a lawsuit and launched investigations accusing NewsGuard of helping the federal government censor right-leaning media. The defense department hired the company strictly to counter “disinformation efforts by Russian, Chinese and Iranian government-linked operations targeting Americans and our allies”, Gordon Crovitz, the former Wall Street Journal publisher who co-founded NewsGuard, told the Hill in response to a House oversight committee investigation. “We look forward to clarifying the misunderstanding by the committee about our work for the Defense Department.”

To counter the flood of misinformation, social media companies must take a more active role in monitoring such content, according to Franklin and Skibinski.

“The biggest solution to this kind of site would be for the social media platforms to take more responsibility in terms of showing context to the user about sources that could be their own context. It could be data from third parties, like what we do,” said Skibinski.

Franklin would like to see a national media literacy campaign. States around the country have passed laws requiring such education in schools.

Franklin also hopes that legitimate local news could rebound. The MacArthur Foundation and other donors last year pledged $500m to help local outlets.

“I actually have more optimism now than I had a few years ago,” Franklin said. “We’re in the midst of historic changes in how people consume news and how it’s produced and how it’s distributed and how it’s paid for, but I think there’s still demand for local news, and that’s kind of where it all starts.”

Permalink
June 25, 2024 at 8:43:55 PM GMT+2

Social networks: a factory of political hostility?
https://theconversation.com/reseaux-sociaux-la-fabrique-de-lhostilite-politique-230458

  • Social Network
  • Disinformation War
  • Police State
  • Fakeness

Social networks: a factory of political hostility?

Published: June 17, 2024, 3:21 p.m. CEST

In recent years, social networks such as Facebook and X (formerly Twitter) have become the target of numerous accusations: vectors for spreading "fake news" at scale, instruments used by Russia and China to destabilize democracies, machines for capturing our attention and selling it to merchants of all kinds, theaters of ever more personalized and manipulative ad targeting, and so on. Witness the success of documentaries and essays on the human cost, deemed considerable, of social networks, such as The Social Dilemma on Netflix.

One of these discourses in particular holds digital platforms and their algorithms responsible for amplifying online hostility and political polarization in society. With anonymous online discussions, some claim, anyone could become a troll – an aggressive, cynical person devoid of compassion – or become "radicalized".

Recent work in quantitative social science and scientific psychology, however, offers some correctives to this excessively pessimistic narrative.

The importance of sociopolitical context and psychology

To begin with, several studies suggest that while individuals regularly experience political discussions that turn contentious, this incivility is partly tied to psychological and socioeconomic factors that predate digital platforms.

In a large-scale cross-cultural study, we surveyed more than 15,000 people via representative panels in thirty very diverse nations (France, Iraq, Thailand, Pakistan, etc.) about their experiences of online conversations. Our first finding is that it is in the most economically unequal and least democratic countries that individuals are most often the target of hostile invective from their fellow citizens on social networks (as in Turkey or Brazil). This phenomenon evidently stems from the frustrations generated by these societies, which are more repressive of individual aspirations.

Our study also shows that the individuals who engage most in online hostility are those most inclined to seek social status through risk-taking. This personality trait corresponds to an orientation toward dominance, that is, toward bending others to one's will (including through intimidation). In our cross-cultural data, we observe that individuals with such dominance traits are numerous in unequal and undemocratic countries. Independent analyses also show that dominance is a key element of the psychology of political conflict, since it likewise predicts more sharing of "fake news" mocking or insulting political opponents online, and a greater appetite for offline political conflict, among other things.

Replicating an earlier study, we also find that these individuals motivated by status-seeking through risk-taking, who most readily admit to behaving in a hostile manner online, are also those more likely to interact aggressively or toxically in face-to-face discussions (the correlation between online and offline hostility is strong, on the order of β = 0.77).

In short, online political hostility appears largely to be the product of particular personalities, made aggressive by the frustrations engendered by unequal social contexts, and activating our tendency to see the world in terms of "us" vs. "them". Politically, reducing wealth disparities between groups and making our institutions more democratic are probably unavoidable objectives if we want to bring about a more harmonious Internet (and civil society).

Social networks: prisms that exaggerate ambient hostility

While our study places online political hostility in a broader context, it does not thereby deny platforms any role in producing political polarization.

Social networks allow a piece of content to be distributed identically to millions of people (unlike verbal communication, a site of inevitable distortions). As such, they can misinform or anger millions of people at very low cost. This is true whether the false or toxic information is created intentionally to generate clicks or is the unintended product of the political biases of a given political group.


If exchanges on social networks often lack civility, it is also because of the possibility they offer of interacting with anonymous, depersonalized strangers. This experience, unique to the Internet era, reduces the sense of personal responsibility, as well as empathy toward interlocutors whom we no longer see as persons but as interchangeable members of political "tribes".

Recent analyses also remind us that social networks – like journalism, in many respects – operate less as a mirror than as a distorting prism of the diversity of opinions in society.

Indignant and potentially insulting political posts are often the work of people more determined to speak out and more radical than average – whether to signal their commitments, vent anger, proselytize, and so on. Even when they represent a fairly small proportion of what is written on the networks, these posts are promoted by algorithms programmed to highlight content capable of attracting attention and triggering responses, which divisive messages do.

Conversely, the majority of users, more moderate and less peremptory, are reluctant to wade into political discussions that rarely reward good-faith argument and often degenerate into "shitstorms" (i.e., outpourings of hate).

These selection and perception biases produce the misleading impression that radical and hostile convictions are both more widespread and more morally tolerated than they really are.

When exposure to difference irritates

That said, the use of social networks does seem capable of increasing political hostility and radicalism through at least one mechanism: exposure to caricatural, aggressive versions of opposing political positions, which irritate.

Contrary to popular belief, most of our virtual connections do not typically take the form of "echo chambers" isolating us in bubbles of totally homogeneous political ideas.

Although some networks are indeed built that way (4chan or certain subreddits), the largest platforms, Facebook (3 billion users) and X (550 million), typically scroll a certain diversity of opinions before our eyes – frequently greater, in any case, than that of our friendships. Are you still regularly in touch with middle-school friends who have "gone Front National"? Probably not, but you are more likely to read their Facebook posts.

This exposure to ideological otherness is desirable in theory, since it should reveal the blind spots of our political knowledge and convictions, remind us of our common humanity, and thus make us both humbler and more respectful of one another. Unfortunately, the way most people express their political convictions – on the networks as at the coffee machine – is fairly devoid of nuance and pedagogy. It tends to reduce opposing positions to demonized caricatures, and seeks less to persuade the other side than to galvanize those who already agree, or to win favor with political friends.

Drawing on experimental studies deployed on Twitter and interviews with Democratic and Republican activists conducted with his team, the sociologist Chris Bail issues a warning in his book Le prisme des réseaux sociaux. According to him, repeated exposure to unconvincing, mocking content produced by our political enemies can paradoxically reinforce partisans in their preexisting positions and identities, rather than bringing them intellectually and emotionally closer to one another.

However, this relationship between social network use and political polarization may depend heavily on exposure time, and it does not appear in all the samples studied. Thus, studies exploring the effects of quitting Facebook and Instagram do not find that using these social media detectably polarizes users' political opinions.

Let us always remember that discourses pointing to threats against society enjoy a considerable competitive advantage in the marketplace of ideas and conversations, owing to their attractiveness to our minds. The question of the links between social networks, hostility and political polarization should therefore be approached with nuance, avoiding the symmetrical pitfalls of blissful optimism and collective panic.

Permalink
June 17, 2024 at 10:18:17 PM GMT+2

Pentagon ran secret anti-vax campaign to incite fear of China vaccines
https://www.reuters.com/investigates/special-report/usa-covid-propaganda/

  • Disinformation War
  • PsyOps
  • Social Network
  • Media Manipulation

Pentagon ran secret anti-vax campaign to undermine China during pandemic

The U.S. military launched a clandestine program amid the COVID crisis to discredit China’s Sinovac inoculation – payback for Beijing’s efforts to blame Washington for the pandemic. One target: the Filipino public. Health experts say the gambit was indefensible and put innocent lives at risk.

By CHRIS BING and JOEL SCHECTMAN Filed June 14, 2024, 9:45 a.m. GMT

At the height of the COVID-19 pandemic, the U.S. military launched a secret campaign to counter what it perceived as China’s growing influence in the Philippines, a nation hit especially hard by the deadly virus.

The clandestine operation has not been previously reported. It aimed to sow doubt about the safety and efficacy of vaccines and other life-saving aid that was being supplied by China, a Reuters investigation found. Through phony internet accounts meant to impersonate Filipinos, the military’s propaganda efforts morphed into an anti-vax campaign. Social media posts decried the quality of face masks, test kits and the first vaccine that would become available in the Philippines – China’s Sinovac inoculation.

Reuters identified at least 300 accounts on X, formerly Twitter, that matched descriptions shared by former U.S. military officials familiar with the Philippines operation. Almost all were created in the summer of 2020 and centered on the slogan #Chinaangvirus – Tagalog for China is the virus.

“COVID came from China and the VACCINE also came from China, don’t trust China!” one typical tweet from July 2020 read in Tagalog. The words were next to a photo of a syringe beside a Chinese flag and a soaring chart of infections. Another post read: “From China – PPE, Face Mask, Vaccine: FAKE. But the Coronavirus is real.”

After Reuters asked X about the accounts, the social media company removed the profiles, determining they were part of a coordinated bot campaign based on activity patterns and internal data.

The U.S. military’s anti-vax effort began in the spring of 2020 and expanded beyond Southeast Asia before it was terminated in mid-2021, Reuters determined. Tailoring the propaganda campaign to local audiences across Central Asia and the Middle East, the Pentagon used a combination of fake social media accounts on multiple platforms to spread fear of China’s vaccines among Muslims at a time when the virus was killing tens of thousands of people each day. A key part of the strategy: amplify the disputed contention that, because vaccines sometimes contain pork gelatin, China’s shots could be considered forbidden under Islamic law.

The military program started under former President Donald Trump and continued months into Joe Biden’s presidency, Reuters found – even after alarmed social media executives warned the new administration that the Pentagon had been trafficking in COVID misinformation. The Biden White House issued an edict in spring 2021 banning the anti-vax effort, which also disparaged vaccines produced by other rivals, and the Pentagon initiated an internal review, Reuters found.

“I don’t think it’s defensible. I’m extremely dismayed, disappointed and disillusioned to hear that the U.S. government would do that.”

Daniel Lucey, infectious disease specialist at Dartmouth’s Geisel School of Medicine.

The U.S. military is prohibited from targeting Americans with propaganda, and Reuters found no evidence the Pentagon’s influence operation did so.

Spokespeople for Trump and Biden did not respond to requests for comment about the clandestine program.

A senior Defense Department official acknowledged the U.S. military engaged in secret propaganda to disparage China’s vaccine in the developing world, but the official declined to provide details.

A Pentagon spokeswoman said the U.S. military “uses a variety of platforms, including social media, to counter those malign influence attacks aimed at the U.S., allies, and partners.” She also noted that China had started a “disinformation campaign to falsely blame the United States for the spread of COVID-19.”

In an email, the Chinese Ministry of Foreign Affairs said that it has long maintained the U.S. government manipulates social media and spreads misinformation.

Manila’s embassy in Washington did not respond to Reuters inquiries, including whether it had been aware of the Pentagon operation. A spokesperson for the Philippines Department of Health, however, said the “findings by Reuters deserve to be investigated and heard by the appropriate authorities of the involved countries.” Some aid workers in the Philippines, when told of the U.S. military propaganda effort by Reuters, expressed outrage.

Briefed on the Pentagon’s secret anti-vax campaign by Reuters, some American public health experts also condemned the program, saying it put civilians in jeopardy for potential geopolitical gain. An operation meant to win hearts and minds endangered lives, they said.

“I don’t think it’s defensible,” said Daniel Lucey, an infectious disease specialist at Dartmouth’s Geisel School of Medicine. “I’m extremely dismayed, disappointed and disillusioned to hear that the U.S. government would do that,” said Lucey, a former military physician who assisted in the response to the 2001 anthrax attacks.

The effort to stoke fear about Chinese inoculations risked undermining overall public trust in government health initiatives, including U.S.-made vaccines that became available later, Lucey and others said. Although the Chinese vaccines were found to be less effective than the American-led shots by Pfizer and Moderna, all were approved by the World Health Organization. Sinovac did not respond to a Reuters request for comment.

Academic research published recently has shown that, when individuals develop skepticism toward a single vaccine, those doubts often lead to uncertainty about other inoculations. Lucey and other health experts say they saw such a scenario play out in Pakistan, where the Central Intelligence Agency used a fake hepatitis vaccination program in Abbottabad as cover to hunt for Osama bin Laden, the terrorist mastermind behind the attacks of September 11, 2001. Discovery of the ruse led to a backlash against an unrelated polio vaccination campaign, including attacks on healthcare workers, contributing to the reemergence of the deadly disease in the country.

“It should have been in our interest to get as much vaccine in people’s arms as possible,” said Greg Treverton, former chairman of the U.S. National Intelligence Council, which coordinates the analysis and strategy of Washington’s many spy agencies. What the Pentagon did, Treverton said, “crosses a line.”

‘We were desperate’

Together, the phony accounts used by the military had tens of thousands of followers during the program. Reuters could not determine how widely the anti-vax material and other Pentagon-planted disinformation was viewed, or to what extent the posts may have caused COVID deaths by dissuading people from getting vaccinated.

In the wake of the U.S. propaganda efforts, however, then-Philippines President Rodrigo Duterte had grown so dismayed by how few Filipinos were willing to be inoculated that he threatened to arrest people who refused vaccinations.

“You choose, vaccine or I will have you jailed,” a masked Duterte said in a televised address in June 2021. “There is a crisis in this country … I’m just exasperated by Filipinos not heeding the government.”

When he addressed the vaccination issue, the Philippines had among the worst inoculation rates in Southeast Asia. Only 2.1 million of its 114 million citizens were fully vaccinated – far short of the government’s target of 70 million. By the time Duterte spoke, COVID cases exceeded 1.3 million, and almost 24,000 Filipinos had died from the virus. The difficulty in vaccinating the population contributed to the worst death rate in the region.

A spokesperson for Duterte did not make the former president available for an interview.

Some Filipino healthcare professionals and former officials contacted by Reuters were shocked by the U.S. anti-vax effort, which they say exploited an already vulnerable citizenry. Public concerns about a dengue fever vaccine, rolled out in the Philippines in 2016, had led to broad skepticism toward inoculations overall, said Lulu Bravo, executive director of the Philippine Foundation for Vaccination. The Pentagon campaign preyed on those fears.

“Why did you do it when people were dying? We were desperate,” said Dr. Nina Castillo-Carandang, a former adviser to the World Health Organization and Philippines government during the pandemic. “We don’t have our own vaccine capacity,” she noted, and the U.S. propaganda effort “contributed even more salt into the wound.”

The campaign also reinforced what one former health secretary called a longstanding suspicion of China, most recently because of aggressive behavior by Beijing in disputed areas of the South China Sea. Filipinos were unwilling to trust China’s Sinovac, which first became available in the country in March 2021, said Esperanza Cabral, who served as health secretary under President Gloria Macapagal Arroyo. Cabral said she had been unaware of the U.S. military’s secret operation.

“I’m sure that there are lots of people who died from COVID who did not need to die from COVID,” she said.

To implement the anti-vax campaign, the Defense Department overrode strong objections from top U.S. diplomats in Southeast Asia at the time, Reuters found. Sources involved in its planning and execution say the Pentagon, which ran the program through the military’s psychological operations center in Tampa, Florida, disregarded the collateral impact that such propaganda may have on innocent Filipinos.

“We weren’t looking at this from a public health perspective,” said a senior military officer involved in the program. “We were looking at how we could drag China through the mud.”

A new disinformation war

In uncovering the secret U.S. military operation, Reuters interviewed more than two dozen current and former U.S. officials, military contractors, social media analysts and academic researchers. Reporters also reviewed Facebook, X and Instagram posts, technical data and documents about a set of fake social media accounts used by the U.S. military. Some were active for more than five years.

Clandestine psychological operations are among the government’s most highly sensitive programs. Knowledge of their existence is limited to a small group of people within U.S. intelligence and military agencies. Such programs are treated with special caution because their exposure could damage foreign alliances or escalate conflict with rivals.

Over the last decade, some U.S. national security officials have pushed for a return to the kind of aggressive clandestine propaganda operations against rivals that the United States wielded during the Cold War. Following the 2016 U.S. presidential election, in which Russia used a combination of hacks and leaks to influence voters, the calls to fight back grew louder inside Washington.

In 2019, Trump authorized the Central Intelligence Agency to launch a clandestine campaign on Chinese social media aimed at turning public opinion in China against its government, Reuters reported in March. As part of that effort, a small group of operatives used bogus online identities to spread disparaging narratives about Xi Jinping’s government.

COVID-19 galvanized the drive to wage psychological operations against China. One former senior Pentagon leader described the pandemic as a “bolt of energy” that finally ignited the long delayed counteroffensive against China’s influence war.

The Pentagon’s anti-vax propaganda came in response to China’s own efforts to spread false information about the origins of COVID. The virus first emerged in China in late 2019. But in March 2020, Chinese government officials claimed without evidence that the virus may have been first brought to China by an American service member who participated in an international military sports competition in Wuhan the previous year. Chinese officials also suggested that the virus may have originated in a U.S. Army research facility at Fort Detrick, Maryland. There’s no evidence for that assertion.

Mirroring Beijing’s public statements, Chinese intelligence operatives set up networks of fake social media accounts to promote the Fort Detrick conspiracy, according to a U.S. Justice Department complaint.

China’s messaging got Washington’s attention. Trump subsequently coined the term “China virus” as a response to Beijing’s accusation that the U.S. military exported COVID to Wuhan.

“That was false. And rather than having an argument, I said, ‘I have to call it where it came from,’” Trump said in a March 2020 news conference. “It did come from China.”

China’s Foreign Ministry said in an email that it opposed “actions to politicize the origins question and stigmatize China.” The ministry had no comment about the Justice Department’s complaint.

Beijing didn’t limit its global influence efforts to propaganda. It announced an ambitious COVID assistance program, which included sending masks, ventilators and its own vaccines – still being tested at the time – to struggling countries. In May 2020, Xi announced that the vaccine China was developing would be made available as a “global public good,” and would ensure “vaccine accessibility and affordability in developing countries.” Sinovac was the primary vaccine available in the Philippines for about a year until U.S.-made vaccines became more widely available there in early 2022.

Washington’s plan, called Operation Warp Speed, was different. It favored inoculating Americans first, and it placed no restrictions on what pharmaceutical companies could charge developing countries for the remaining vaccines not used by the United States. The deal allowed the companies to “play hardball” with developing countries, forcing them to accept high prices, said Lawrence Gostin, a professor of medicine at Georgetown University who has worked with the World Health Organization.

The deal “sucked most of the supply out of the global market,” Gostin said. “The United States took a very determined America First approach.”

To Washington’s alarm, China’s offers of assistance were tilting the geopolitical playing field across the developing world, including in the Philippines, where the government faced upwards of 100,000 infections in the early months of the pandemic.

The U.S. relationship with Manila had grown tense after the 2016 election of the bombastic Duterte. A staunch critic of the United States, he had threatened to cancel a key pact that allows the U.S. military to maintain legal jurisdiction over American troops stationed in the country.

Duterte said in a July 2020 speech he had made “a plea” to Xi that the Philippines be at the front of the line as China rolled out vaccines. He vowed in the same speech that the Philippines would no longer challenge Beijing’s aggressive expansion in the South China Sea, upending a key security understanding Manila had long held with Washington.

“China is claiming it. We are claiming it. China has the arms, we do not have it.” Duterte said. “So, it is simple as that.”

Days later, China’s foreign minister announced Beijing would grant Duterte’s plea for priority access to the vaccine, as part of a “new highlight in bilateral relations.”

China’s growing influence fueled efforts by U.S. military leaders to launch the secret propaganda operation Reuters uncovered.

“We didn’t do a good job sharing vaccines with partners,” a senior U.S. military officer directly involved in the campaign in Southeast Asia told Reuters. “So what was left to us was to throw shade on China’s.”

Military trumped diplomats

U.S. military leaders feared that China’s COVID diplomacy and propaganda could draw other Southeast Asian countries, such as Cambodia and Malaysia, closer to Beijing, furthering its regional ambitions.

A senior U.S. military commander responsible for Southeast Asia, Special Operations Command Pacific General Jonathan Braga, pressed his bosses in Washington to fight back in the so-called information space, according to three former Pentagon officials.

The commander initially wanted to punch back at Beijing in Southeast Asia. The goal: to ensure the region understood the origin of COVID while promoting skepticism toward what were then still-untested vaccines offered by a country that they said had lied continually since the start of the pandemic.

A spokesperson for Special Operations Command declined to comment.

At least six senior State Department officials responsible for the region objected to this approach. A health crisis was the wrong time to instill fear or anger through a psychological operation, or psyop, they argued during Zoom calls with the Pentagon.

“We’re stooping lower than the Chinese and we should not be doing that,” said a former senior State Department official for the region who fought against the military operation.

While the Pentagon saw Washington’s rapidly diminishing influence in the Philippines as a call to action, the withering partnership led American diplomats to plead for caution.

“The relationship is hanging from a thread,” another former senior U.S. diplomat recounted. “Is this the moment you want to do a psyop in the Philippines? Is it worth the risk?”

In the past, such opposition from the State Department might have proved fatal to the program. Previously in peacetime, the Pentagon needed approval of embassy officials before conducting psychological operations in a country, often hamstringing commanders seeking to quickly respond to Beijing’s messaging, three former Pentagon officials told Reuters.

But in 2019, before COVID surfaced in full force, then-Secretary of Defense Mark Esper signed a secret order that later paved the way for the launch of the U.S. military propaganda campaign. The order elevated the Pentagon’s competition with China and Russia to the priority of active combat, enabling commanders to sidestep the State Department when conducting psyops against those adversaries. The Pentagon spending bill passed by Congress that year also explicitly authorized the military to conduct clandestine influence operations against other countries, even “outside of areas of active hostilities.”

Esper, through a spokesperson, declined to comment. A State Department spokesperson referred questions to the Pentagon.

U.S. propaganda machine

In spring 2020, special-ops commander Braga turned to a cadre of psychological-warfare soldiers and contractors in Tampa to counter Beijing’s COVID efforts. Colleagues say Braga was a longtime advocate of increasing the use of propaganda operations in global competition. In trailers and squat buildings at a facility on Tampa’s MacDill Air Force Base, U.S. military personnel and contractors would use anonymous accounts on X, Facebook and other social media to spread what became an anti-vax message. The facility remains the Pentagon’s clandestine propaganda factory.

Psychological warfare has played a role in U.S. military operations for more than a hundred years, although it has changed in style and substance over time. So-called psyopers were best known following World War II for their supporting role in combat missions across Vietnam, Korea and Kuwait, often dropping leaflets to confuse the enemy or encourage their surrender.

After the al Qaeda attacks of 2001, the United States was fighting a borderless, shadowy enemy, and the Pentagon began to wage a more ambitious kind of psychological combat previously associated only with the CIA. The Pentagon set up front news outlets, paid off prominent local figures, and sometimes funded television soap operas in order to turn local populations against militant groups or Iranian-backed militias, former national security officials told Reuters.

Unlike earlier psyop missions, which sought specific tactical advantage on the battlefield, the post-9/11 operations hoped to create broader change in public opinion across entire regions.

By 2010, the military began using social media tools, leveraging phony accounts to spread messages of sympathetic local voices – themselves often secretly paid by the United States government. As time passed, a growing web of military and intelligence contractors built online news websites to pump U.S.-approved narratives into foreign countries. Today, the military employs a sprawling ecosystem of social media influencers, front groups and covertly placed digital advertisements to influence overseas audiences, according to current and former military officials.

China’s efforts to gain geopolitical clout from the pandemic gave Braga justification to launch the propaganda campaign that Reuters uncovered, sources said.

Pork in the vaccine?

By summer 2020, the military’s propaganda campaign moved into new territory and darker messaging, ultimately drawing the attention of social media executives.

In regions beyond Southeast Asia, senior officers in the U.S. Central Command, which oversees military operations across the Middle East and Central Asia, launched their own version of the COVID psyop, three former military officials told Reuters.

Although the Chinese vaccines were still months from release, controversy roiled the Muslim world over whether the vaccines contained pork gelatin and could be considered “haram,” or forbidden under Islamic law. Sinovac has said that the vaccine was “manufactured free of porcine materials.” Many Islamic religious authorities maintained that even if the vaccines did contain pork gelatin, they were still permissible since the treatments were being used to save human life.

The Pentagon campaign sought to intensify fears about injecting a pig derivative. As part of an internal investigation at X, the social media company used IP addresses and browser data to identify more than 150 phony accounts that were operated from Tampa by U.S. Central Command and its contractors, according to an internal X document reviewed by Reuters.

“Can you trust China, which tries to hide that its vaccine contains pork gelatin and distributes it in Central Asia and other Muslim countries where many people consider such a drug haram?” read an April 2021 tweet sent from a military-controlled account identified by X.

The Pentagon also covertly spread its messages on Facebook and Instagram, alarming executives at parent company Meta who had long been tracking the military accounts, according to former military officials.

One military-created meme targeting Central Asia showed a pig made out of syringes, according to two people who viewed the image. Reuters found similar posts that traced back to U.S. Central Command. One shows a Chinese flag as a curtain separating Muslim women in hijabs and pigs stuck with vaccine syringes. In the center is a man with syringes; on his back is the word “China.” It targeted Central Asia, including Kazakhstan, Kyrgyzstan and Uzbekistan, a country that distributed tens of millions of doses of China’s vaccines and participated in human trials. Translated into English, the X post reads: “China distributes a vaccine made of pork gelatin.”

Facebook executives had first approached the Pentagon in the summer of 2020, warning the military that Facebook workers had easily identified the military’s phony accounts, according to three former U.S. officials and another person familiar with the matter. The government, Facebook argued, was violating Facebook’s policies by operating the bogus accounts and by spreading COVID misinformation.

The military argued that many of its fake accounts were being used for counterterrorism and asked Facebook not to take down the content, according to two people familiar with the exchange. The Pentagon pledged to stop spreading COVID-related propaganda, and some of the accounts remained active on Facebook.

Nonetheless, the anti-vax campaign continued into 2021 as Biden took office.

Angered that military officials had ignored their warning, Facebook officials arranged a Zoom meeting with Biden’s new National Security Council shortly after the inauguration, Reuters learned. The discussion quickly became tense.

“It was terrible,” said a senior administration official describing the reaction after learning of the campaign’s pig-related posts. “I was shocked. The administration was pro-vaccine and our concern was this could affect vaccine hesitancy, especially in developing countries.”

By spring 2021, the National Security Council ordered the military to stop all anti-vaccine messaging. “We were told we needed to be pro-vaccine, pro all vaccines,” said a former senior military officer who helped oversee the program. Even so, Reuters found some anti-vax posts that continued through April and other deceptive COVID-related messaging that extended into that summer. Reuters could not determine why the campaign didn’t end immediately with the NSC’s order. In response to questions from Reuters, the NSC declined to comment.

The senior Defense Department official said that those complaints led to an internal review in late 2021, which uncovered the anti-vaccine operation. The probe also turned up other social and political messaging that was “many, many leagues away” from any acceptable military objective. The official would not elaborate.

The review intensified the following year, the official said, after a group of academic researchers at Stanford University flagged some of the same accounts as pro-Western bots in a public report. The high-level Pentagon review was first reported by the Washington Post, which also reported that the military used fake social media accounts to counter China’s message that COVID came from the United States. But the Post report did not reveal that the program evolved into the anti-vax propaganda campaign uncovered by Reuters.

The senior defense official said the Pentagon has rescinded parts of Esper’s 2019 order that allowed military commanders to bypass the approval of U.S. ambassadors when waging psychological operations. The rules now mandate that military commanders work closely with U.S. diplomats in the country where they seek to have an impact. The policy also restricts psychological operations aimed at “broad population messaging,” such as those used to promote vaccine hesitancy during COVID.

The Pentagon’s audit concluded that the military’s primary contractor handling the campaign, General Dynamics IT, had employed sloppy tradecraft, taking inadequate steps to hide the origin of the fake accounts, said a person with direct knowledge of the review. The review also found that military leaders didn’t maintain enough control over its psyop contractors, the person said.

A spokesperson for General Dynamics IT declined to comment.

Nevertheless, the Pentagon’s clandestine propaganda efforts are set to continue. In an unclassified strategy document last year, top Pentagon generals wrote that the U.S. military could undermine adversaries such as China and Russia using “disinformation spread across social media, false narratives disguised as news, and similar subversive activities [to] weaken societal trust by undermining the foundations of government.”

And in February, the contractor that worked on the anti-vax campaign – General Dynamics IT – won a $493 million contract. Its mission: to continue providing clandestine influence services for the military.
