
The fine line between wrong and almost right — and how that plays out during pandemics

  • In January 2021, French President Emmanuel Macron announced to journalists in his country that one of the COVID-19 jabs that France used wasn’t working as it should.
  • Simultaneously, across the Atlantic, in the United States, former president Donald Trump made downright false statements about unfounded COVID treatments.
  • How should journalists cover such comments — should they treat Macron and Trump’s statements differently?
  • We take a deep dive into strategies to fight misinformation during times of crisis.


In January 2021, French President Emmanuel Macron announced to journalists in his country that the AstraZeneca COVID-19 vaccine was “quasi-ineffective” in people over the age of 65. By “quasi-ineffective” he meant the vaccine wasn’t working the way it was supposed to. 

But it wasn’t entirely true. 

From a scientific standpoint, the jab worked slightly less well in older adults than in younger people, but it still protected people over 65 against falling very ill with, or dying of, COVID.

This left journalists in a precarious position. 

At the time, thousands of people around the world were dying from COVID-19 each week. Reporting on the president’s doubts about the vaccine could lead people to reject lifesaving medicine in the midst of already growing vaccine hesitancy.

But when a president makes a statement, it’s news. And Macron’s announcement came at a time when both scientists and the public followed every single detail of COVID vaccines. 

So what was the right thing to do — bury the story or report it? 

At a Pandemic Preparedness Policy Summit in April convened by the Rhodes Trust based at Oxford University, we had the chance to ask some of the world’s leading health journalists how they approached this dual role of the media. 

How do they balance reporting in a way that is scientifically accurate and helps people to make sensible choices for their health, but, at the same time, also meet the news demands of their consumers? 

And spoiler alert — they didn’t agree on what the best approach was.

Misinformation can be just as dangerous as a disease and harder to control

The Global Health Security Index measures how prepared a country is to deal with a new disease outbreak against a set of benchmarks. These benchmarks go beyond the usual measures, such as alert systems and vaccination infrastructure, by also including how well a country can counter misinformation and communicate risk. 

A recent addition to their survey after the COVID-19 pandemic is a question that asks: “Is there evidence that senior leaders (presidents or ministers) have shared misinformation or disinformation on infectious diseases in the past two years?”

In the United States, for instance, former president Donald Trump made frequent false statements about unfounded COVID treatments such as the antimalarial drug hydroxychloroquine and the antiparasitic medicine ivermectin.

But would Macron’s comment fall into this category too?

Hours after the French president questioned how well the AstraZeneca vaccine worked, the country’s health minister clarified that the jab had “remarkable efficacy”. But this statement was made in a world already filled with anti-vaccine disinformation, preying on people’s worst fears. 

Macron’s initial statement was, unfortunately, enough to cast doubt in many people’s minds. 

The ensuing reluctance to roll out the AstraZeneca shot in the early months of 2021 likely cost thousands of lives. The science was there, but the trust wasn’t.

Here’s how we think countries should prepare to fight misinformation during pandemics.

1. Fight fire with fire: Use the same channels and methods as those who spread misinformation

A 2022 study published in Nature Reviews Psychology shows people are more likely to believe misinformation when it comes from people they trust — for example, from sources claiming to be leading researchers or doctors. 

Other characteristics that make wrong information travel faster are statements that appeal to people’s emotions, align with their political views and are repeated often.

In South Africa, for instance, given Russia’s support for the anti-apartheid struggle, some political groups were vulnerable to concerted efforts by that country to create positive public perceptions of its Sputnik V vaccine. The messages had deliberate components that made them enticing: 69% of coverage in traditional media centred on Russian President Vladimir Putin (evoking a trusted source) describing the jab as “safe and effective” and sharing how his adult daughter had received it (adding an emotive component). It was also released as a breaking news story, which meant it was covered widely and repeated often.

In reality, though, there were serious concerns about the lack of clinical trial data for this vaccine. (In October 2021, South Africa’s medicines regulator, Sahpra, rejected the application to approve the jab for emergency use.)

At the Pandemic Policy Summit, journalists suggested fire should be fought with fire. So, trusted sources, such as credible scientists, doctors and activists, should be quoted in stories that counter misinformation, and those stories should be circulated on the same channels (read: social media) that people who spread misinformation use. 

Journalists also felt strongly that science in such cases needs to be presented in a way ordinary people can identify with. Bhekisisa, for example, used the analogy of a hamburger recipe to explain the rigorous scientific processes vaccines need to undergo before they can be said to be safe and effective. Producing the same message in different formats (in other words, repurposing content) is important to ensure repetition, and it also helps the message reach a broader audience. For instance, an in-depth print story about vaccine side effects could be reproduced as a short video that focuses on a side effect of only one vaccine (and appeals to a younger audience).

2. Embrace fact checkers to fight misinformation on social media

Social media makes false information spread faster — even if it comes from fake news websites which, according to research by the Reuters Institute for the Study of Journalism, have lower readerships than legitimate media organisations, where news goes through extensive fact checking processes. False news sites are punted as news publications, but are in fact set up for a different purpose, such as swaying people in favour of certain political or ideological views. 

In France, for instance, one false news outlet generated an average of over 11-million interactions on Facebook per month in 2017 — five times more than those of more established news brands.

Having fact checking systems in place to counter the impact of false news sites on social media is important.

One example that worked fairly well during the COVID-19 pandemic was the World Health Organisation’s (WHO) collaboration with large social media and tech companies to limit the spread of false news. The WHO and Google, for example, worked together to rank search results according to whether they were released by a credible source, instead of using Google’s standard algorithm. 

The Facebook COVID-19 information centre was set up with the WHO. Between January and April 2020, Facebook directed over 2-billion people to WHO resources, which led to over 350-million clicks.

But it’s not just international health bodies that can jumpstart such verification processes. Over the past decade, credible media fact checking organisations have been established, and they played lifesaving roles during COVID. They should be used right from the start in the next pandemic. 

The Taiwan FactCheck Centre debunked false claims, such as one advising that drinking green tea and lemon could cure COVID, which helped stop people from relying on unproven treatments. Closer to home, Africa Check similarly disproved myths and clarified misleading stories. For example, in response to a rumour about the AstraZeneca vaccine containing a harmful chimpanzee virus, they clarified that it only contained a single, non-infectious and genetically modified component of a chimpanzee virus.

3. Teach people how to spot red flags

Misinformation often plays on people’s emotions and uses words and phrases such as “deception”, “wake up” or “warn your family and friends”. 

Part of helping people to cut through the fluff is to build up their basic knowledge about health tools, such as vaccines, before a pandemic hits — and to make sure everyone has equal access to that knowledge. 

One way to do this is to use childhood vaccinations, or jabs for outbreaks such as cholera, to break down how and why vaccines work, so that the science behind these medicines isn’t entirely new when a health crisis arrives. 

During COVID, for example, vaccine-hesitant people often argued that they didn’t believe COVID jabs worked because boosters were needed. Yet almost all childhood jabs need such top-ups too. Had parents been told beforehand what a booster is, and that one jab alone is not enough, some of this hesitance during COVID could have been prevented.

During the coronavirus crisis, the New York City health department considered misinformation to have such a harmful impact that it established a misinformation response unit to monitor such messages. The unit worked with over 100 community organisations to create easy-to-understand, culturally appropriate messages for different groups of people. As a result, officials could quickly pick up messages with inaccurate COVID-19 information, particularly about vaccines and treatment, and then send out notes with the correct information.

4. Invest in the personal and online security of journalists and scientists

Countering misinformation on social media can make those responding to it targets for online abuse. During the pandemic, Bhekisisa journalists received death and assault threats. Similar trends were noted for prominent policymakers and scientists: WHO director general Tedros Adhanom Ghebreyesus and the US government’s chief medical advisor during COVID, Anthony Fauci, became recurring villains in online disinformation campaigns.

Countering misinformation should not come at the cost of personal safety. Organisations must recognise that protecting their employees requires a proactive strategy for dealing with abuse, as well as mental health resources staff can tap into when abuse does happen. The International Women’s Media Foundation trained Bhekisisa journalists in online security and helped the organisation to develop a policy around it. The centre also used the South African Depression and Anxiety Group’s free counselling for journalists.

Helpful now, helpful in the future

As we prepare for a future in which pandemics will likely be more frequent, we need to think beyond getting ready to track, treat or test. 

How journalists and policymakers communicate about these pandemic tools can make or break their uptake. At the Rhodes Summit, one journalist put it this way: “Consistency is the currency of trust.” Building trustworthy health information sources that can convey not just accurate but also meaningful news will be invaluable in a future pandemic.

Nina Acharya is a DPhil candidate studying the political economy of global health at the department for social policy and intervention at the University of Oxford.

Helene-Mari van der Westhuizen is a South African medical doctor, co-founder of the NGO TB Proof and researcher at Oxford University.

Mia Malan is the founder and editor-in-chief of Bhekisisa. She has worked in newsrooms in Johannesburg, Nairobi and Washington, DC, winning more than 30 awards for her radio, print and television work.

Dave Chokshi is a physician at Bellevue Hospital in New York and Sternberg family professor of leadership at the City University of New York, and a former health commissioner of New York City.