I heard that if you can hold your breath for 20 seconds without coughing, it means your lungs are healthy.
It's a quick way to check if COVID-19 is in your system. I heard this from my best friend, way back in March when the pandemic had first begun to spread in New York City. I passed this information on to my parents, and we all began to use this technique to cope with the anxiety of catching the virus. That was before I began work on this story and discovered that this strategy was, in fact, a myth.
This theory took hold on a number of social media platforms and spread alongside the disease, as people were eager for a way to detect the early signs of infection. The breath-holding technique is loosely based on the idea that if a COVID-19 patient does not receive medical attention in time, their lungs will develop 50 percent fibrosis, a condition in which scar tissue builds up in the pulmonary tissue and makes it difficult to breathe. Doctors and public health officials have come forward to debunk this idea, noting that coronavirus does not consistently cause fibrosis and asserting that the only surefire way to check for COVID-19 is a medically sanctioned diagnostic test.
The falsely lauded breathing test is just one small example of the massive misinformation phenomenon that has accompanied the COVID-19 pandemic. The World Health Organization has described the problem as a parallel “infodemic”, defining it as “an over-abundance of information—some accurate and some not—that makes it hard for people to find trustworthy sources and reliable guidance when they need it.”
“Everybody is talking about coronavirus all the time,” explains Dr. Scott Brennen, a postdoctoral research fellow at the Reuters Institute for the Study of Journalism and the Oxford Internet Institute. The dominance of a single subject over the entire global conversation has created an unprecedented demand for talking points, accurate or not, and the result is what Brennen calls an “information vacuum”.
Brennen and his colleagues published a fact sheet in April analysing the types of misinformation circulating on social media and where they originated. He acknowledges that the landscape will have changed since then, but some of the trends are not time-sensitive. The study examined 225 claims that independent fact-checkers had rated false and found that 59 percent of the misinformation was "reconfigured" or "re-contextualised", meaning it included some kernel of verified information. A smaller 38 percent of the claims were entirely fabricated.
“It’s like that adage—the best lies are the ones that have a little truth in them,” Dr. Brennen explains. “Those things that were re-contextualised rather than completely fabricated were more likely to gain traction.” With such a crowded conversation, especially in the fast-paced world of social media, the onus often falls on individual users to fact-check and cross-reference each claim. That can be an exhausting, overwhelming task when information is so abundant, and it makes the search for truth harder: it becomes much easier to believe things that merely seem true, even when it is unclear whether they are.
Misinformation can be legitimised not just by shreds of truth but by prominent names that have an already established ethos. Dr. Brennen's study found that while “top-down” misinformation, propagated by public officials or celebrities, comprised only 20 percent of the claims made, it made up 69 percent of the social media engagement. In other words, people are more likely to hit that retweet button when they recognise the name.
Dr. Ramez Kouzy, a clinical research assistant at the University of Texas MD Anderson Cancer Center, worked with colleagues on a study that focused specifically on the spread of COVID-19 misinformation on Twitter. While he is hesitant to say that celebrities have more of an obligation to share the truth, he explains that having a platform comes with a need to share responsibly.
“When you see that blue check [indicating a verified social media account], you feel like 'okay I can probably trust this person. They’re official, they have some credentials, they might know what they’re talking about,'” Kouzy says.
His study is a preliminary look at the scope of the infodemic, research he argues is urgent given misinformation's tendency to spread like wildfire.
“This thing is happening and it's happening as we speak. We need more research and more people to talk about it because if we don’t talk about it now, we're going to probably miss the train to try to do something about it,” he says.
So what do we do about it?
Well, to start, share responsibly no matter your role. "We all have skin in the game," Kouzy explains, "so we all have to take personal responsibility for the information that we’re sharing on social media or the information that we’re sharing even offline with our friends and with our families."
The cost of COVID-19 misinformation goes well beyond a few falsehoods being spread amongst social groups. Both doctors and patients have spoken up in recent months, aiming to demonstrate the human cost of the "infodemic".
In late March, NPR reported that an Arizona couple had been hospitalised after ingesting a small amount of fish tank cleaner. The cleaner contained chloroquine phosphate, a chemical related to hydroxychloroquine, a malaria drug that the Trump administration touted early in the pandemic and that has been discussed widely in the news media. The hospital later reported that while the wife survived, the husband died; chloroquine phosphate can be toxic to the human body. The Food and Drug Administration has since warned against the use of hydroxychloroquine outside a hospital setting or clinical trial. Despite the warning, President Trump continued to promote the drug, saying in mid-May that he himself was taking it.
Outside the political realm, independent 'news' organisations also have a lot to gain from the spread of misinformation. A report published by AVAAZ, an independent advocacy collective, took a deeper look at the "health misinformation spreaders that are behind some of the most prominent websites and superspreader Facebook pages". The report found that the website 'realfarmacy.com' accounted for the most annual views among the networks studied. The website, which has published false information about both COVID-19 treatments and the origin of the virus, gains traction when users share its articles on third-party platforms like Facebook.
This pattern makes it increasingly important for individual consumers to heed the fact-checking warnings that have begun to appear on social media posts containing false information. Dr. Brennen's study noted a 900 percent increase in English-language fact-checks from January to March. Tech companies are working to verify the accounts of public health officials and partnering with independent fact-checkers to restock social media timelines with accurate information. Brennen reports that the academic community considers fact-checking, for the most part, an effective strategy against misinformation. “Even so, those independent fact-checkers can’t get to every post,” he warns, leaving plenty of room for falsehoods to slip through.
The way that everyday citizens are choosing to engage with medical information is critically important. "It's going to shape how we deal with this at the end of the day and when we come out of it," explains Dr. Kouzy, "Every life lost is a life extra that we shouldn’t lose, and every ounce of research that’s being directed toward that front is very, very important."
To honour that research, we have to seek it out. Everyday citizens must join the ranks of global fact-checkers and take great care with the information they share about COVID-19. New research suggests that our susceptibility to misinformation may have more to do with lapses in vigilance and reasoning than with our political or social biases: a study from the University of Regina in Saskatchewan, Canada, links higher scores on reasoning tests with the ability to detect "fake news", regardless of political leaning.
Dhruv Ghulati is the CEO of Factmata, an independent fact-checking firm working to make it easier for individuals to sort the truth from everything else. "What we’re doing is important just because not that many people are doing it," Ghulati said of his company's mission. The idea is to act as a third-party agent by cleaning up the way we exchange information digitally. "If you have misinformation in an article, the advertisers won’t like that and their brand won’t like that," he explains, "then the search engines will be unclean, and readers will be misinformed, so it’s all an ecosystem of people inter-operating together."
Ghulati's firm is working to accommodate the ever-changing media landscape. In 10 years, he says, he envisions a world outside the mainstream media in which "we as readers can read about those expert opinions and read content written by individuals and get a wide breadth of opinions but also get the safety of no misinformation, disinformation, hate-speech and so on."
The tools that Factmata offers detect fake news narratives and are marketed to businesses first, where the technology is honed before being offered to individual consumers. Those narratives have taken an increasingly medical tone since the start of the pandemic. "There's a lot of misinformation about vaccines, every day there's a potential remedy or the name of a potential drug that people think could be used. We're also seeing a lot of political propaganda around things like Bill Gates [and] Dr. Fauci," Ghulati reports.
Though Factmata's software offers hope for a more honest digital exchange, Ghulati stresses that the onus remains with the individual consumer. "We always have to leave it up to a human to make the judgment," he says, "all we can do is make it easier to sift through the noise and find those nuggets where something is spreading that shouldn’t be spreading."
In short, it's on us. It remains important to acknowledge that the "infodemic" doesn't solely exist on opposite sides of the partisan spectrum or in conspiracy theory chatrooms, and it isn't this dark and foreign force that can only affect the uneducated and ill-informed. I know because I participated in it, without awareness, without malice, my 20-second breathing test serving as an agent of false claims. It went from the lips of my friend, to mine, to my parents, and I can't know its trajectory from there. All that's left is to take personal responsibility—in other words, to know better, so that I can do better.