Busting Myths and Misinformation during a Pandemic
Annalisa Coliva, chair of the Department of Philosophy, and Duncan Pritchard, Distinguished Professor of philosophy, sit down with Tyrus Miller, dean of the School of Humanities at UC Irvine, to discuss the origins and spread of misinformation, as well as tips for discerning credible information.
Miller (0:04–1:31): Hello, everyone! Thank you for tuning in to “COVID-19: The Humanities Respond”! I am Tyrus Miller, dean of the School of Humanities, and I’m pleased to be joined by Annalisa Coliva, Chancellor’s Fellow and chair of the Department of Philosophy, and Duncan Pritchard, Distinguished Professor of philosophy. They’re two of the world’s pre-eminent epistemologists. They study epistemology, the philosophical study of knowledge itself: its conditions in our language, our practices and our concepts. And they’re here today to talk about issues of truth, misinformation, error and falsehood as they’ve played out very evidently in the COVID-19 pandemic. I’ll start with a question for both of you, and maybe, Annalisa, we’ll give you the start. There’s a lot of information and misinformation being spread about the pandemic, and I’d like to start with some of the challenges we’re seeing in terms of misinformation and erroneous information, then something that might be slightly different, the idea of fake information, and then also conspiracy theories. Can you tell us a little bit about how you’ve seen that play out in the current situation?
Annalisa Coliva (1:31–5:09): First of all, thank you for having me, and yes, I do think that it’s important to distinguish between fake news, or unverified information even, and conspiracy theories. So we are bombarded by information at this point in time and obviously with COVID, it’s even worse than before and one huge issue is that to verify the information obviously takes time and we may not have enough time to do our job properly and tell apart the good information from the bad information or misinformation.
Now I think there’s also another issue with COVID-19, namely that many of the events that are the object of this spread of news are actually still unfolding in time. So the first time we hear about them, the information hasn’t been verified. We have seen that on a number of occasions. I’ll give you some examples, which are not necessarily ill-intentioned. For instance, the W.H.O. in the beginning said that the use of face masks wasn’t that important unless you were in a hospital or in similarly high-risk situations. Now we see differently.
With the issue of becoming immune if you contract the virus, again, at first we heard that we would be immune. Then there were some cases that seemed to go against that. Now those cases are regarded as false positives, so there is a good chance that we will become immune if we get the virus. But now the issue is: for how long? Some people are saying 12 or 24 months, but nobody has had COVID-19 for 12 or 24 months yet, so it’s difficult to know. And the same with drugs that are supposed to be efficacious. It takes time to prove that, because we need to run clinical trials. It seems nowadays that remdesivir is efficacious, but more information probably needs to come in. My suggestion is that at this particular moment in time, less is more. So rather than frantically checking the news every day, we might give ourselves a little bit more time, retain in our minds the information that we think may be interesting, and then see after a few days whether it has been supported or disproved.
Conspiracy theories are different. Conspiracy theories are not a problem of the quantity of information but of its quality, and particularly the quality of the explanation. They propose an alternative explanation, an alternative thesis with respect to what experts or scientists maintain, as flat-earthers do. In the particular case of COVID, we have seen a different account of the origin of the virus. According to conspiracy theorists, it’s actually not natural; it was synthesized in a lab. And I think the interesting issue is why people believe that.
Miller (5:09–5:12): Duncan, do you want to weigh in on this question?
Duncan Pritchard (5:12–7:09): Yes, sure, and again, thanks for having me too. I agree with everything Annalisa just said. One thing I think might be useful here is to think of ourselves not only as people who are receiving and propagating information, but also as people generating information ourselves, and this is one thing that is very different about the Information Age. It’s not just external people out there generating information with us as mere recipients of it. Many of us have news feeds, we’re on Twitter, and there are lots of ways in which we can communicate information. When we deal with the kinds of issues we’re dealing with here, we need to remember there are good actors involved and there are bad actors. Of course, we need to be wary of the bad actors. We can be suspicious of their motives, and we know that bad actors’ motives are misaligned with our interests. But it seems to me that the really difficult case is actually with the good actors.
A lot of conspiracy theories are propagated by people who would think of themselves as caring about the truth and so forth. But they’re led to these sorts of views, and what we need is some kind of intellectual hygiene, as it were. When we receive information, we need to be the kinds of people who are very circumspect about it: we think about it and critically evaluate it. And by the same token, as people who present information, we need to be careful about the kind of information we in turn propagate. What we’re seeing in the Information Age is these little bubbles of information occurring, and a lot of it happens very unsystematically, from individual to individual. We need to inculcate in people the traits known as the intellectual virtues. These are good traits for thought and inquiry, which enable us to be good recipients of information and also people who are good at propagating it.
Miller (7:09–8:02): That’s a really interesting point and obviously the sort of really malicious, kind of cynical actors are one group but you have a lot of people who are sharing information because they feel like sharing information is a good thing, and they’re perhaps also not aware that they may be having systematic effects way beyond their intentions. And all of us to a greater or lesser extent, even probably the more cool-minded scientists, are wavering between this kind of emotional sense of urgency and this otherwise well-founded sense that we really know pretty little about this still and that’s a kind of, you know, difficult oscillation to maintain an even keel through.
Pritchard (8:02–9:09): I’m just going to say, because it’s topical: I don’t know if you saw in The Guardian yesterday, there was an interview with a biologist talking about misinformation, and it struck me as a classic case of how things can get misunderstood if we’re not careful about how we present information. The headline of the article was “There’s no such thing as objective truth, there’s no such thing as absolute truth, says biologist,” and of course as an epistemologist, alarm bells go off: The Guardian is telling me there’s no objective truth. When you actually read the article, it’s very interesting. What the person is actually saying is that scientific claims, claims about truth, can always be improved upon, which of course is absolutely right.
They’re fallible, they’re defeasible. One has to be alert; you never take anything as established fact, it’s always open for revision, and so forth. But that nuance will be completely lost. Most people won’t read the article. All they will read is a headline from a prominent biologist saying there’s no such thing as objective truth, and then they’ll go away and be very suspicious of scientific claims.
Miller (9:09–9:11): It’s all subjective opinion.
Pritchard (9:12–9:43): And astrology and microbiology, whatever, they all end up in the same pot. This is a classic case of people not being careful. These are good-faith actors; I’m sure the biologist certainly doesn’t believe that. They don’t think what they do is a sham. I’m sure the journalist doesn’t think that what they do is a sham, is just subjective. But there’s a lack of care in how things are presented, and these effects, maybe small individually, build up over time and can become deleterious.
Miller (9:43–10:19): Let’s talk a little bit more about the actors in this sharing of misleading or explicitly false information. First of all, who are the people we’re talking about when we talk about the folks sharing this information, and what are some of their motives in sharing it, or even in confecting and propagating it?
Coliva (10:19–14:35): Yeah, well, I think that if we focus specifically on people who tend to believe and then spread conspiracy theories, then we have a pretty good handle on the way they think and work, in a sense. First of all, there may be very different, or very strong, ideologies at play. They start from assumptions, or as we like to call them here, hinges, that may be at odds with the ones scientists are operating with, or even with common sense. And certainly if you have a very strong political ideology of a certain kind, you might want to believe that there is some kind of conspiracy against this country and Western societies in general: that countries, for instance China, will benefit if they produce havoc in our societies, and that this will help them establish their hegemony. If you start from that way of looking at the world, in the most literal sense of the word, then you might feel inclined to believe at least this particular kind of conspiracy theory.
There’s also the fact that, very often, people conflate or don’t distinguish (understandably, they are not epistemologists) between pragmatic reasons, what they want to believe, hope for, wish, or would like to be true, and what is actually supported by good objective evidence, which may not be to their liking at all. So we see a bit of that. Then there is a very characteristic feature, which is that any counter-evidence is blocked off. Either they become impervious to it, or they complicate the theory they believe in order to accommodate the recalcitrant evidence. And that is a clear mark of difference between good and bad theories. That’s really the point. Because to some extent, it’s even part of being a scientist that you’re open to a multiplicity of explanations, but then you are open, or should be open, to counter-evidence, and accept that maybe your favorite theory turns out not to be supported by it.
The way conspiracy theorists, or the people who adhere to these theories, work is actually to block off the evidence, so the theory becomes a closed system and works like a dogma. These people typically want certainty and definitive answers to their questions, and they are not open to the possibility of doubt or of being wrong. This is a bit like what Duncan was saying before, namely that good theories, in particular scientific theories, are fallible. That doesn’t mean that they are false or up for grabs, but they remain open to the possibility of being disproved.
And then they typically have a very strong sense of authority. One thing we see here is that if they identify strongly with a certain political power, and that person, the president of this country actually, if not openly supporting the conspiracy theories, at least acts in accordance with the possibility that they might be true, then of course people who feel strongly about that will be likely to follow the authority. That also produces a strong sense of loyalty towards the other members of the same group. All these factors play a role in explaining why these theories are believed.
Pritchard (14:35–16:09): One thing I’d add to that is to remember again this good actors, bad actors distinction. Of course, the bad actors spread inaccurate information self-consciously, quite deliberately. But the good actors, often including ourselves, can also spread inaccurate information. We don’t do it under that guise; we think we’re spreading accurate information. The question is, how do we get ourselves into a position where we do that? What we have to remember here is that no matter how careful we are, or how rational we think we are, it’s very easy to get ourselves into a position where we are behaving in ways that are dogmatic, that aren’t responsive to scientific evidence, where we’re not appropriately respectful of the truth and everything that goes with it. There’s a whole tradition in philosophy talking about this; it goes right back to the Pyrrhonian skeptics. One of my heroes in the early modern period, Montaigne, talks about this in great detail: how hard it is even for very reflective and intelligent people not to drift into dogmatic ideas or dogmatic ways of thinking. And that’s part of the difficulty. The desire to believe certain things can be so strong, these pragmatic factors that Annalisa was talking about, that they can overwhelm us and make us fail to see that we don’t have good evidence for the things we say. And this is why the intellectual virtues, again an ancient tradition, are so important: they’re about cultivating certain traits that help us to be more attentive to the things that matter when we are making judgments of just this kind.
Miller (16:09–17:19): As I understand the intellectual virtues, each involves a particular measure, but each also has the potential to become a problem through excess. I think, for instance, about the faculty of imagination. It allows us to be inventive and to reframe the facts of the world in order to see different aspects, but it can also run wild and lead us into a kind of error. I’m recalling a passage, which I may not be able to quote exactly, from Wittgenstein, where he says, “Just because I can imagine that an abyss has opened up in front of my front door overnight, it doesn’t mean that I can’t leave my house,” on the evidence that it’s extremely unlikely that that’s going to happen. But it does seem like there’s a possible slide from the mere possibility of imagining something to persuading oneself that there’s strong evidence that it is in fact the case.
Pritchard (17:19–18:10): And this is really crucial for the virtues, because they lie on a mean between two vices: a vice of excess and a vice of deficiency. So think about being conscientious about the evidence. The vice of deficiency is just not caring. The vice of excess would be being attentive to things that aren’t relevant; you become obsessive and pedantic and so forth. What you need is to navigate some way between those two extremes, and that’s very important when we’re dealing with evidence, because we don’t want to end up in a situation where, by being attentive to evidence, we end up not having any beliefs, not spreading any information, not having any conviction. So there’s this finding of a middle ground where, on the one hand, we’re not dogmatic, and on the other hand, we do have beliefs; we are willing to put our name to information.
Miller (18:10–18:41): Can we talk a little bit about the effects of misinformation in the real world? This isn’t just a matter of someone looking foolish or making a subjective error. It’s something that actually produces effects, possibly even fatal ones for some people. What are some of the ways in which you see that practical dimension of misinformation?
Coliva (18:41–19:07): Well, I think that probably the most striking one is the recent pronouncement made about chloroquine and disinfectants, after which people ended up in hospitals following the advice, when they could have just read the instructions on the disinfectant bottles more carefully.
Miller (19:07–19:16): The skull and crossbones that’s on the bottle?
Coliva (19:16–20:50): One wonders why, and I think the reason here is really that they are devout. They are ready to suspend any common sense and think that because their political authority is saying so, it must be true. In that way, not even the evidence they get from the bottle of disinfectant is strong enough to warn them, not even the sense of danger, because you might be crossed by the thought, “Well, if I’m going to inject this or drink it, I’m going to have problems now.” None of that is stronger than the faith in the political authority. Another example I found very troubling was the fact that Anthony Fauci had to be assigned extra security because he has been threatened online, just because he was much more cautious than the president in spreading information, as any scientist would be under the circumstances. So again, people who don’t like what they are told may react badly, against better judgment, the judgment of experts, and whatever good evidence is available to us, and so we have to be careful here.
Pritchard (20:50–22:24): Yeah, and I think there’s an even more general worry as well, which is broadly political. Once people start to lose respect for the truth, once they don’t care about the truth anymore, then the very idea of any kind of political progress starts to become very problematic. It takes only a moment’s reflection to realize there’s a certain kind of political actor that doesn’t care about the truth, and that has a political agenda for doing so. But it’s very tempting to respond to those actors in ways where you don’t care about the truth either. You know: they go low, you go low, epistemically speaking. But then, if we create a system where no one really cares for the truth, where everyone is skeptical to the point of thinking that everything is up for grabs, then the whole political project of any kind of reform, it seems to me, starts to collapse.
This goes to one really fundamental issue, which is concern for the truth. We tend to think of that as something separate from the other kinds of values we have, but I don’t think we can separate it out. I don’t think you can separate out your moral values, your political values and your epistemic values. They all come together, and the ancients understood this. It’s a very contemporary thing to separate these kinds of values and to think that you can care about the one and not the other. The fact is, part of what it is to care about the good is to care about the true, and so anything that harms the one harms the other.
Miller (22:24–23:28): That’s a very classically philosophical view. I’m curious about some of the suggestions you have about what we might arm ourselves with from philosophy: frames that would help us assess information and take a proper stance of doubt without falling into an endless spiral of not wanting to accept anything, or finding ourselves at a loss and throwing our hands up in the air at the plethora of information of varying quality. What would you advise our listeners and viewers, as two philosophers who study this question? How might we apply that knowledge to our current situation as we take in information?
Coliva (23:28–25:06): Well, actually, Duncan and I have created five steps for dealing with information, particularly under these unprecedented circumstances. The first step is to be inquiring. We can summarize it by saying that you shouldn’t automatically believe something just because you have been told it, but you shouldn’t automatically disbelieve it either. So we should approach information with a little bit of caution anyway, and we should ask ourselves whether someone might have a motive for getting us to believe it regardless of whether it is true, and also what the source of the information is and whether it is trustworthy. I think that with COVID this is extremely relevant, and here there are larger issues having to do with the relationship between populism and trust in experts. But even more simply than that, in a country where the political leader really disregards what I’m sure experts have been telling him, and goes in front of the country during a press conference and says that it may be a good idea to inject disinfectants and do various other things, well, he is speaking from a position of authority, and that creates the sense that his words are to be trusted, and that creates a huge problem.
Pritchard (25:06–25:29): So that’s step one, be inquiring. Step two, be reflective. Is this information plausible? Does it conflict with other things that you know? Do you want it to be true? That’s a really important question to ask yourself, because if you want it to be true, that’s going to have a bearing on whether you believe it. You have to accept the fact that we’re the kinds of subjects we are: we tend not to believe things that we don’t want to believe.
Miller (25:29–25:35): It’s not just the interests of the speaker but also the interests of the receiver that we have to put under a bit of a lens.
Pritchard (25:35–26:35): Absolutely. I mentioned Montaigne earlier. This is one of the things Montaigne explores in his very autobiographical reflections on his own belief processes: how he ends up being dogmatic in all kinds of ways because of things that he wants to believe. It takes a great deal of effort, and a great deal of those practices of intellectual hygiene I mentioned earlier, to try to rid yourself of that. You’ve got to be constantly alert to the fact that we often believe things because we want to believe them.
Now the third step is to be conscientious. What evidence do you have of this information being true? Do you have an independent reason for believing it beyond the fact that it’s been presented as true? The fourth step is to be responsive. This goes back to the fallibility we mentioned earlier, which is built into science. If new relevant information emerged, would you be aware of it? Would you change your mind if the evidence changed? This really goes back to wanting it to be true again: something comes along which gives you evidence to believe what you want to believe, and then you close your mind off to any evidence that might indicate otherwise. And then, do you want to do the last one?
Coliva (26:35–28:42): And then there is the last step, which is to be responsible. That goes back to something we mentioned in the beginning, namely: even if you find a piece of information credible, think before you share it with others. We should ask ourselves a few questions before spreading the word. For instance, is this information up to date? Could it be misleading? Could it be misunderstood? This is very important, because we have to think of communication and the spread of information as links in a chain. We may be unwittingly spreading misinformation, so we have to do our due diligence and be a little bit more cautious before contributing to the spread of information, good or bad.
We can think of this a little bit like a very simple model of social distancing. If you have many matches next to one another and a fire gets started, it will propagate very quickly and could become a big fire; but if you take out one match in between two, then it is more difficult for the fire to spread. With misinformation, it may be a little bit like that: we should be more attentive to what we actually communicate to others. And in general, I do think this is a good thing to keep in mind. Rumors and urban myths get spread that way. There’s a lot of literature in social epistemology, which is something we are studying here at UCI, that is just about this, and that’s the take-home message we can draw from that literature.
Miller (28:42): So a little willingness to kind of step back from the conversation and take some time and reflect. I think it’s understandable, we’re all hungry for answers right now and we’re also pushing out answers that we seem to be able to glean from the information that we’ve gotten. But I think the message of some patience and reflection and deliberation before sharing is also really important, really critical.
I really want to thank you both. This is clearly a topic that is going to keep evolving and that we’re going to keep thinking about, but you’ve given us some tools to face our current situation in maybe a little bit more of a philosophical vein. So I really want to thank you for this conversation. I’ll mention that you both have free online courses available that flesh out some of the other dimensions of things we’ve touched on: a course on skepticism, that is, the question of under what circumstances and conditions doubt is appropriate, and one on relativism, the multiple frames of truth and how we adjudicate between them. Those may be very interesting for people who are coming to this topic and want to pursue it a bit more deeply with the two of you online. We’ll put the links below this interview. I want to thank everyone for watching and ask you to join us for our next episode of “COVID-19: The Humanities Respond.” Thank you, Duncan, and thank you, Annalisa.
Pritchard and Coliva (30:26–30:27): Thank you.
MOOCs with a message
Science and philosophy have traditionally been regarded as separate disciplines. UCI Distinguished Professor Duncan…
Offered by University of California, Irvine. Relativism is an ancient philosophical doctrine which has recurred time…
Offered by University of California, Irvine. Skepticism is about doubt, and doubt is everywhere in the world around us…