Skepticism and Trust in Science

06 October 2020

Why do so many people believe in conspiracy theories? Do we need to evaluate the evidence for ourselves, or should we just trust the experts? This week on Philosophy Talk, we’re discussing science and skepticism, and the role that trust plays in deciding what's true. 


You might think science denialism results from an excess of skepticism. Some people are so skeptical that they refuse to believe even in established scientific facts, like the dangers of COVID-19, the link between HIV and AIDS, the absence of any link between vaccines and autism, the reality of climate change, or the fact that humans have walked on the moon.


But I think science deniers are also not skeptical enough, since they often maintain their disbelief by appealing to bizarre conspiracy theories. Shouldn't a real skeptic be as skeptical of their own beliefs as they are of other people's?


In fact, it's tricky to figure out where you should direct your skepticism. In principle, any data is open to more than one interpretation, and dismissing a study as fake, while it might be pigheaded or perverse, isn't logically inconsistent. Teaching people facts doesn't always get them to agree with a scientific consensus; sometimes it just makes them more committed to their own fringe theories.


And it's true that scientific communities sometimes get things badly wrong. For example, when Ignaz Semmelweis argued that doctors could prevent the spread of disease by washing their hands, he was not just ignored, but ridiculed and forced out of his job. There are also real cases of individuals committing fraud, like Marc Hauser, Michael LaCour, Andrew Wakefield, and others whose fraud was eventually discovered… and who knows how many more whose errors will never be found or corrected. With apologies to Winston Churchill, science is the worst form of knowledge creation, except for all those other forms that have been tried from time to time.


There's plenty of value in critically evaluating your own evidence and taking care to avoid motivated reasoning. But when deciding what's true, it's not enough to rely on our own virtues; we need to trust others. We need to trust scientists to conduct their research carefully and report its results truthfully, but it goes beyond that.


Most of us are not climate scientists or epidemiologists, and don't have the time or ability to weigh up all the evidence ourselves. Many scientific journal articles are paywalled, and even if you do get your hands on them, you might not be able to understand them. This means we have to trust scientists and science journalists to correctly summarize the state of the art in most scientific fields. We also have to trust our science educators to teach responsibly, and we have to trust the people in our social circles to share articles responsibly, being alert for misinformation and filtering it out.


I think we can all do our part by being trustworthy. Before you share an article, you can look out for signs of misinformation (like failure to cite sources, argumentative leaps, claims that seem too good to be true, and anything that can be refuted by a quick Google search). If you share something or make a claim that turns out to be false, you can acknowledge the mistake. You can be curious about science, and you can even participate in citizen science via sites like iNaturalist, Zooniverse, or citizenscience.gov. These are all ways of cultivating intellectual virtues while also contributing to the community trust that's crucial to scientific inquiry.


With all that said, there’s a lot I still don’t know about how to detect, reject, and correct pseudoscience. I’m excited to figure it out on this week’s show with our guest Michael Shermer, founding editor of Skeptic magazine and prolific author.


Photo by William Bossen on Unsplash


Comments (1)



faywouk@gmail.com

Friday, October 9, 2020 -- 8:04 AM


The New York Times had an article about a month ago on why people believe conspiracy theories. According to the study they reported on, which had around 2000 participants, 40% of people were disposed to believe and 60% to disbelieve. The personality characteristics of the believers included entitlement, self-centered impulsivity, cold-heartedness, and elevated levels of depressive moods and anxiousness. It's worth bearing this in mind when thinking about how to combat belief in conspiracies.