The Limits of Self-Knowledge
Oct 06, 2013
Descartes considered the mind to be fully self-transparent; that is, he thought that we need only introspect to know what goes on inside our own minds.
This week’s episode is the first in a new six-part series on the topic of Intellectual Humility. Over the course of the series, we’ll examine intellectual humility in a variety of contexts. How do we humbly disagree with one another? How do our cognitive biases short-circuit intellectual humility? Can religious faith and intellectual humility be reconciled? Does science have all the answers? And how can we promote greater intellectual humility in public discourse online?
In this week’s opening episode, “Knowing What We Know … And What We Don’t Know,” we tackle the foundational question of whether we can know what we do and don’t know, since knowing what you do and don’t know is the first step to true intellectual humility.
Ask yourself which is worse—thinking that you know MORE than you do? Or thinking that you know LESS than you do? How can we avoid both the arrogance of dogmatism and the paralysis of doubt? Clearly, people who are so cocksure they know everything come across as intellectually arrogant. Who wants to be that person? But is it better to be so lacking in conviction that you never stand your ground?
It’s okay, it seems to me, to be confident, but not so confident that you become dogmatic. It’s also okay to be humble. But you shouldn’t be so humble that you become defeatist. The trick, I think, is to strike a balance, to find a middle ground between excessive humility and excessive arrogance. That will enable you to avoid both a rigid dogmatism and a wimpy defeatism. Knowing what you know and what you don’t know is the key to finding that middle way.
The problem is that for most of us, that is much easier said than done. People tend to do everything they can to hold onto their beliefs. We shut ourselves off from opposing points of view. We dismiss sources of evidence that challenge or undermine our beliefs. We surround ourselves with like-minded people. That’s all a recipe for groupthink. That’s why we should seek out dissenting voices. We all need to have even our most firmly held beliefs challenged from time to time.
On the other hand, for almost anything you believe, no matter how well grounded, you can find somebody who not only disagrees, but who will do everything that they can to sow doubt. Think of the climate change deniers. Now you might want to dismiss such people as mere shysters—as paid “merchants of doubt.” And you might be tempted to say that you’ve got no reason to listen to them. But how exactly do you reconcile that thought with the idea that it’s dangerous to cut yourself off from opposing points of view?
I’m not saying that we should necessarily listen to every single skeptic or naysayer, no matter how spurious their arguments. It makes sense to try to separate out genuine grounds for skepticism from spurious grounds for skepticism. But that is, again, something easier said than done. The problem is that almost nothing that we know is truly certain. I believe in climate change, for example, but I am not absolutely certain of it.
Why get hung up on certainty, you might ask? Death, taxes, and the Cubs not winning the World Series are perhaps the only certainties in life. But even where we lack certainty, we can have well-grounded belief. And well-grounded belief is good enough in most instances.
The problem is that it’s precisely in the gap between well-grounded belief and absolute certainty that doubt lives. Once you admit that you are not certain, you’ve opened the door to doubt—whether it’s from paid merchants of doubt or from sources more to your liking. And it’s not clear that you’re rationally allowed to shut the door on the doubters until they’ve had their full say.
But true doubters never have their “full say.” They never stop doubting. They will tangle you up in interminable arguments, with no end in sight. At some point, you have to cut them off and get on with it. Though that seems right to me, it’s, again, not entirely clear how it’s consistent with not closing yourself off to opposing points of view. I’m tempted to say that you’ve just got to know how to strike the right balance. Too little skepticism, and you become rigid and dogmatic. Too much skepticism, and you become a wishy-washy defeatist who won’t take a stand.
But that raises the question of whether there’s a formula for achieving such balance. It would be nice if there were. But I think there probably is not. The world is sometimes painted in shades of gray, rather than in clear strokes of black and white.
Perhaps by listening to our six-part series, you’ll gain greater skill at navigating between excessive skepticism and excessive dogmatism.
Listen below to our first episode, and then catch all the other episodes as they are released here: https://philosophytalk.org/intellectual-humility
Thursday, March 23, 2017 -- 2:22 AM
I enjoyed this post greatly, but I do wonder how it would come off to the person it is trying to "reform." This is a perennial question I have for philosophical positions that may come off as prescriptive. The person in need of this advice appears to be someone either excessively humble or excessively arrogant. How would either person respond to the statement, "The trick, I think, is to strike a balance, to find a middle ground between excessive humility and excessive arrogance"? What a hopeless encounter this ends up looking like! On the one hand, the excessively humble would probably be too humble or skeptical to think they have enough evidence for this claim. On the other hand, the excessively arrogant person would probably be arrogant enough to think that they've already struck this balance! What is a philosopher to do...
Thursday, March 30, 2017 -- 10:00 AM
The ignorant elephant in the room: "When the President Is Ignorant of His Own Ignorance"
Thursday, September 26, 2019 -- 3:46 AM
Lots of people like to say "fact" and "law of science," but these are often either codification errors or outright common fallacies. Take for instance that there is no absolute "1". What we take as a given when conceptualising reality is often and plainly not the real case.
Ex: You have one pencil in your pocket, but is it really one pencil? Snap it in half, sharpen all the broken ends, and now you have three pencils. Just how exactly can you have three from one?
Or let's take the widely accepted "race card."
Ex: "race" has no starting gun and no finish line, so why equivocate "family" as a "race"?
Or let's take a moment to consider gender.
A man is not a woman and a woman is not a man: a circular definition. You could try "man" does xyz while "woman" does abc. But on the next island, "woman" does xyz while "man" does abc. So now male and female are determined by geographical regions? They never stop to think "man" /= "male" or "female" and "woman" /= "female" or "male". Gender is a subjective concept, and phenotypical dimorphism exists independently of our beliefs or values.
The only real facts are universal, meaning you can consider them from any standpoint and they remain true.
I've only ever found two of these.
Every rule has an exception.
And the exception to it :)
"Everything is a poison. It's simply a matter of dosage." ~forgot exactly who, probably me in the first place.