Over 2 million scientific papers are published every year (R).
Over 4 million blog posts are published every day (R).
Over 5,000 news articles are published in the US every day (R).
Close to 1 million books are published every year (R).
There are a few implications of this kind of future shock, or information overload:
- It makes it harder to gain certainty about any given topic
- It makes it even more important to be adaptable, humble, and willing to learn
- It makes it arrogant to say “do your research” and assume that people who think differently haven’t done their research, or are sheeple, dumb, or ignorant
- It makes it even more important to use rigor when doing research and seeking truth
Certainty
Information overload makes it difficult to know a truth with absolute certainty. Even Socrates supposedly admitted, “I know that I know nothing”. And that was before the internet, social media, and fake news.
Humility
That doesn’t mean we have no hope of knowing anything. It just means we need to be willing to learn, adapt, and stay humble.
Do your research, analyze all sides, come to your conclusions, but realize you still might be wrong. That’s why statisticians generally require a 95% confidence level or higher (a p-value below 0.05) before drawing any conclusions from their research (which is a hard bar to clear).
The rest of us, especially in the U.S. it seems, draw our conclusions from far less.
Which raises the epistemological question: How do we know what we know, and when do we know that we know something?
“The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.”
—Alvin Toffler
If you don’t remember the last time you changed your mind about something or admitted you were wrong, you’re probably not thinking critically.
“Do your research.”
All the stats I mentioned about how much research and content is produced every day make it hard to say “do your research”, or to call others sheeple, dumb, or ignorant for not knowing something we think we know, or for believing something different than we do.
When people say “do your research”, what does that mean exactly in this age of information overload?
With millions of scientific papers, books, news articles, and blog posts being published, which “research” are we supposed to be doing, exactly? Should we “do” only the line of research that leads to the conclusions you’ve already reached?
My point is that there is highly credible research on all sides of any given argument, and that we can’t assume that others who believe differently haven’t done their research, or are sheeple, dumb, or ignorant. They have probably just followed a different line of research than you have.
Further, if we truly do our research (as in, actually research all sides of an argument with an open mind), we will be far less likely to judge others for believing differently than we do.
Debate class teaches us to argue for the other side by examining their evidence.
In the same way, true research makes us more understanding of and empathetic toward those who believe differently than we do. We start to recognize that someone can be intelligent and still disagree with us. We start to let go of our tribalism and ego, and acknowledge that others have valid viewpoints. Then maybe, just maybe, we can have mutual respect and civil dialogue in this country.
I am a moderate Independent. I lean left on some issues, but I’m also empathetic to conservative viewpoints.
But if your identity is too wrapped up in being a Republican, you won’t consider liberal viewpoints; and if your identity is too wrapped up in being a progressive Christian, you won’t consider how orthodox theology might be valid. This is all tribalism (which we’ll talk about more later).
Regardless, if you haven’t researched all sides of an argument (many times there are more than just two sides), then you’re not doing real research; you’re confirming your existing biases.
“The test of a first-rate intelligence is the ability to hold two opposed ideas in mind at the same time and still retain the ability to function.”
—F. Scott Fitzgerald
The hard part is that both sides of any political, religious, or philosophical divide accuse the other of “not thinking critically”.
So how do you know when critical thinking is happening?
I would posit a few ingredients of critical thinking:
- Willingness to be wrong (humility)
- Open-mindedness and willingness to thoughtfully consider all sides
- Probing questions, challenging assumptions, and a healthy amount of skepticism (R, R, R)
I see a lot of conservatives accusing people of blindly accepting what they hear from authorities. And that’s valid. There are plenty of people who don’t do their own research.
But the problem is when you judge everybody that believes differently than you as being sheeple or ignorant.
Beyond that, many times those who accuse others of being ignorant don’t do balanced research themselves. They may find dozens of articles that confirm what they believe, but that doesn’t mean they’ve done research. They’ve found information that confirms their existing biases (aptly called confirmation bias).
This is where saying something like “Do your research” might be justified.
Too many times I see people (mostly conservatives, sadly) who make statements based on a meme, a shoddy news source, and very little research.
I see them “do research”, but only on one side: the side that confirms what they already believe. It’s usually pretty obvious that they haven’t even attempted basic research into opposing viewpoints.
Just because what you believe is different from what the masses believe (groupthink), doesn’t mean you’re practicing critical thinking or doing rigorous research.
What’s interesting, too, is that people who accuse others of blindly following experts and authorities often have experts and authorities of their own that they are less likely to question.
For example, a conservative may accuse liberals of idolizing science or blindly following CNN, but they themselves might be less likely to question information from Fox News, Glenn Beck, Donald Trump, Breitbart, or any number of other authorities.
Tribalism reinforces our tendency to criticize others, but not be critical of those within our tribe. If you consistently think the same as people within your tribe, you’re still not practicing critical thinking (even if your tribe isn’t the majority tribe).
And social media in particular further reinforces those echo chambers of tribalism (you can find a Facebook Group for anything you believe or are interested in, and social media will keep showing you more of what you agree with).
That’s also why unfollowing or blocking others who disagree with us can be so unhealthy. It makes us feel nice and cozy, but our thinking doesn’t get challenged as much. Of course, there are exceptions. Sometimes we need to block someone because they might be toxic, and that’s fine.
—
So I’d like to go deeper on the specific cognitive biases and ways of thinking that make us all prone to error. In such a complex informational environment, it’s more important than ever to learn how to properly pursue truth.
Here we go…
Confirmation Bias
First is confirmation bias. This bias influences us to seek out and believe evidence that confirms our existing beliefs.
Whatever the scenario and whatever bias we are subconsciously confirming, we have to be aware that our biases could be leading us to believe a lie.
Why is that no bueno?
Because believing a lie can cause divisiveness. When our beliefs involve making others into enemies, we’re polarizing this country even more than it already is.
There are two things to keep in mind regarding confirmation bias:
- Data points
- Storytelling
Many of us require very few data points (very little evidence) to confirm our existing biases. It doesn’t take much to keep believing what we already believe.
In situations of deep global uncertainty (or even mild personal uncertainty), all it takes is one or two pieces of “evidence” (and sometimes no evidence at all) to assume a lie or jump to a false conclusion.
That makes confirmation bias potentially dangerous, affecting our decisions, judgments, and ability to reason.
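As a rough illustration of how little evidence it takes (a hypothetical Python sketch, not something from any study cited here): with only two or three data points, pure chance will hand us “confirming” evidence surprisingly often.

```python
import random

# A fair coin: there is genuinely nothing to discover here.
def flip(n):
    """Return n flips of a fair coin (True = heads)."""
    return [random.random() < 0.5 for _ in range(n)]

# Suppose I already believe the coin is rigged toward heads.
# How often does a tiny sample hand me "confirming" evidence?
trials = 100_000
confirmed = sum(all(flip(3)) for _ in range(trials))  # 3 heads in a row
print(f"'Proof' found in {confirmed / trials:.1%} of 3-flip samples")

# About 12.5% of the time, a fair coin produces 3 straight heads.
# With only a couple of data points, chance alone will regularly
# "confirm" whatever we already believed.
```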
Then there are the stories we tell ourselves.
When we experience uncertainty, it creates cognitive dissonance (which I’ll talk about again later). When we lack evidence to properly navigate an uncertain situation, we tell ourselves stories (make assumptions, jump to conclusions) that fill in the gaps.
Problem is, we still have a crap ton of gaps in the stories we tell ourselves; we just choose to ignore them, because acknowledging them would make us cognitively uncomfortable (cognitive dissonance).
Here’s a great question to ask yourself:
Is this opinion I’m forming true, or am I making up this story to make myself feel better?
Narcissism & Pride
This can play out in a lot of different ways, but generally it’s where people think they are better and smarter than everyone else, and they label others as dumb, call them names, laugh at their opinions, etc.
Besides the fact that these tactics are manipulative (no better way to shut someone up than to make them feel stupid, attack their character, or make them feel crazy), it’s also obvious that these people don’t have the humility to respect other people’s opinions.
For instance, pride and narcissism:
- Shuts you off from hearing new, valuable knowledge from others (because you don’t want to receive knowledge from people that are “below” you)
- Makes you overconfident in your own opinion, so you’re less likely to actually question your assumptions, practice critical thinking or be aware of your own biases
Before you dismiss this section thinking you’re not a prideful person, I’d encourage you to stay with me – you may need this section more than you realize.
Exhibit A: The Dunning-Kruger effect.
This bias could have its own separate section, but I think it fits nicely into the area of pride.
The Dunning-Kruger effect describes how people who are fairly ignorant in a body of knowledge (low competence, low knowledge) are much more likely to be overconfident in their abilities.
This plays out all the time in politics (citizen “intellectuals” who know very little of politics, but assume they know more than everyone else).
The same applies to situations of deep uncertainty: we read a few articles about an issue and assume we are smarter than everyone else.
Again it goes back to the number of data points and amount of evidence we’ve consumed to form our opinions.
When we have researched very little (or researched a lot, but only on one side), it’s hard to be honest with ourselves and say: “I don’t know. I don’t have enough evidence to have a concrete opinion yet.”
Saying that in America feels like sacrilege; no one wants to have the intellectual honesty to admit they don’t know something.
Fundamental Attribution Error
Fundamental attribution error is when we judge other people’s personality or character based on something they did or said, instead of considering their situation or environment.
It’s so easy for us to judge others for their actions, without putting ourselves in their shoes.
But most of us, I dare say, do the best we can in the specific situation we’re in, with the knowledge that we have.
I wager that if you took a hard look at a recent list of ways you’ve messed up, you’d come up with valid explanations of how you were doing the best you could.
Yet when we look at others’ mistakes, we don’t give them the same grace or benefit of the doubt, do we?
Recency Effect
The recency effect places more weight in our minds on recent events than on past or potential future events. It also makes it hard to prepare for things we know will happen much later in the future (retirement, etc.).
Cognitive Dissonance
When we hear an opinion or fact that conflicts with our own beliefs, we experience cognitive dissonance (mental discomfort).
We resort to all kinds of things to relieve this discomfort and experience catharsis:
- Minimizing or dismissing what the other person is saying
- Blaming, criticizing or belittling others
- Anger, denial, and more
Tribalism
Being in a tribe makes us less likely to affirm that those outside the tribe might have valid viewpoints, especially when it comes to politics or religion.
Another unhealthy way this plays out is in creating a common enemy to the tribe. If “they” are the problem and always to blame, we’re not being honest about our own tribe’s mistakes.
This is also why demagogues are so dangerous. If leaders can create a common enemy, they gain control and breed bigotry and hatred.
Correlation vs. Causation
As mentioned before, statisticians generally require a 95% confidence level or higher before accepting any given conclusion.
That’s because it’s so easy to link any given spike or drop in the data, or any other event, to any number of plausible causes.
We observe an event, and our biases immediately make us jump to the conclusion that it must be because of one particular reason. But often that event is merely correlated with that reason, not actually caused by it.
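To make that concrete, here is a minimal sketch (assuming Python with NumPy and SciPy; the variable names are purely hypothetical labels) showing how two series with no causal connection at all can still look strongly, even “significantly”, correlated:

```python
import numpy as np
from scipy import stats

# Two independent random walks: neither has anything to do with the other.
rng = np.random.default_rng(seed=7)
ice_cream_sales = np.cumsum(rng.normal(size=200))   # hypothetical label
shark_sightings = np.cumsum(rng.normal(size=200))   # hypothetical label

# Pearson correlation and its p-value between the two unrelated series.
r, p_value = stats.pearsonr(ice_cream_sales, shark_sightings)
print(f"correlation = {r:+.2f}, p-value = {p_value:.4f}")

# Trending series like these clear the p < 0.05 bar surprisingly often
# even though there is no causal link at all, which is exactly why a
# statistically significant correlation is not proof of causation.
```

Run it with different seeds and the apparent relationship swings wildly, which is the point: a correlation on its own tells us very little about cause.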
Bad Logic and “Connecting the Dots”
Some thinking can be straight-up bad. Many people read a few articles, then extrapolate and deduce a conclusion that fills in the gaps. Sometimes that kind of deduction is correct. But sometimes it leads to believing conspiracy theories without real evidence. Either way, we need to recognize when we are making logical leaps, and stay humble about those deductions.
Slippery Slope Fallacy
Quite simply, one result doesn’t automatically lead to another, and to another, and then to another final extreme result, in all cases.
Locus of Control / Victim Mentality
You probably know the story of the baby elephant that was tethered to a post it couldn’t escape. When it became an adult, it still believed it couldn’t free itself from the post, even though a full-grown elephant could easily have done so.
We can have either an external or an internal locus of control. If we think we are merely at the mercy of our environment and other external forces, we have an external locus of control and a victim mentality. E.g., “Democrats are the problem”, “immigrants are the problem”, “all cops are the problem”, “all white people are the problem”, etc.
When we have an internal locus of control, we realize that external effects influence our lives, but in the end our behavior, thinking, and words make a bigger impact on our results, in general (there are exceptions).
Intuition & Fast Thinking
We Americans have the tendency to shoot from the hip. We trust ourselves a bit too much, think fast, take action, and form opinions quickly – and we’re vocal about it.
The problem is that fast thinking, without proper study and prior analysis, usually equals bad thinking. (R)
Good research and analysis take time, contemplation, consideration, and thoughtfulness. We need to relearn that as a country.
Conclusion
There are many more biases and ways of thinking that we could get into, but I’ll stop there.
To clarify, I’m not saying that we should have this postmodern stance that truth is subjective and can’t be known.
It absolutely can be, and we should seek it!
It is my hope that we become more mindful of how important it is to examine our own biases, do balanced research into all sides of an argument, collect reputable sources, and make well-reasoned arguments with respect and love.