Disclosure: This article may contain affiliate links which means that, at no additional cost to you, I may earn a commission if you click through the links and make a purchase.
“We are drowning in information, while starving for wisdom. The world henceforth will be run by synthesizers, people able to put together the right information at the right time, think critically about it, and make important choices wisely.” ― E.O. Wilson
Fake news. Misinformation. Conflicting information. Conspiracy theories. Fact checking. Pseudoscience. Bad logic. Biases. Agendas. Critics. Censorship. Bad advice. Celebrities and experts. Who to trust? What’s right? What’s wrong? How can we stay informed and have an accurate view of reality? How can we be more right than we are wrong?
We’re in the midst of a global health crisis, economic uncertainty, and political turmoil. This is a recipe for confusion, distrust and misinformation.
This year has revealed the dangers of a collective lack of critical thinking. During a pandemic especially, misinformation and a lack of critical thinking can be deadly.
Misinformation can spread just as quickly as the coronavirus itself, and it contributes to more cases.
Critical thinking is important now more than ever.
Not only does learning critical thinking help us see and understand the world more accurately and get closer to the truth, it also improves our learning and decision making skills leading to higher life satisfaction in all areas of life – better health, happiness and success. Studies suggest that awareness of your own biases can even improve happiness in marriages.
Looking at Ourselves
When it comes to critical thinking, it’s important to be aware of our own limitations and cognitive flaws.
Critical thinking and skepticism are not about doubting and challenging everyone and everything else. They are about doubting yourself. Critical thinking starts not with being critical of others, but with being critical of ourselves.
We like to think of ourselves as rational, logical and reasonable beings. We tend to underestimate how easily fooled we can be and we overestimate our abilities and knowledge.
The truth is that we are very emotionally driven. Our brains are lazy: they like to take shortcuts and use as little energy as possible. We are actually more emotionally comfortable the less thinking we have to do.
Everyone is vulnerable to misinformation. Awareness of our vulnerabilities is the first step in critical thinking.
“No matter how smart or well-educated you are, you can be deceived.” ― James Randi
Learning about our cognitive biases, heuristics and logical fallacies can help us learn better and think more critically.
Cognitive biases are flaws in our thinking. Heuristics are mental shortcuts that can lead to biased thinking. Logical fallacies are logical flaws in arguments. Cognitive distortions are exaggerated or irrational thoughts that distort and misrepresent reality.
The Fallibility of Memory
Our memories are incredibly flawed. In fact, memory is largely reconstruction, not playback. Every time we recall a memory, we’re actually reconstructing and updating it; in a sense, we’re remembering the memory from the last time we recalled it. And when misleading information we encounter after an event gets woven into our recollection of it, that is called the misinformation effect.
We can often combine the details of different memories, mixing them up or combining them all together. This can also happen with memories of our dreams and fictional media we consumed such as a movie we watched. We also tend to make stuff up. This is called confabulation. Our brains do this automatically to fill in gaps so that our memories are continuous.
We can also personalize someone else’s memory and remember it as happening to ourselves. If we’re discussing a collective memory with another person or a group, we can absorb details of their memories into our own. The details of a memory can also become distorted and change over time.
Memories can also be implanted, either accidentally or on purpose. It’s possible that we have false memories from our parents misremembering something that actually happened to our sibling and not us.
Despite the lack of accuracy of our memories, we tend to still remain confident about them.
The Fallibility of Perception
We like to think that our view of the world is accurate. The truth is that it’s incredibly limited and flawed.
Our perceptions have blind spots and can diverge from reality. If this weren’t the case, optical illusions wouldn’t exist. Stage magic wouldn’t work either if our perceptions were a perfect reflection of reality.
Inattentional blindness is a phenomenon in which we fail to see what’s right in front of our eyes simply because we’re not paying attention to it.
Hallucinations are also a lot more common than we think. Just as we often mishear what someone said, we can see things inaccurately and have “glitches” in our perception.
These glitches happen all the time. Our brain tends to construct what we think we saw or heard to fill in the gaps or connect the dots even if it’s inaccurate to reality. Our brains like to tell us stories even if those stories don’t represent reality perfectly accurately.
Many other factors affect our perception as well, such as sleepiness or poor conditions like low lighting.
Also consider the fact that our perceptions become memories which are also flawed.
These are reasons why eyewitness testimony tends to be unreliable.
Examine Your Beliefs
Pay attention to what your beliefs are and where you got them from. Take time to evaluate and examine them. Notice and realize what beliefs you’re emotionally attached to and what beliefs are tied to your identity.
The most important beliefs to evaluate are our own. Awareness is important and it’s the first step for any positive change, especially learning critical thinking and skepticism.
Keep in mind that most of the beliefs we hold and the decisions we make are not the product of objective reasoning about facts and probability. Our beliefs tend to be shaped by the information we came across first and by whatever gave us the strongest emotional reactions.
When we are learning, we want to avoid attempting to prove ourselves right and instead seek truth and accuracy.
We want to avoid engaging in motivated reasoning, which is when we do whatever we can to justify and defend our beliefs at all costs. Motivated reasoning is often triggered by cognitive dissonance, the psychological discomfort we experience when facing information that contradicts a belief we hold.
This makes learning, unlearning and changing our beliefs to a more accurate representation of reality difficult. The more strongly and emotionally held beliefs are, the harder they are to change even with disconfirming facts and evidence.
We also tend to start with conclusions that we desire and then reverse engineer our learning process to support them. This is called rationalization.
Consider that you hold false beliefs, likely many, and that you hold opinions on things that you know very little about.
“An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge.” ― David Dunning
“The problem with the world is that the intelligent people are full of doubts, while the stupid ones are full of confidence.” — Charles Bukowski
“The fundamental cause of trouble in the world today is that the stupid are cocksure while the intelligent are full of doubt” — Bertrand Russell
“Ignorance more frequently begets confidence than does knowledge.” — Charles Darwin
“The greatest enemy of knowledge is not ignorance—it is the illusion of knowledge.” — Daniel J. Boorstin
The Dunning-Kruger effect is a cognitive bias in which people with low ability and knowledge tend to overestimate their knowledge and abilities.
The Dunning-Kruger effect is not something that only applies to dumb people. It applies to everyone. We are all incompetent at not just something, but many things.
To avoid the Dunning-Kruger effect, humility is essential. Realize that you may not actually know as much as you think you do especially if you lack extensive expertise, education and experience in something.
The more humble we are in our knowledge and abilities, the more likely we are to continue learning and gain experience and expertise and avoid the peak of the Dunning-Kruger effect known as Mount Stupid.
Don’t just rely on one podcast or YouTube video or article. Read actual books. Seek out and listen to the experts rather than giving too much merit to your uninformed opinion.
Continue learning and be open to it. This is also a way to avoid the Dunning-Kruger effect. Realize that unlearning can also be a part of the process. You may have learned the wrong information or the facts may have changed. It’s important to continue updating your knowledge and beliefs.
Practice self-awareness in your knowledge and abilities. Don’t underestimate what you don’t know. Remember that you don’t know what you don’t know either.
Doubt is a sign of intelligence. True intelligence is a balance of curiosity, openness to learning, and skepticism.
Most of us don’t enjoy realizing we’re wrong. It can be painful and uncomfortable. Being wrong, however, is a big part of learning and growth. Science evolves and advances through testing and proving wrong.
The ability to embrace being wrong allows us to be open to new information and update our beliefs to more accurately match reality and truth.
Value truth over being right or confirming your beliefs. Instead of seeking to be proven right, seek to know and be willing to be wrong.
Try not to identify with your beliefs. We are less likely to change our beliefs even when faced with facts and evidence when our beliefs are tied to our identity and sense of self.
Also, take pride and pleasure in being able to change your beliefs when the information changes. Don’t view it as a weakness or a character flaw that you were once wrong or that you’ve changed your mind.
Confirmation bias is the tendency to seek out and only remember information that confirms our beliefs, and reject information that contradicts them.
Most people don’t want to know. They want to believe. It’s more comfortable to continue to believe what we already believe especially if these beliefs are tied to our identity.
To avoid confirmation bias, seek out information to disprove your initial beliefs. This is what scientists do. They attempt to disprove hypotheses rather than prove them.
We tend to stick to the first piece of information we receive. This is why many of us tend to have the same political and religious beliefs as our parents.
The first thing we learn influences how we learn and make decisions in regards to that information and topic. We like to think that we think independently, but we are incredibly influenced by many factors. One of them being who and what information got to us first.
Take into account that what you may have learned first may not be true or accurate and that it’s influencing your thinking and decisions going forward.
We also tend to remember best the information we were exposed to first, which is called the primacy effect.
Illusion of Explanatory Depth
We tend to think we know how the world works. The illusion of explanatory depth shows that we believe we understand how things work even when we don’t. We overestimate our understanding of even the everyday technology we use, such as a toilet or a bicycle.
We tend to formulate an opinion even with very little information.
A study by Dan Kahan found that when people who knew nothing about nanotechnology and had no opinion on it were given two vague sentences describing the complex and esoteric technology, nine out of ten were willing to offer strong opinions.
A little bit of information, even when it is barely informative, can make us strongly opinionated.
Again, it’s important to practice humility and self-awareness when it comes to what we think we know and understand.
“Perhaps we should occasionally stop and say to ourselves, ‘You know, maybe I have absolutely no idea what I’m talking about.’” ― Emma Jane & Chris Fleming
We should also learn more about a topic before forming an opinion on it, especially if we don’t know much about it to begin with.
“A little learning is a dangerous thing; drink deep, or taste not the Pierian Spring; there shallow draughts intoxicate the brain; and drinking largely sobers us again.” ― Alexander Pope
Illusory Truth Effect
We are more likely to believe something when we hear or see it repeatedly, frequently and consistently regardless of whether it is true or false.
Marketers take advantage of this with constant repeated advertisements.
Knowledge does not protect against illusory truth. Studies show that even for claims that people already knew were false, repeated exposure to the false claim increased belief in it.
Repeated falsehoods can erode our certainty even about things we know to be facts.
We also tend to like and agree with what we’re already familiar with.
The Halo Effect
We unfortunately do not live in an informed society. We live in an influenced society that idolizes celebrities and social media influencers.
Marketers are well aware of the halo effect and take advantage of this with celebrity endorsements and influencer marketing.
The halo effect is a cognitive bias in which we tend to let one admirable trait about someone influence our beliefs about their other traits.
This causes us to overvalue the opinions and advice of people just because they are wealthy, famous, successful, etc. We even tend to perceive attractive people as smarter, funnier and more likable than less attractive people.
We may also think that if someone is successful in one area, they must be successful or skilled in other areas. The halo effect can thus lead us to the mistake of taking someone’s advice in areas outside the one in which they are actually successful, educated or knowledgeable.
The halo effect is common in the self-help, business and wellness communities. We think that if someone is successful in something, if we copy everything they do, then we can experience the same results that they can.
We will become rich like them. We will be successful like them. We will become fit and healthy and look like them.
This type of thinking underestimates chance and other factors that come into play, but we don’t consider or know about.
To avoid the halo effect, first and foremost, don’t take health, nutrition or medical advice from celebrities and influencers.
Take the advice of any ‘celebrity expert’ with a grain of salt. This goes for health, money/business, and dating. Consider where their advice is coming from. Is it advice from them specifically as an expert/authority, or advice offered just because it worked for them? Or is it advice that is backed by science, evidence and research? I tend to look for and pay more attention to the latter, and I encourage you to do so as well.
Just because something is popular or endorsed by your favorite celeb(s) doesn’t mean it’s good. It could even be harmful.
Remember someone can be intelligent, successful, confident, wealthy and famous and still be wrong, lack critical thinking, have flawed logic, make bad decisions, etc.
The Halo Effect can lead to poor logic known as argument from authority.
Argument from Authority
Argument from authority is the claim that something is true or valid just because an authority figure supports it. Variants include appeals to popular opinion, to celebrity, or to an implied authority.
There are common confusions and misconceptions about the argument from authority fallacy.
Keep in mind that it is completely valid and legitimate to consider the expertise and experience of an individual when examining their claim.
Claiming solid scientific consensus is an argument from authority is a misuse of the logical fallacy. Solid scientific consensus is built upon a large amount of evidence, often continuously examined and argued for a long period of time.
When it comes to areas that are out of our expertise, we should trust those who have education, experience, expertise and authority in that particular area. Even in areas of our expertise, we still look to those who know more than we do or who are more advanced and skilled than we are.
Oftentimes we don’t have the expertise, experience, and resources to properly evaluate evidence and research.
Expertise, education, authority and knowledge should be taken into account and taken seriously, but what matters most is the data and evidence behind the expertise.
Reliance on Gut Feelings
Our gut is often wrong. Our gut, or intuition, is just feelings, and those feelings are not necessarily accurate or helpful. Feelings and intuition help us act quickly and make snap decisions, but they also encourage lazy, shortcut thinking. Daniel Kahneman, who won the Nobel Prize for his research on judgment and decision making, calls this fast, intuitive mode of thinking System 1. System 2 is the slower, more deliberate kind of thinking.
Intuition is helpful when we are in danger and have to make quick decisions, but we should not rely on it when more deliberate thinking is necessary or when we lack expertise. Gleb Tsipursky claims that businesses make their most frequent and biggest mistakes when their decisions rely on gut feelings.
The affect heuristic shows that we tend to make poor decisions and ignore odds because of our gut feelings.
Logic does not come to us quickly and it definitely does not come to us intuitively. Our intuitive mind is terrible at math and tends to simplify. Remember that your feelings are not stronger than data and evidence.
Instead of relying on our feelings and quick intuitive judgments, it’s important to take the time to slow down our thinking into more deliberate and thoughtful evaluation.
Avoid reacting especially when information or news is emotionally triggering. Your instinct may be to share. Take the time to stop and evaluate the information first. Misinformation spreads quickly especially when it triggers strong emotional reactions such as outrage. We’ll cover how to detect misinformation and fake news.
The Overconfidence Effect
“The confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story the mind has managed to construct.” ― Daniel Kahneman
Studies suggest that overconfidence leads to more inaccuracy and mistakes. Confidence can stem from intuitive thinking which clouds our better judgement and leads to poor decisions and junk learning. The more confident and certain we feel we are, the more wrong we tend to be.
The more confident we are, the more we think we can trust ourselves. The same goes for when we notice other confident individuals.
Overconfidence can feed the halo effect: we listen to misinformation, bad advice or pseudoscience just because it’s delivered confidently, because the speaker appears to be a confident individual, or because they declare that they are incredibly confident in their argument (which sometimes is the only evidence behind it). Overconfidence can actually be contagious and lead to groupthink as well.
As shown in the Dunning-Kruger effect, confidence is not a reliable or telling measure of our knowledge or skills. The more confident we feel, the more skeptical we should be.
Instead of just going off of confidence or a feeling, look for strong, factual evidence to support claims. Practice self-awareness about your abilities, knowledge, education, expertise and skills.
False Equivalence & Equality Bias
Studies show that we weigh opinions equally regardless of expertise and experience. We may think we are practicing critical thinking and being open minded when we think or say that different viewpoints should be given equal merit.
We also tend to do this in our own discussions when we are more competent and knowledgeable than the other person. This is called the equality bias.
Researchers have found that those who are less capable gave more merit to their own views than those of the more capable person. Meanwhile, those who were more capable were more accepting of the less capable person’s views even when they were clearly wrong.
False equivalence is when we claim that two opposing arguments are equivalent when they are not. We’ll touch on this further when it comes to the role the media can play in this flawed logic.
Seeing “both sides” of an argument as equal when they are in fact not points to a lack of critical thinking and flawed logic. Misinformation and pseudoscience gain strength when we grant them validity for the sake of being fair and polite, mistakenly believing that we’re practicing critical thinking.
If two people come to contradictory conclusions about a fact, then one or both of them must be wrong; they cannot both be right.
Not all information or opinions are equal. Information, advice and opinions from experts should be given more weight than those from non-experts. Arguments, information, advice and opinions backed by facts and evidence should also be given more weight.
When we don’t know the answer or can’t be certain about something, it’s better to weigh the pros and cons, the potential gains and losses, and err on the safer of the two options. Even when we have our doubts, it’s better to take extra precautions to minimize risk. At worst, we turn out to be wrong and to have overreacted.
Here are some relevant examples of Pascal’s wager.
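This kind of precautionary reasoning can be made concrete with a toy expected-value calculation. All probabilities and costs below are made-up numbers, purely for illustration:

```python
# Toy expected-value comparison: take a cheap precaution, or risk a rare
# but catastrophic loss. All numbers are hypothetical.

def expected_cost(p_disaster: float, cost_disaster: float,
                  cost_precaution: float, take_precaution: bool) -> float:
    """Expected cost of a choice, assuming the precaution fully prevents the loss."""
    if take_precaution:
        return cost_precaution
    return p_disaster * cost_disaster

# Even at a modest 1% chance of a 100,000-unit loss, a 100-unit precaution
# is the cheaper bet on average.
with_precaution = expected_cost(0.01, 100_000, 100, take_precaution=True)
without_precaution = expected_cost(0.01, 100_000, 100, take_precaution=False)
```

On average, the cheap precaution costs far less than gambling on the catastrophe, even though on any single occasion the catastrophe will usually fail to materialize and the precaution will look like an overreaction.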
The normalcy bias shows that during a crisis, we actually tend to be abnormally calm and pretend everything is normal.
Research on crises and catastrophic events shows that the people who survive are those who are quick to take action and adapt to change. It’s better to overreact than to underreact. Prepare for the worst.
Occam’s Razor
“Extraordinary claims require extraordinary evidence.” ― Carl Sagan
Occam’s razor is a fundamental rule of thumb which states that the explanation requiring the fewest assumptions is usually the right one. In other words, we should avoid making unnecessary assumptions. The more we assume, the less likely something is to be true.
People often make the mistake of thinking that applying Occam’s razor means the more conceptually simple hypothesis is the true one. A conceptually complex explanation can still be the “simplest” in the sense of making the fewest assumptions, and it can also be the most accurate.
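One way to see why extra assumptions matter: if we treat each independent assumption as, say, 90% likely to hold (an arbitrary figure, purely for illustration), the probability that an explanation is entirely correct shrinks multiplicatively with each assumption it adds:

```python
def prob_all_assumptions_hold(p_each: float, n_assumptions: int) -> float:
    """Probability that every one of n independent assumptions holds,
    assuming each holds with probability p_each (illustrative numbers only)."""
    return p_each ** n_assumptions

# An explanation resting on 2 assumptions vs. one resting on 10:
lean_explanation = prob_all_assumptions_hold(0.9, 2)     # 0.81
loaded_explanation = prob_all_assumptions_hold(0.9, 10)  # roughly 0.35
```

Each added assumption is another chance to be wrong, which is exactly why the razor penalizes assumption-heavy explanations rather than conceptually complicated ones.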
Anecdotal Evidence
An anecdote is a story or personal experience that is used as evidence. Anecdotes are considered the weakest form of evidence.
“Anecdotal evidence leads us to conclusions that we wish to be true, not conclusions that are actually true.” ― Barry Beyerstein
Anecdotal evidence is common especially for pseudoscience and quackery. “It worked for me so it should work for everyone.”
Anecdotal evidence is unreliable because it depends on one person’s (or a few people’s) experience, often in contradiction to scientific evidence. As explained above, our perceptions, thinking and memories are incredibly flawed.
Anecdotal evidence also fails to consider other factors. Let’s say we use a skin cream and our acne clears up. We fail to consider other factors that may have contributed or caused the clear skin.
Anecdotal evidence can lead to the post hoc ergo propter hoc fallacy: A happened before B, therefore A caused B. Correlation does not prove causation.
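A quick simulation shows how a hidden common cause can make two things correlate strongly even though neither causes the other. The data and the skin-cream framing here are hypothetical, generated purely for illustration:

```python
import random

random.seed(42)

# A hidden confounder z drives both a and b; a and b never influence each other.
n = 2000
z = [random.random() for _ in range(n)]
a = [zi + random.uniform(-0.1, 0.1) for zi in z]  # e.g. "used the skin cream"
b = [zi + random.uniform(-0.1, 0.1) for zi in z]  # e.g. "acne cleared up"

def pearson_r(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sx = sum((xi - mx) ** 2 for xi in x) ** 0.5
    sy = sum((yi - my) ** 2 for yi in y) ** 0.5
    return cov / (sx * sy)

r = pearson_r(a, b)  # strong correlation, zero causation between a and b
```

An anecdote only ever sees that a and b moved together; it never sees z, which is the factor actually doing the work.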
Our brains naturally pay attention to anecdotal evidence rather than the actual science, math, data, or statistics. We prefer stories over cold, hard numbers. We are more likely to remember and be influenced by the story of a shark attack than by the statistics showing the unlikeliness and rarity of shark attacks.
Look for data and strong scientific evidence, such as peer-reviewed material. Avoid relying on anecdotal evidence, and take your own personal experience with a grain of salt.
The Just-World Fallacy
This is something that can be seen in self-help, but more so on the spiritual side. This is harmful as it can turn into victim blaming very quickly. It also causes people to dangerously underestimate risk. “That could never happen to me.”
The just-world fallacy is the idea that bad things don’t happen to good people, or to smart and educated people; that people deserve whatever happens to them.
We like to believe that the world is fair and just because it makes us feel comfortable and safe. This is why belief in karmic justice is so prevalent.
It’s important to not underestimate chance, factors that are out of our control, and what Nassim Taleb coined black swans which are unlikely, unexpected and unpredictable but still plausible events with extreme consequences.
We even like to believe that we have control over outcomes that are random or too complex to predict which is called the illusion of control.
Bad things happen to good people. Good things happen to those who we may deem undeserving. The world is not fair and there is only so much in our control.
Conspiracy Theories
Now, let’s take a look at a big catalyst of misinformation, one that illustrates poor logic in many forms: conspiracy theories.
Our brains are wired to see patterns. They’re also naturally inclined to look for and find meaning. Conspiracy thinking, however, takes this cognition to the extreme.
(I should note that when discussing conspiracy theories, I am referring to grand conspiracy theories. Technically speaking, conspiracy theories are theories about conspiracies, and small-scale conspiracies most definitely occur all the time; no one doubts that. Grand conspiracy theories claim that the state of the world or nation is due to the secret actions of a small but powerful and evil group: the government, a corporation, a medical group, scientists, a religious or ethnic group, etc.)
Conspiracy theories become especially popular and common during times of crisis and uncertainty. A pandemic certainly fuels conspiracy theories. Paranoia and panic are common.
Grand conspiracies ease our fears of a random, chaotic universe. Believing that some entity, whether the government or aliens, is in control in the grand scheme of things is more comforting than believing that no one is in control at all.
Alan Moore said it best: “The main thing that I learned about conspiracy theory, is that conspiracy theorists believe in a conspiracy because that is more comforting. The truth of the world is that it is actually chaotic. The truth is that it is not The Illuminati, or The Jewish Banking Conspiracy, or the Gray Alien Theory. The truth is far more frightening – Nobody is in control. The world is rudderless.”
Those who tend to believe conspiracies are often those who feel powerless and disenfranchised. Conspiracy thinking correlates with feelings of isolation and helplessness.
Minorities are more likely to believe in conspiracy theories than non-minorities. A 2012 survey found that the more someone believed conspiracy theories, the less likely they were to vote and be politically involved.
Ironically, believing in conspiracy theories can make those feeling powerless feel more powerful and in control. Conspiracy theories fill a need for certainty, understanding and explanation when there is uncertainty, confusion, ambiguity and unsatisfying information. This is called compensatory control.
The tendency to interpret events as the product of deliberate intent rather than of natural forces or random chance is called hyperactive agency detection.
Researchers have found that people who believe conspiracy theories tend to be a little more hostile, cynical, paranoid and disagreeable than people who dismiss conspiracy theories. Conspiracy theories also appeal to those who are more narcissistic and slightly less educated.
People who are less satisfied with life in general are more likely to believe conspiracy theories.
That’s not to say that these traits apply to all conspiracy theorists. We are all vulnerable to conspiracy thinking.
The just-world fallacy can play a part in conspiracy thinking. We don’t want to believe that bad things can happen by chance. We like to think that there is intent behind everything, that everything happens for a reason and that everything is controlled.
Realizing that things can happen to us by chance or by accident and that the world is messy and chaotic is deeply unsettling.
Conspiracy theories are notorious for flawed logic. They violate Occam’s razor as they are riddled with unnecessary and invalid assumptions.
Conspiracy theorists underestimate how difficult it would be to pull off a grand conspiracy involving hundreds of thousands of people with different motivations without a single whistleblower. The competence and power of a large secret group are vastly overestimated.
“The government kinda sucks at keeping secrets.” ― Bill Nye
Grand conspiracies are mathematically impossible. They are too big not to fail. The larger and more complex the conspiracy theory is, the higher the probability of failure. Grimes’s mathematical model conservatively predicts that grand conspiracies would fail within about four years. Realistically, they would fail far more quickly.
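Grimes’s result can be sketched with a toy version of his failure model: the probability that at least one of N conspirators exposes the plot within t years is L(t) = 1 - e^(-t·φ·N), where φ is a tiny per-person annual chance of a leak. The φ value below is an illustrative assumption, not Grimes’s fitted estimate:

```python
import math

def exposure_probability(n_conspirators: int, years: float,
                         phi: float = 4e-6) -> float:
    """Probability that a conspiracy of n people is exposed within `years`.

    Toy version of Grimes's failure model L(t) = 1 - exp(-t * phi * n),
    where phi is the per-person annual chance of a leak. The default phi
    here is an assumption chosen for illustration, not Grimes's estimate.
    """
    return 1.0 - math.exp(-years * phi * n_conspirators)

# A "moon landing hoax"-scale conspiracy (~400,000 people) becomes
# overwhelmingly likely to leak within a few years, even at this tiny
# per-person leak rate; a small plot of 100 people can plausibly hold.
big_plot = exposure_probability(400_000, years=3)
small_plot = exposure_probability(100, years=3)
```

The point of the model is the scaling: failure probability grows exponentially with both the number of conspirators and the time the secret must be kept.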
“The sane understand that human beings are incapable of sustaining conspiracies on a grand scale, because some of our most defining qualities as a species are inattention to detail, a tendency to panic, and an inability to keep our mouths shut.” ― Dean Koontz
Conspiracy theories are not grounded in reality. There is no solid evidence to support them; if there were, they would not be conspiracy theories. By definition, conspiracy theories are unproven.
Conspirators are always exactly as competent as is convenient for the theorist: powerful and secretive, except when they slip up and reveal just enough “proof” of the conspiracy. As Loren Collins put it, the conspiracy tends to be “exactly as competent and powerful as the conspiracy theorist needs it to be.”
Conspiracy theorists partake in what is called anomaly hunting. Anomalies are strange details that cannot be immediately accounted for or explained. Coincidences are actually incredibly common, not rare at all, and our brains are wired to notice them. If you look hard enough, you’re bound to find anomalies anywhere. Nothing is perfect; weird things happen all the time.
We tend to ignore random chance when something seems meaningful, or when we want a random event to have meaning. This is called the Texas sharpshooter fallacy. Relatedly, apophenia is the tendency to perceive meaningful patterns in randomness, a refusal to accept chaos, noise, coincidence and chance.
When we are unsure of something, we are more likely to accept strange explanations. The argument from ignorance is an argument that a belief or claim is true because we don’t know that it isn’t true.
The problem with conspiracy theories is that no amount of evidence can “disprove” them, because all contrary evidence is dismissed as fabrication, just another part of the conspiracy. Anyone arguing against a conspiracy theory is dismissed as either brainwashed or part of the conspiracy (an ad hominem attack).
Conspiracy theories use black and white thinking as well as an us vs. them narrative. Us vs. the enemy. This is dangerous because they can push extremists to more extreme and potentially violent directions toward the identified enemy. Conspiracy theories often play into the ultimate fantasy of good versus evil.
Conspiracy theories cause harm when they distract from real and legitimate issues and social change. Fighting an imaginary issue or enemy helps no one. They also contribute to dangerous and harmful misinformation and pseudoscience that hurts many, not just the conspiracy theorists.
When conspiracy theories and misinformation take over, truth in society is no longer valued and trust is diminished. This becomes a disaster for a healthy democracy.
To avoid conspiratorial thinking, it’s important to have respect for reality and value facts over assumptions. Realize that the world is chaotic and messy. Take the time to make sure the information you come across is based in fact and not in conspiracy theories. Believing that you have power and autonomy in your life and society can help reduce the helplessness common with conspiracy thinking. Taking action and being involved in politics and your community can also help. Use Occam’s Razor: prefer the explanation that requires the fewest assumptions.
Blind Distrust and Scapegoating of the Media
It has become incredibly common to blame and scapegoat “the media” for nearly everything. Scapegoating the mainstream media is cynical and unhelpful. Yes, the media can be incredibly flawed which I’ll get to, but the media is made up of diverse content and people with completely different motives. Media is defined as different means of communication that reach people widely. This includes books, television, radio, newspapers, magazines and even blogs like this one.
This kind of thinking is simplistic and overgeneralized. It’s easy to just blame the media or blindly criticize and distrust it altogether.
Scapegoating the (mainstream) media can go hand in hand with conspiracy thinking and the spread of misinformation. It’s treating the media as a single entity and enemy.
Knowing how to decipher good, reliable and accurate journalism and media from the bad is where the skill of media literacy, critical thinking and scientific skepticism come into play.
Most media are biased. Because humans are all biased and the media is run by humans, bias is nearly impossible to avoid. However, some media sources are more biased than others. Bias is not necessarily a bad thing, but heavy or extreme bias can lead to an agenda, which in turn can lead to distortion of the truth, cherry-picked data, or a pushed narrative that may not be realistic or accurate.
Here is a helpful chart that shows the bias and factual reporting of many popular different news sources.
Here are some resources for checking bias and factual reporting on individual sources:
The Design of the Internet & Echo Chambers
Mixed in with confirmation bias and the illusory truth effect, the algorithm of many social media channels can create an echo chamber and feedback loop of harmful misinformation.
People will often blame and scapegoat social media and the internet when it comes to misinformation, but the design of the internet and social media is just one part of the complex problem of misinformation and conspiracy theories. Misinformation thrives where a lack of critical thinking and media literacy meets social media echo chambers.
False or harmful beliefs are reinforced through the algorithm.
False Balance or Bothsidesism
False balance is similar to false equivalence, but it refers specifically to poor journalism and reporting.
Under the guise of “fair reporting”, journalists present both sides of an issue as equally credible even when the evidence overwhelmingly supports one side. This practice is deceptive and misleading, and it undermines honesty and accuracy.
It happens when an influential movie celebrity or a spiritual guru with millions of fans and followers is given equal credibility alongside a CDC scientist. This contributes to the spread of misinformation and pseudoscience.
Make sure to read the entire article, not just the headline. Keep in mind that headlines may be clickbaity and exaggerated. They can also be deceptive, misleading and sometimes just plain false.
Avoid getting news from social media. Research shows that those who mainly get their news from social media are less informed and knowledgeable.
Take note of whether something is an opinion piece. Opinion pieces are not held to the same factual standards as news reporting and tend to be more biased.
Take the format of the content into account. Be wary of information delivered in a way that is not easy to fact check, such as YouTube videos, documentaries and speeches. These can be less reliable sources, especially if they lack reputation information, expertise, or valid references and sources.
Always fact check. Especially before sharing anything.
Red flags to look out for in media:
The source or creator of the content claims to have all the answers or claims to be the only source of truth
Scapegoating and criticizing the mainstream media is common
Claims are repeated without information and evidence to back up the claims
There is a lack of references and sources for claims
There is little to no information about the source or the creator of the content
The source has little to no reputation information
The source has a negative reputation
To check a source, use the media bias resources listed above, or google “[source name] media bias” or “[source name] factual reporting”
The headline is clickbaity, exaggerated or appealing to emotions
The content appeals to your emotions, especially outrage. Be aware of your reactions; fake news stories try to provoke strong emotional reactions
The content makes conspiracy-theory claims. Often, conspiracy theories are disguised as “just asking questions” (or “JAQing off”)
The content is making extraordinary claims or promises
The content is mysterious or claiming mysteries (“scientists can’t explain this”)
Excessive punctuation, all capital letters and multiple spelling and grammar errors
Always Double Check Before You Share!
“Falsehood flies, and truth comes limping after it, so that when men come to be undeceived, it is too late; the jest is over, and the tale hath had its effect: like a man, who hath thought of a good repartee when the discourse is changed, or the company parted; or like a physician, who hath found out an infallible medicine, after the patient is dead.” ― Jonathan Swift
Help stop the spread of harmful misinformation and conspiracy theories by fact checking before you share anything.
Value truth and honesty over confirming your beliefs. Be vigilant especially with content that you already agree with. Avoid the temptation of sharing before checking for accuracy, truthfulness, and honesty.
Here are some fact-checking resources, some of which I already shared above.
Sign The Pro-Truth Pledge
To combat misinformation and bring back truth into society, I invite you to join the Pro-Truth Pledge.
The Pro-Truth Pledge asks you to commit to 12 behaviors that promote truth and honesty in society and media.
I’ve signed and I encourage you to as well! You can also see the organizations and public figures including government officials who have signed.
After signing, share the pledge and encourage others to sign as well.
While learning critical thinking and scientific skepticism can be humbling and it’s important to practice humility, take pride in gaining agency and autonomy by deciding to take the time to learn these skills. It can certainly be empowering!
Keep in mind that critical thinking is a process, a habit and a skill. It’s not something you read about once and have forever. It takes time. It takes practice. It takes continuous learning. And it takes attention and energy. It’s literally work for your brain. We’ve only scratched the surface here.
I encourage you to continue your learning and education by checking out the recommended reading list below. I may be a bit biased, but I find this topic fascinating.
Please make sure to share this article to help others learn critical thinking and stop the spread of misinformation.
References and Recommended Reading