22 May 15
Why you shouldn’t believe everything you read.
You’d like to think you’re a rational human being – that you’re able to assess new information in a completely logical way. In fact, the opposite is often true. But don’t worry, you’re not alone – we’re all the same. Let me explain.
There’s lots of advice out there in the online marketing world:
- “You should blog 2-3 times per week at a minimum.”
- “Make sure you include a hashtag in every tweet.”
- “Your landing page will convert better if it has a video.”
- “Never publish your best content on someone else’s website.”
We hear this kind of advice bandied about all the time. In reality, advice like this applies in some contexts, but not in others.
You’d think you’d know this, and most of the time, you probably do.
But sometimes you read an article that contains new, controversial, or subjective information, and the way you interpret that information is anything but logical.
Sometimes, you fall into the trap of believing things that may not be completely accurate, and you don’t even know you’re doing it.
What’s going on?
What you believe and the decisions you make are often influenced by cognitive biases (such as the streetlight effect), heuristics and logical fallacies that shape the way you think. Let’s look at a few examples.
Confirmation bias
You tend to have preconceived assumptions about how things work. When you hear information about a particular topic – it could be a story or any other type of information – you tend to pay attention to the parts that confirm your existing beliefs, and ignore anything to the contrary.
This is called confirmation bias.
Example: Imagine you believe that long-form landing pages convert better than short-form landing pages. You come across an article that provides various case studies on the topic, all of which support long-form landing pages. In the article’s comments section, many of the comments call out the article as bad advice, citing examples in which short-form landing pages are better. When you pay attention to the information in the article (as this confirms your beliefs), and ignore the arguments in the comments, that’s confirmation bias at work.
The availability heuristic
What causes the most human deaths – sharks, or cows?
If you said “sharks”, you may have fallen foul of the availability heuristic. This is a mental shortcut that operates on the notion that if something is easily recalled, it must be more important than alternative solutions, options or explanations that are not as readily recalled.
In simple terms, when choosing from a set of options, you’re most likely to choose the option for which you have the most information.
Your instinct to say “sharks” comes from the fact that media coverage of shark attacks is extensive (making them easy for you to recall). Cow-related deaths get virtually no media coverage, even though cows cause significantly more deaths than sharks do.
Example: Imagine you’ve decided it’s time for a new CRM system. Your old system is dated, and doesn’t have the features you need. You keep seeing blog articles and reviews about a particular CRM provider, and decide to go with them. In reality, there’s another system you know about that could be perfect for your requirements, but you went with the one that was talked about the most. The availability heuristic influenced your decision, nudging you towards the option that was easiest to recall.
The argument from authority
The status and credentials of an individual greatly influence your perception of that person’s message. If a person is known to be an authority on a topic, you’re more likely to believe that person’s comments on the topic.
You’re probably thinking this makes a lot of sense. It does – it’s perfectly natural and logical to trust an expert within their field. This is called the argument from authority (often referred to as the appeal to authority).
Frequently, however, the argument from authority becomes a logical fallacy: you believe what that person has to say on topics outside their scope of expertise. Whenever you see a major sports personality in a TV advertisement promoting shampoo, or cars, or energy drinks, the advertiser is trying to tap into this appeal to authority.
Example: Imagine you follow an author who is well known in the industry for her articles about Google AdWords. You’ve been following her for a number of years, and her articles about Google AdWords are always highly informative. She is truly an authority on the topic. But then one day, she publishes an article about iPhone app development. Do you believe her advice on this topic? If so, you may be subject to a fallacious argument from authority.
The argument from ignorance
Do you believe in aliens? Or reincarnation? Or the Loch Ness Monster? If you answered either “yes” or “no” to those questions, your answer may be subject to a logical fallacy referred to as the argument from ignorance (often referred to as the appeal to ignorance).
The argument from ignorance occurs when you decide something is true (or false) because you can’t find evidence to the contrary.
When thinking about the existence of aliens, your answer isn’t limited to two options: “yes” and “no”. There are actually four options: “yes”, “no”, “unknown” (we don’t know at the moment), and “unknowable” (we’ll never know).
The argument from ignorance exists when you argue for or against a claim, even though there is no evidence to prove or disprove that claim.
Example: For those of you who work in SEO, you’ll have experienced this plenty of times. As you know, a large part of how search engine algorithms work is kept under wraps. While we have a pretty good idea of how things work, much of it is based on theories, educated guesses and experience (“it worked for these 20 websites, so it must work for this one, too”). But there are often those who claim to have cracked the code. Any time you hear someone proclaim unequivocal understanding of how something affects search engine rankings (even though you know it’s just a theory), that’s a good example of an argument from ignorance.
Subjective validation
Consider the following passage of text:
You have a need for other people to like and admire you, and yet you tend to be critical of yourself. While you have some personality weaknesses you are generally able to compensate for them. You have considerable unused capacity that you have not turned to your advantage. Disciplined and self-controlled on the outside, you tend to be worrisome and insecure on the inside. At times you have serious doubts as to whether you have made the right decision or done the right thing. You prefer a certain amount of change and variety and become dissatisfied when hemmed in by restrictions and limitations. You also pride yourself as an independent thinker; and do not accept others’ statements without satisfactory proof. But you have found it unwise to be too frank in revealing yourself to others. At times you are extroverted, affable, and sociable, while at other times you are introverted, wary, and reserved. Some of your aspirations tend to be rather unrealistic.
Do you think that passage of text accurately describes you? Most people would respond “yes”. If you’re in that camp, you’re experiencing something called the Forer effect.
This passage of text was written by psychologist Bertram R. Forer to investigate aspects of subjective validation, a cognitive bias in which you consider a statement or another piece of information to be correct because it has personal meaning or significance to you.
Psychic readings, horoscopes, and even some types of personality tests – they all involve vague statements to which you attach personal significance, creating your own meaning. This is subjective validation at play.
Example: It’s pretty hard to come up with a specific example for this, because it actually happens almost every time you read an article. Whenever you read something, your brain is taking in the information and then trying to connect the dots between that information and your own personal experience, situation, or aspirations. Your brain wants to find a connection and will work hard to find one, even if there is no link. That’s subjective validation.
Conformity
Have you ever been in a team meeting where your manager asks the team a challenging or controversial question, and everyone nods in agreement, even though you know they disagree? If so, there’s a very good chance that everyone is conforming. You included.
You’ll happily follow the crowd if it means you won’t be challenged. This is called conformity.
Example: Imagine you keep coming across articles about Instagram, and how brands are benefiting from the platform. Everybody in your industry is talking about it. You really don’t think it is applicable to your business, but everyone else is doing it, so it must be a winner, right? Surely people will think you’re an idiot if you don’t jump on this bandwagon. So, you spend significant time and resources building up your profile, only to find you get zero engagement from the community. Maybe you did it wrong? Or maybe, just maybe, you got caught up in the hype and conformed to what everyone else was doing, even though your instinct told you it wasn’t going to work.
Problems, big and small.
It’s not hard to see how these biases, heuristics and fallacies can be problematic. In some situations, poor decisions can be inconsequential. But in others, the consequences could be catastrophic.
Blindly believing everything you read can cause you to make poor strategic decisions, waste time on tactics that don’t matter, ditch tactics that actually work, write crappy blog articles that your audience don’t care about, work with the wrong agencies, take on the wrong customers… you get the point.
What to do about it.
It’s not all doom and gloom. Cognitive biases, heuristics and logical fallacies are an everyday part of life. In many situations they are useful, and can even be a survival instinct. We’re all subject to them.
That being said, it is possible to reduce the impact of these fallacies and make smarter decisions. Simply being aware of them is a good first step. But there are other things you can do, too.
Next time you read an article that contains new, controversial, or subjective information, try the following:
- Take a step back, and think about it logically. Have an open mind about what you’re reading. Don’t just jump straight to your first conclusion.
- Read it again, and read it slowly. Like a lot of people, you probably scan articles rather than read them word-for-word. Stop, go back to the top, and read it again. Don’t let your mind jump to conclusions. Take in the words.
- Sleep on it. Come back to the article tomorrow, or in a few days’ time, and read it again.
- Ignore who wrote it. Try to imagine you’ve never heard of the author before, or read any of their articles. Pretend you are reading an article by a completely new author. Assess the article based on its content, not on who wrote it.
- Try to disprove it. Even if you initially agree with the article’s point of view, ask yourself why it could be wrong. Scientists try to do this all the time; it’s an important part of the scientific method. Don’t just try to prove something, try to disprove it, too.
- Assess the credibility of the article. Does the article contain references to other articles? Are those articles credible? Is the article based on someone’s opinion, or valid scientific research?
- Do your own research. Find other articles to confirm or challenge the perspective.
- Speak to other people and see what they think. Note: be aware of their biases when listening to their responses.
One last thing.
This article is something of a paradox. While part of me wants you to believe everything you just read and then share this article on your favourite social network, there’s a chance you could be interpreting the above advice in an illogical, irrational way. Heck, there’s even a chance that in researching this article I’ve fallen foul of my own biases, and parts of this article could be inaccurate. I don’t think that’s the case, but nobody is perfect.
How you interpret and apply this information is what matters. Take a step back, and think about it.