How our personal backgrounds and mental shortcuts change how we see the truth.
If two people look at the number '6' from opposite sides, one sees a 6 and the other sees a 9—who is lying? The truth is often less about the facts and more about where you are standing.
Your brain is a master of efficiency. Every second, it processes millions of bits of data. To keep up, it uses heuristics—mental shortcuts that help you make quick decisions. While helpful, these shortcuts often lead to cognitive biases. A bias is a predictable 'glitch' in your thinking that causes you to see the world inaccurately. Think of it like a filter on a camera: it doesn't change the scene, but it changes how the colors and details appear to the viewer. In philosophy, we study these to understand the gap between objective reality (the world as it is) and subjective perception (the world as we see it).
Quick Check
What is the primary difference between a 'heuristic' and a 'cognitive bias'?
Answer
A heuristic is the shortcut itself (the process), while a cognitive bias is the resulting error or tilt in thinking (the outcome).
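The process/outcome distinction can be made concrete with a toy simulation. One classic shortcut (the availability heuristic, used here as a standard illustration; the 30% figure and the news-story setting are purely hypothetical) judges how common something is by how easily examples come to mind:

```python
import random

random.seed(0)

# Hypothetical world: 30% of news stories are negative, 70% neutral.
world = ["negative"] * 30 + ["neutral"] * 70

def recall_heuristic(events, vividness=3):
    """The shortcut (the process): judge frequency by how easily
    examples come to mind. Vivid (negative) events are 'vividness'
    times easier to recall, so they are over-sampled from memory."""
    weights = [vividness if e == "negative" else 1 for e in events]
    sample = random.choices(events, weights=weights, k=1000)
    return sample.count("negative") / len(sample)

true_rate = world.count("negative") / len(world)  # the objective fact
perceived_rate = recall_heuristic(world)          # the bias (the outcome)

print(f"true: {true_rate:.2f}, perceived: {perceived_rate:.2f}")
```

The heuristic is the weighted recall itself; the bias is the predictable gap between `perceived_rate` and `true_rate` — the 'filter on the camera' in numeric form.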
1. You watch one video about a specific hobby.
2. The algorithm (a digital heuristic) shows you more of the same.
3. You begin to think everyone loves this hobby because you never see the opposite.
4. This is a 'filter bubble' that strengthens your confirmation bias.
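The feedback loop above can be sketched as a toy simulation (the topics, the 90% figure, and the recommender logic are all hypothetical, not how any real platform works):

```python
import random

random.seed(42)

def recommend(history, catalog):
    """A toy 'digital heuristic': favor whatever the user already watched."""
    if history and random.random() < 0.9:
        # 90% of the time, serve more of the most-watched topic.
        return max(set(history), key=history.count)
    return random.choice(catalog)

catalog = ["chess", "cooking", "astronomy", "gardening"]
history = ["chess"]  # step 1: one video about a specific hobby

for _ in range(20):  # steps 2-3: the loop feeds on its own output
    history.append(recommend(history, catalog))

# Step 4: the feed is now dominated by the first topic -- a filter bubble.
print(history.count("chess"), "of", len(history), "videos are about chess")
```

Because each recommendation is based on a history the recommender itself shaped, the first choice compounds: the 'opposite' topics are never shown, so the user never learns they exist.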
Quick Check
If you only read news that you already agree with, which bias are you practicing?
Answer
Confirmation Bias.
Philosophers use the term situatedness to describe how our unique background—where we grew up, our culture, and our experiences—acts as a 'home base' for our perspective. No one is a 'blank slate.' Because we are all situated differently, two people can witness the exact same event and have two different 'truths.' This doesn't mean facts don't exist; it means our perspective acts as a frame. To find the most objective truth, we must combine multiple perspectives, much like how a 3D image is created by combining two slightly different 2D views.
1. Three blind men touch different parts of an elephant.
2. One touches the trunk and says, 'An elephant is like a thick snake!'
3. One touches the leg and says, 'No, an elephant is like a tree trunk!'
4. One touches the side and says, 'You are both wrong; it is like a wall!'
5. Each man has a 'truth' based on his perspective, but none has the whole truth until they share their data.
How do we fix our biased lenses? The answer is metacognition, or 'thinking about your thinking.' By pausing to ask, 'Why do I believe this?' or 'What evidence would prove me wrong?', we can trigger our logical brain to override our biased shortcuts. This is often called intellectual humility—the recognition that your perspective is limited. Scientists use 'double-blind' studies to remove bias, and you can use 'perspective-taking' to do the same in daily life. If the probability of being right is p, we increase it by actively seeking out 1 − p (the probability we are wrong).
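Why does asking 'What evidence would prove me wrong?' help? Bayes' rule gives a rough answer: a test that passes no matter what barely moves a belief, while a test that could fail carries real weight. A small sketch (the probabilities here are purely illustrative):

```python
def bayes_update(prior, p_obs_if_right, p_obs_if_wrong):
    """Bayes' rule: how much an observation should shift a belief."""
    num = p_obs_if_right * prior
    return num / (num + p_obs_if_wrong * (1 - prior))

prior = 0.9  # we start out 90% sure we are right

# A confirming-only test: it passes almost regardless of the truth,
# so passing it barely moves the belief (confirmation bias in action).
after_weak = bayes_update(prior, 0.99, 0.95)

# A test that would likely fail if we were wrong. Passing it is
# genuinely informative -- and failing it forces a real revision.
after_strong_pass = bayes_update(prior, 0.99, 0.10)
after_strong_fail = bayes_update(prior, 1 - 0.99, 1 - 0.10)

print(round(after_weak, 3), round(after_strong_pass, 3), round(after_strong_fail, 3))
```

Only the risky test can ever produce the third number: the large drop in confidence that tells you your 1 − p has been found.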
1. You are working on a group project and everyone agrees on an idea immediately.
2. You recognize this might be 'Groupthink' (a social bias).
3. You intentionally assign one person to be the 'Devil's Advocate' to find every possible flaw in the plan.
4. By trying to 'break' your idea, you actually make the final result much stronger and more objective.
Which term describes the 'mental rules of thumb' our brains use to make quick decisions?
If a person only notices the 'bad' things their rival does but ignores the 'good' things, they are experiencing:
True or False: Being 'situated' means it is impossible for humans to ever be 100% objective.
Review Tomorrow
In 24 hours, try to explain the 'Blind Men and the Elephant' story to a friend and how it relates to the concept of perspective.
Practice Activity
The 'Steel Man' Challenge: Find an opinion you strongly disagree with. Try to write down the three strongest arguments for that opinion without using insults.