Opinions vs. assumptions
I once came across a tweet from Jason Fried, the founder of Basecamp, that read: “I used to think everyone had an opinion, but now I think everyone has an assumption.”
Until reading Jason’s tweet, I likely would have operated as if opinions and assumptions were the same thing.
Both exist in a place where definitive truth is not wholly clear or is unknown.
I can form an opinion about something without all the facts, and yet, for most topics, I can hold that opinion loosely if new evidence presents itself.
For example: I can have an opinion about the best place to get sushi in Denver, Colo., but I can change my opinion if a new, better restaurant captures my tastebuds.
Similarly, I can have an experience, and out of that experience, I can form an assumption about future events, despite not knowing exactly what’s to come.
If a contractor shows up late to my house three times in a row, I can reasonably assume that they’ll be late the fourth time, and so I might adjust my own schedule accordingly.
Therein lies the difference between an opinion and an assumption, and it has everything to do with how each one guides your actions and behavior.
And this speaks to what Jason was hinting at with his tweet.
An opinion is most often born out of your personal preferences, your unique taste, or your desires, and it carries the understanding that others will hold opinions different from your own.
An assumption, by contrast, takes your personal experiences and begins to hard-code them into your brain as fact, which in turn affects how you interact with others, letting snap judgments, stereotypes, and incomplete data guide your behavior.
Where you get into trouble is when you start believing your opinions are evidence enough for an assumption, thought, or idea to be true. That most often happens not when we’re talking about sushi restaurants or tardy contractors, but when we’re talking about issues of moral makeup.
When do you know you’re wrong?
Astronomer Neil deGrasse Tyson has said, “One of the great challenges in this world is knowing enough about a subject to think you’re right, but not enough about the subject to know you’re wrong.”
This is what’s known as the Dunning-Kruger effect: we tend to overestimate how much we know about a subject after gaining only a little knowledge of it. As we become more informed, we begin to lose confidence in what we know, because we’re exposed to just how much information and nuance the subject contains.
Finally, as we continue to gain knowledge, we come to understand what we were wrong about previously and can articulate with authority what is actually true about the subject.
However, even then, sometimes knowing you’re wrong isn’t enough to overcome biases and previously held opinions.
We live in a world where the phrase “alternative facts” has seriously passed across someone’s lips. This isn’t political commentary; I’m simply addressing a larger societal issue.
A fact is defined as “something that actually exists.”
Alternative facts are, quite literally by definition, make-believe. An alternative to what actually exists is fantasy, and therefore untrue.
But this is what happens when your opinions about morality seem to conflict with someone else’s view: the conflict fuels your assumptions about the other person and their intent, which leads to wildly different behavior and beliefs, a.k.a. alternative facts.
In episode 108 of my podcast, Dr. Kurt Gray described how humans all share the same “moral mind”: we are driven to see ourselves and those we love protected and safe. It is our divergent definitions of “those we love” and “protected and safe” that cause us to ascribe positive value to our own morals and to assume ill intent of those who define these things differently.
Thus, opinions form, and out of those opinions, assumptions are adopted.
When this happens, you use that assumption as evidence of truth or fact. You hunker down and defend your belief as if belief alone proves certainty.
When you have an opinion, and that opinion fuels an assumption about another person, and you never explore that assumption deeply enough to see whether what you believe is actually true, your ego can go into lockdown mode.
Curiosity killed the ego
On episode 27 of my podcast, ABC Chief National Correspondent Matt Gutman spoke about his career in journalism and how curiosity drives him.
“I’ve had some of the most amazing scoops of my career just by talking to random people and learning things,” he said, adding that “being nice to people has led to the greatest successes in my life.”
And that’s how we avoid the challenge Neil deGrasse Tyson warned about: thinking we know enough when we actually know too little.
You must get curious about what it is you believe to be true, and what someone else’s experience might tell you about the world around you. Dig in and ask questions. Be curious. Strive to learn more, even if the truth — the facts — don’t line up with your opinions.
Saying “I don’t know, but I’m going to find out” is one of the strongest, most powerful things you can say.
It takes being curious about the world apart from your opinions to ask the question, “How might my assumptions be incomplete?”