Five questions for Philip Fernbach
You think you know it. But you don’t. Neither do your colleagues, family members, friends, or the pundits you watch on TV who agree with your political ideas. Human brains don’t have the capacity to know everything about everything in our complex world, and so we rely on communities to share knowledge and enable us to accomplish our goals.
That all sounds good, but there can be a downside to such thinking. Philip Fernbach, professor of marketing in the Leeds School of Business at CU Boulder, says we tend to overestimate how much we know, which can be troublesome, and we tend to believe whatever our community believes.
He co-authored (with Steven Sloman) “The Knowledge Illusion: Why We Never Think Alone,” which explains how and why we think the way we do. His research interests encompass a variety of areas within consumer behavior, including causal reasoning, probability judgment, financial decision-making and moral judgment.
Fernbach is the lead author of a study published Monday in Nature Human Behaviour showing that people who most oppose genetically modified food know least about the science behind these products.
(View Fernbach’s TED talk at https://www.youtube.com/watch?v=jobYTQTgeUE, read his New York Times op-ed at https://www.nytimes.com/2017/03/03/opinion/sunday/why-we-believe-obvious-untruths.html and a book review at https://www.nytimes.com/2017/04/18/books/review/knowledge-illusion-steven-sloman-philip-fernbach.html for more information.)
Fernbach also is co-director of the Center for Research on Consumer Financial Decision Making, and an affiliate of the Institute of Cognitive Science and the Center for Ethics and Social Responsibility. In 2017, he received a fellowship grant to conduct a two-year research project, The Cognitive Basis of Extremism, to explore how to improve public discourse by making people aware of the causes of extremism and ignorance.
He studied cognitive science at Brown University before coming to CU as a postdoc at the Center for Research on Consumer Financial Decision Making.
“It’s actually atypical for a cognitive scientist to end up in a business school,” Fernbach said. “This is kind of a dream job for me and I feel really fortunate that it worked out this way. I love being in the business school and I love being in Colorado.”
He has earned numerous awards, including, in 2018, the Association for Consumer Research Early Career Award for Contributions to Consumer Research and the Provost’s Faculty Achievement Award at CU Boulder.
Another career highlight for Fernbach is his speaking engagement at the 2017 Athens Democracy Forum. “I followed Kofi Annan on stage and it was an amazing experience talking about these issues in front of an incredible audience of journalists and leaders.”
Outside of work, his passions are bluegrass music and ice hockey.
“Flat-picking is a real obsession of mine and there’s an amazing bluegrass community in Colorado and the Denver and Boulder area,” he said. “It’s just a great community of players that has been a rewarding part of my life.”
He grew up playing ice hockey, which he calls the “most fun sport,” and has now passed that passion on to his children.
1. Are the issues you lay out in “The Knowledge Illusion” a new way of looking at human thinking? What are the benefits and disadvantages of the way we think?
The ideas in our book have been around a long time, but they haven’t been popularized. Most people in the cognitive science community do not think about the world in the way that we are putting forth in the book, and so while a lot of these ideas are not new to fields like economics and social cognition, what we have done is pull everything together – including research we have done – to paint a picture of the mind that is quite different from the way that it is normally thought about.
The book has two major themes: One is the idea that individuals don’t know very much about the world, and we overestimate how well we understand things; the other is the idea of the community of knowledge. Most of our knowledge is not stored in our own heads. Humans are great at forming communities where people can specialize, and the knowledge is distributed across the community. Each member of the community has little bits of knowledge and we have the capacity for collaborative cognition, where the community can be a lot smarter than any one individual alone.
There are important implications of this way of thinking in the real world in areas like education and politics, for instance. Education is about giving people a lot of information, but what our book says is that humans aren’t built to store a lot of information, and so we have to take different approaches to learning.
There also is a paradox at the heart of the human condition: As human beings, we do incredible things. Just look at our technology and our ability to organize society and control the world in all these incredible ways. But individuals can be very irrational; we can be extremely ignorant; we can be very set in our ways; and we are not always responsive to evidence. You also see large groups of people who end up believing things that are verifiably false.
Human thinking leads to both upsides and downsides. The upside is that we have the ability to store knowledge in a community to collaborate and be able to achieve and pursue goals that are extremely complex even though no one individual understands it all. The downside is that when we form these communities, we end up taking on the views of our community, and it is very hard for the individual to adjudicate what is correct and what is incorrect.
Additionally, most things are so complicated that individuals can’t judge whether the community has it right or not and we end up in these situations that are kind of like a house of cards, where an entire community tends to believe something false because everybody in the community has this feeling that they understand the issue despite the fact that no one does.
2. Throughout history, there have been leaders and scientists and others who have bucked the system to make important changes or discoveries. Are these people somehow different from the rest of us? Or what happens to change a person’s mind about something they believe in?
This idea of an intellectual rebel is one we are thinking about as we move forward with our research and writing. To some extent, it is atypical to question what the community tells us. Most people are not very deliberative about bucking the community. One of our hypotheses is that in order for people to take a counter-cultural view, they have to have some sort of community support.
I actually went to the Flat Earthers Conference in Denver a few weeks ago. These are people who buck the system, but they have a lot of community support from like-minded individuals to formulate their theories.
Historically, when we laud people as being these kinds of mavericks, it turns out that the reality is more complex. They have a community of support of other people who are thinking similar thoughts, but I do think there is something special about people who are more contrarian and deliberate about questioning the status quo. It can lead to communities forming around these views that are outrageous like the Flat Earthers, but it is probably central to the scientific pursuit of knowledge since a lot of scientists question the status quo more than others do.
The question of changing our minds is a difficult one. When we take a view of the world, we’re really trumpeting what our community has to say. It is hard for people to change their minds because they often discount information that runs counter to what their community believes, especially when it comes from experts they don’t trust as much.
I do think that there are some people who are more open to the evidence than others, but for most people, it is hard to update beliefs in an objective way. We tend to be tied in to what we believe, and we have to see a lot of evidence and be open-minded before we start changing our views.
Humans are not aware of the gaps in our knowledge; we go through life feeling as if we understand things a lot better than we do. When people are confronted with counter evidence, they do update their beliefs to some extent, but probably not as much as they should.
3. Understanding that we overestimate our knowledge and tend to believe our community views, how can we be smarter consumers?
There are different types of consumption. One area is consumption of information. We see around us all the time that we can be led to very bad places if we’re not vigilant about checking our understanding. We can be led to false beliefs, taken in by fake news, convinced that products are better than they are, or fooled by pseudo-science claims.
The answer to the question is counter-intuitive. We shouldn’t go and learn everything about every product we want to buy or every news story we want to consume because that is not possible. The world is too complex. We need to take an expert’s word for things a lot of the time, but we can be more vigilant about checking the claims and our understanding and making sure our information is credible.
4. Are we letting people off the hook when we say the world is too complex to understand more about what is happening around us?
One of the major problems in our discourse is that we like to put this normative frame on these problems. Our first question is, “What is wrong with these people whose views are counter to what we believe or what the consensus is?” I actually think that is counter-productive. We shouldn’t assume there is something inherently wrong with people; instead, we should figure out what is going on, and then we can figure out how to influence the situation and get people to beliefs that are more concordant with what the experts and the science say.
We are not letting people off the hook; we are being realistic. All of these issues we are grappling with as a society are complex and a solution that says everyone needs to get a lot more knowledgeable about the issues is not realistic. We can’t learn enough to have a completely reasoned and justifiable view, and in the end, we have to rely on the community and the experts in the community to give us information.
People need to be more vigilant about checking their own understanding and not just blindly say that their community is right. We need to become more open-minded and a little less extreme in our positions around issues that we don’t understand very well.
It is wrong to imagine that there is something pathologically wrong with 50 percent of people in the country who are across the aisle. We tend to say that these people are idiots or evil, but usually the truth is much more nuanced. There are people at the extremes who have bad intentions or who are willfully ignorant, but for the most part, we overestimate the extent to which people on the other side are different from us qualitatively in terms of their morals or intelligence. I think we need to appreciate that everybody is relatively ignorant.
5. What are your current research endeavors?
One of the things I’m fascinated by is people’s attitudes on controversial scientific topics like genetically modified foods, global warming and vaccination. These are areas where a large portion of the population has views that are counter to the scientific consensus, and we’re working on understanding where those views come from. One of the things we are finding is that a lot of the time, the people who are most extreme have the least knowledge, or the lowest scientific literacy. But they claim to have the most knowledge, which means they think they understand the issues better than others.
Thinking we understand things better than we do can lead to overly extreme views. What we’ve found is that learning more about a topic pushes us to become more moderate, as we come to understand that the world is more complex than we thought.