I have to admit that I'm still learning to say I don't know. The unconscious need to express an opinion, even when I don't know, is very powerful.
Nate Hagens wrote:

As cock-sure trainees (in 1992), fresh out of MBA school, our oft-times swaggering attitudes were held in check, and eventually dampened, by a cultural meme that existed in our division at Salomon Brothers (which, I was later to understand, was not ubiquitous on Wall St.). We were often purposefully asked a series of questions - the first couple answerable if one had done one's homework, but the third or fourth very difficult, and most times unanswerable. Our natural tendency was to look smart, speak with authority, and confidently 'sell' an answer (or guess) to these tough questions. But our instructors (a rotating collection of senior people at the firm) came down on us HARD if we ever guessed, even if we guessed right. The correct answer, we were told, was "I don't know, but I can research it and get back to you". Perhaps the thinking was that the richest clients on the planet could smell BS a mile away, and that straight talk was not only ethical but would lead to more business. This concept of humility was drilled into us to the point where even in social situations outside of work, we trainees were conditioned to say "I don't know" rather than BS our way through some smug, but wrong, response.

Planck Problem - From Michael Shermer - How Thinking Goes Wrong
In day-to-day life, as in science, we all resist fundamental paradigm change. Social scientist Jay Stuart Snelson calls this resistance an ideological immune system: "educated, intelligent, and successful adults rarely change their most fundamental presuppositions" (1993, p. 54). According to Snelson, the more knowledge individuals have accumulated, and the more well-founded their theories have become (and remember, we all tend to look for and remember confirmatory evidence, not counterevidence), the greater the confidence in their ideologies. The consequence of this, however, is that we build up an "immunity" against new ideas that do not corroborate previous ones. Historians of science call this the Planck Problem, after physicist Max Planck, who made this observation on what must happen for innovation to occur in science: "An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarized with the idea from the beginning" (1936, p. 97).
Psychologist David Perkins conducted an interesting correlational study in which he found a strong positive correlation between intelligence (measured by a standard IQ test) and the ability to give reasons for taking a point of view and defending that position; he also found a strong negative correlation between intelligence and the ability to consider other alternatives. That is, the higher the IQ, the greater the potential for ideological immunity. Ideological immunity is built into the scientific enterprise, where it functions as a filter against potentially overwhelming novelty. As historian of science I. B. Cohen explained, "New and revolutionary systems of science tend to be resisted rather than welcomed with open arms, because every successful scientist has a vested intellectual, social, and even financial interest in maintaining the status quo. If every revolutionary new idea were welcomed with open arms, utter chaos would be the result" (1985, p. 35).
Fast forward 15+ years. Being right is still correlated with social respect and success. Being wrong drops us a notch socially, if not to others, at least to ourselves (some more than others). With the explosion of sub-disciplines in science, it seems that the more data we have on various aspects of our environment, our economy, and our energy situation, the more opportunity there is for those with charisma, persuasive skills, money, connections, etc. to latch on to a particular datapoint, study, or belief and leverage it into an attitude that becomes more widely held. I used to think that facts were incontrovertible, but "science" (and I have considerable personal anecdotal evidence to this effect) has more interplay with belief systems and political power than I once naively believed.
The internet has expanded our tribal numbers far beyond our brains' capacity to process effectively, piling onto an already full plate. Personally, I get about 1,000 non-spam emails per week. Since my own past crosses many demographic boundaries, I now get correspondence from rich conservatives, liberal environmentalists, risk-prone traders, philosophers, oil executives, tree-huggers, brain scientists, farmers, politicians, and old friends. (And let's not forget family.) As such, I have been increasingly amazed at both the disparity and the strength of opinion/belief about what, if anything, ails the modern world, and about what these various 'circles' believe is our best path forward.
I am a global warming agnostic - a) primarily because I haven't the necessary time to become adequately fluent in the complex issues involved in climate science, and b) because even if climate change is proven to be non-anthropogenic, there are myriad other Liebig limiters to scaling up the current conspicuous consumption/energy paradigm globally. I know three IPCC scientists personally (one on my thesis committee). I also have several close friends who think global warming is a hoax - who continually send me data on the Maunder Minimum, the Medieval Warm Period (MWP), solar/sunspot cycles, the Pacific Decadal Oscillation, and other things that look impressive but that I don't fully understand. Occasionally, when I am emboldened, I cross-pollinate 'new info' between these various tribes.
Two emails this Tuesday led to an 'aha' moment, and are the genesis for this post. First, I sent a one-hour video presentation by an astrophysicist on the natural drivers of climate change to an IPCC friend. Later that day we spoke. His reply: "I watched 5 minutes of it and it mentioned the Maunder Minimum, so the rest was likely irrelevant too - I get 50 of these a week, Nate. I just don't have time for such crap - please stop sending it." Not being an expert, I didn't respond, and we moved on to talk about a water/energy paper I am completing. That same night, I sent a brand-new pdf on methane hydrates to another scientist friend of mine (who thinks climate change is 90% natural in origin). He lashed out at me with an email 5 minutes later (the pdf was 40 pages, so he couldn't have read it), saying that "global warming has nothing to do with science - it is only science that is finally debunking the politicization of anthropogenic climate change". How could this be? Two VERY smart people, not willing (or not able) to incorporate new data into their belief systems.
I previously wrote an essay about some possible explanations for resistance to belief change. One phenomenon was cognitive load - the finding that we can only effectively handle about seven chunks of information at once. This may play a bigger role than I thought: we live in the age of the internet, and larger tribes are taking up a correspondingly larger share of our cognitive processing ability - our brain, by adhering to previous beliefs, stays in the moment and says 'no más'. (Research suggests that people already holding 6 or 7 chunks of information in their heads do poorly on math problems and on choices involving delay discounting - i.e., if your mind is maxed out, you care more about the present relative to the future, compared to controls.)
Conclusion
In a society assailed from all angles by social and environmental problems, with information (in addition to gambling, pornography, and shopping) available 24/7 on the internet to increasingly 'full' minds, we are moving further and further away from a cultural ability to say "I don't know". Such an answer implies weakness rather than wisdom, and someone on TV, someone testifying to Congress, or someone publicly asked for answers to our financial or environmental problems who replied "I don't know, but I can find out and get back to you" would quickly be replaced by someone with a pithy, intelligent, or confident answer (with all three, they'd be branded an 'expert' and invited back). Only history will show that uncertainty could have played a much bigger cultural role than it has, and that the Precautionary Principle should perhaps have trumped the Planck Problem, instead of vice versa.
Campfire question:
How will the belief systems of scientists, politicians, civic leaders, average citizens, etc. converge on a 'best path' forward that integrates energy, economics, equity and the environment?
On the eve of the 4th anniversary of this website, whose mission is to provide a forum for logically and empirically discussing energy and our future, I'll admit that "I don't know".
Previously: Peak Oil, Believe it or Not