
"I don't know"

Posted: 23 Mar 2009, 10:30
by Adam1
Nate Hagens has an interesting article on TOD.

I have to admit that I'm still learning to say "I don't know". The unconscious need to express an opinion, even when I don't know, is very powerful.
Nate Hagens wrote:
Planck Problem - From Michael Shermer - How Thinking Goes Wrong

In day-to-day life, as in science, we all resist fundamental paradigm change. Social scientist Jay Stuart Snelson calls this resistance an ideological immune system: "educated, intelligent, and successful adults rarely change their most fundamental presuppositions" (1993, p. 54). According to Snelson, the more knowledge individuals have accumulated, and the more well-founded their theories have become (and remember, we all tend to look for and remember confirmatory evidence, not counterevidence), the greater the confidence in their ideologies. The consequence of this, however, is that we build up an "immunity" against new ideas that do not corroborate previous ones. Historians of science call this the Planck Problem, after physicist Max Planck, who made this observation on what must happen for innovation to occur in science: "An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarized with the idea from the beginning" (1936, p. 97).

Psychologist David Perkins conducted an interesting correlational study in which he found a strong positive correlation between intelligence (measured by a standard IQ test) and the ability to give reasons for taking a point of view and defending that position; he also found a strong negative correlation between intelligence and the ability to consider other alternatives. That is, the higher the IQ, the greater the potential for ideological immunity. Ideological immunity is built into the scientific enterprise, where it functions as a filter against potentially overwhelming novelty. As historian of science I. B. Cohen explained, "New and revolutionary systems of science tend to be resisted rather than welcomed with open arms, because every successful scientist has a vested intellectual, social, and even financial interest in maintaining the status quo. If every revolutionary new idea were welcomed with open arms, utter chaos would be the result" (1985, p. 35).
As cocksure trainees (in 1992), fresh out of MBA school, we had our oft-times swaggering attitudes held in check, and eventually dampened, by a cultural meme that existed in our division at Salomon Brothers (which, I was to later understand, was not ubiquitous on Wall St.). We were often purposefully asked a series of questions - the first couple answerable if one had done their homework, but the third or fourth question being very difficult, and most times unanswerable. Our natural tendency was to look smart, speak with authority, and confidently 'sell' an answer (or guess) to these tough questions. But our instructors (a rotating collection of senior people at the firm) came down on us HARD if we ever guessed, even if we guessed right. The correct answer, we were told, was "I don't know, but I can research it and get back to you". Perhaps the thinking was that the richest clients on the planet could smell BS a mile away, and that straight talk was not only ethical but would lead to more business. This concept of humility was drilled into us to the point where, even in social situations outside of work, we trainees were conditioned to say "I don't know" rather than BS our way through some smug, but wrong, response.

Fast forward 15+ years. Being right is still correlated with social respect and success. Being wrong drops us a notch socially, if not to others, at least to ourselves (some more than others). With the explosion of sub-disciplines in science, it seems that the more data we have on various aspects of our environment, our economy, and our energy situation, the more opportunity there is for those with charisma, persuasive skills, money, connections, etc. to latch on to a particular datapoint or study or belief and leverage it into an attitude that becomes more widely held. I used to think that facts were incontrovertible, but "science" (and I have considerable personal anecdotes to this effect) has more interplay with belief systems and political power than I once naively believed.

The internet has expanded our tribal numbers far in excess of our brains' capacity to effectively process an already full plate. Personally, I get about 1,000 non-spam emails per week. Since my own past crosses many demographic boundaries, I now get correspondence from rich conservatives, liberal environmentalists, risk-prone traders, philosophers, oil executives, tree-huggers, brain scientists, farmers, politicians, and old friends. (And let's not forget family.) As such, I have been increasingly amazed at both the disparity and the strength of opinion/belief about what ails the modern world, if anything, and what these various 'circles' believe is our best path forward.

I am a global warming agnostic - a) primarily because I haven't the necessary time to become adequately fluent in the complex issues involved in climate science, and b) even if climate change is proven to be non-anthropogenic, there are myriad other Liebig limiters to scaling up the current conspicuous consumption/energy paradigm globally. I know three IPCC scientists personally (one on my thesis committee). I also have several close friends who think global warming is a hoax - who continually send me data on the Maunder Minimum, the MWP, solar/sunspot cycles, the Pacific Decadal Oscillation, and other things that look impressive but that I don't fully understand. Occasionally, when I am emboldened, I cross-pollinate 'new info' between these various tribes.

Two emails this Tuesday led to an 'aha' moment, and are the genesis for this post. First, I sent a one-hour video presentation given by an astrophysicist on the natural drivers of climate change to an IPCC friend. Later that day we spoke. His reply: "I watched 5 minutes of it and it mentioned the Maunder Minimum, so the rest was likely irrelevant too - I get 50 of these a week, Nate, I just don't have time for such crap - please stop sending it". Not being an expert, I didn't respond, and we moved on to talk about a water/energy paper I am completing. That same night, I sent a brand new pdf on methane hydrates to another scientist friend of mine (who thinks climate change is 90% natural in origin). He lashed out at me with an email 5 minutes later (the pdf was 40 pages, so he couldn't have read it), saying that "global warming has nothing to do with science - it is only science that is finally debunking the politicization of anthropogenic climate change". How could this be? Two VERY smart people, not willing (or not able) to incorporate new data into their belief systems.

I previously wrote an essay about some possible explanations for resistance to belief change. One phenomenon was cognitive load - that we can only effectively handle seven chunks of information at once. This may play a bigger role than I thought - we live in the age of the internet, and larger tribes are taking up a correspondingly larger % of our cognitive processing ability - our brain, by adhering to previous beliefs, stays in the moment and says 'no más'. (Research suggests that people already holding 6 or 7 chunks of information in their heads do poorly on math problems and on choices involving delayed discounting - i.e. if your mind is maxed out, you care about the present more than the future, compared to controls.)
Conclusion

In a society assailed from all angles by social and environmental problems, with information (in addition to gambling, pornography, and shopping) available 24/7 on the internet to increasingly 'full' minds, we are moving further and further away from a cultural ability to say "I don't know". Such an answer implies weakness rather than wisdom, and someone on TV, someone testifying to Congress, or someone publicly asked for answers to our financial or environmental problems who replied "I don't know, but I can find out and get back to you" would be quickly replaced by someone with a pithy, intelligent, or confident answer (with all three, they'd be branded an 'expert' and invited back). Only history will show that uncertainty could have played a much bigger cultural role than it has, and that the Precautionary Principle should perhaps have trumped the Planck Problem, instead of vice versa.

Campfire question:

How will the belief systems of scientists, politicians, civic leaders, average citizens, etc. converge on a 'best path' forward that integrates energy, economics, equity and the environment?

On the eve of the 4th anniversary of this website, whose mission is to provide a forum for logically and empirically discussing energy and our future, I'll admit that "I don't know".

Previously: Peak Oil, Believe it or Not

Posted: 23 Mar 2009, 10:46
by SILVERHARP2
It makes you wonder how intelligence should be measured. IQ suits the world of science, but once you get into the world of risk and uncertainty, slavishly being able to understand highly complex quantitative models may be a disadvantage. The best commentators/predictors of this whole crisis were, by and large, people with an excellent knowledge of history and an understanding that all markets are cyclical - always have been, always will be.

Posted: 23 Mar 2009, 10:53
by DominicJ
The futility of relying on "experts" pretty much nailed.

Posted: 23 Mar 2009, 11:51
by Ludwig
SILVERHARP2 wrote: The best commentators/predictors of this whole crisis were, by and large, people with an excellent knowledge of history and an understanding that all markets are cyclical - always have been, always will be.
Indeed. Without a knowledge of history and an understanding of human nature, you're never going to predict ANYTHING accurately that involves human beings.

It's interesting how many people seem unable to conceive of human history as an episode of natural history - who don't imagine that our civilisation might reach an end, because they think our civilisation is all there IS.

Re: "I don't know"

Posted: 23 Mar 2009, 13:57
by dudley
Nate Hagens wrote:
Planck Problem - From Michael Shermer - How Thinking Goes Wrong


I previously wrote an essay about some possible explanations for resistance to belief change. One phenomenon was cognitive load - that we can only effectively handle seven chunks of information at once. This may play a bigger role than I thought - we live in the age of the internet, and larger tribes are taking up a correspondingly larger % of our cognitive processing ability - our brain, by adhering to previous beliefs, stays in the moment and says 'no más'. (Research suggests that people already holding 6 or 7 chunks of information in their heads do poorly on math problems and on choices involving delayed discounting - i.e. if your mind is maxed out, you care about the present more than the future, compared to controls.)

Previously: Peak Oil, Believe it or Not
The book How We Decide by Jonah Lehrer explains this stuff in more detail, and I recommend it (except for the last chapter, which is a tedious summary). Experiments have been done which show that, because we can only hold about seven chunks of information, having too much information can actually decrease the quality of decisions. Extra information can distract from the information that is important to the problem. For example, someone who knows a lot about the details of oil extraction and reserve growth could think that that information is much more important to judgements about peak oil than it actually is.

Re: "I don't know"

Posted: 23 Mar 2009, 23:18
by kenneal - lagger
dudley wrote:............... Extra information can distract from the information that is important to the problem. For example, someone who knows a lot about the details of oil extraction and reserve growth could think that that information is much more important to judgements about peak oil than it actually is.
:D :D :D :D :D

Posted: 23 Mar 2009, 23:29
by RGR
DominicJ wrote:The futility of relying on "experts" pretty much nailed.
The next time you need some brain surgery, feel free to hand your significant other a butter knife and "Brain Surgery for Dummies", and let us know how it turns out.

Posted: 23 Mar 2009, 23:36
by kenneal - lagger
Someone got stung into action there. :D

Re: "I don't know"

Posted: 24 Mar 2009, 02:25
by Bandidoz
Adam1 wrote:Nate Hagens has an interesting article on TOD.
Interesting, but incongruous. He's trying to connect two completely unrelated facets:

1) Some people don't like to say, "I don't know", but would rather be "confidently wrong".

2) People have an innate resistance to new ideas that challenge their experience and worldview.

If anything, the two "climate change" examples he presents express more about how people read what they expect to see than anything else; they got 5% of the way through the articles and made assumptions about the rest of the content based on their experience. Moulding the new into the old. "Seen it before". Easily done. Again, nothing to do with being "confidently wrong".

Re: "I don't know"

Posted: 24 Mar 2009, 08:38
by Adam1
Bandidoz wrote:
Adam1 wrote:Nate Hagens has an interesting article on TOD.
Interesting, but incongruous. He's trying to connect two completely unrelated facets:

1) Some people don't like to say, "I don't know", but would rather be "confidently wrong".

2) People have an innate resistance to new ideas that challenge their experience and worldview.

If anything, the two "climate change" examples he presents express more about how people read what they expect to see than anything else; they got 5% of the way through the articles and made assumptions about the rest of the content based on their experience. Moulding the new into the old. "Seen it before". Easily done. Again, nothing to do with being "confidently wrong".
This is true. I think the 'I don't know' bit is an anecdote, but the two ideas are examples of how people feel the need to show others (point 1) or themselves (point 2), or both, that they are right.

Posted: 24 Mar 2009, 09:28
by DominicJ
RGR wrote:
DominicJ wrote:The futility of relying on "experts" pretty much nailed.
The next time you need some brain surgery, feel free to hand your significant other a butter knife and "Brain Surgery for Dummies", and let us know how it turns out.
What if surgery isn't my only option?
A surgeon will always say "let me cut you open", but it may not be the best option.

See?

Posted: 24 Mar 2009, 10:24
by Norfolk In Chance
Quite right... I was advised by a surgeon to have a spinal fusion. I chose not to, and was subsequently treated by an Australian physiotherapist with a simple block of wood!

She said something about surgeons: "When you have a hammer, everything looks like a nail."

My back is now perfectly healthy!

Posted: 24 Mar 2009, 10:55
by Bandidoz
DominicJ wrote:
RGR wrote: The next time you need some brain surgery, feel free to hand your significant other a butter knife...
What if surgery isn't my only option?
DominicJ 1 - RGR 0 :wink:

Posted: 24 Mar 2009, 13:26
by RGR
[quote="DominicJ"]

Posted: 24 Mar 2009, 13:35
by Bandidoz
A medical consultant who has a broader scope of knowledge and experience, rather than a specialist who has a highly refined narrow scope of skill.

In software engineering the difference is known as "Short-Fat" vs "Tall-Thin" people. In electronics you'll have analogue specialists, digital specialists, and "systems engineers".

Capiche?