Appropriate Truths

Warning. This post is most assuredly not politically correct.

In “Origin of the Specious”, the late Irving Kristol quipped:

There are different kinds of truths for different kinds of people … there are truths appropriate for children; truths that are appropriate for students; truths that are appropriate for educated adults; and truths that are appropriate for highly educated adults, and the notion that there should be one set of truths available to everyone is a modern democratic fallacy. It doesn’t work.

Cynical, patronizing, and arguably elitist, but not necessarily unreasonable.

We encounter different truths growing up, ranging from the tooth fairy and Santa Claus to anthropocentrism, mind-body dualism, an eternal soul, and a personal god. As our understanding of the world increases, some truths are readily discarded, some are replaced by more sophisticated versions, and some persist (of course, the “truths” discussed here are not literal).

Some truths, such as Santa Claus, seem to be discarded as one gains a basic understanding of the world. Or is it so? I think that most trivial ideas are rejected not through critical thought, but because one’s peers reject them. For children, the peer group is typically their classmates or friends, a group within which ideas spread and proliferate, and which exerts immense influence. After all, migrant children tend to speak the language and identify with the culture of their peers, not their parents; and kids attending international schools tend to come out not with their parents’ accent but with a mixed accent of their peers. I posit that most children reject ideas like Santa Claus primarily because their peers reject them, with the rationalization coming later and remaining secondary. For adults, although the peer groups may differ by subject, most people still follow the prevailing position of each in-group, treat the opinions and arguments presented by the in-group as more meritorious than they deserve, and ignore or discount disconfirming evidence.

Independent thinkers (relatively speaking, of course), far fewer in number, may possess the thinking tools, yet lack the knowledge, capacity, or even willingness to examine an idea properly and critically.

For example, to debunk Santa Claus, one does not need to understand mammalian aerodynamics or solve the Traveling Salesman Problem; simple, intuitive (Bayesian) probability will do. Reindeer have not been known to fly, and elves have not been known to exist. The prior probability of either is negligible (let alone both), so the idea can be safely rejected. Unfortunately, many “truths” or ideas, especially those involving ideology and theology, are not as easily revised or dismissed, and usually require scientific literacy, advanced logic and thinking tools, philosophical constructs, an understanding of psychology and cognitive biases, intellectual capacity (so politically incorrect!), honesty, and curiosity. A few examples include intelligent design vs. evolution, global warming, alternative medicine, mind-body dualism, tabula rasa, and rational choice.
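For the curious, here is a minimal sketch of that back-of-the-envelope Bayesian reasoning. The numbers are hypothetical, chosen purely for illustration, but they show how a negligible prior swamps even evidence that is perfectly consistent with the hypothesis.

```python
# A toy Bayesian update with made-up numbers, purely to illustrate
# how a negligible prior dominates the conclusion.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# H = "Santa Claus delivered the presents"; E = "presents appeared overnight"
prior = 1e-9                # flying reindeer and elves: negligible prior
p_e_given_santa = 1.0       # presents would certainly appear if H were true
p_e_given_parents = 0.99    # ...but parents explain the presents just as well

print(posterior(prior, p_e_given_santa, p_e_given_parents))
# ~1.01e-09: the "evidence" barely moves a vanishing prior
```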

On important issues, people are often adamantly aligned with what they feel is right, as if that in itself were reason enough. It is often useful to see what a collectively disinterested group of experts thinks about a subject, without appealing to authority. For example, consider this poll conducted by Pew Research (full report). It is unreasonable to expect a layperson, such as myself, to be an expert on these important issues; however, I believe that one should have the basic humility to at least seriously consider the views of expert scientists, who are collectively far more intelligent and understand the issues and nuances more thoroughly. Being politically incorrect, I believe that scientific (not philosophical) issues should not be a democracy; they should be guided by relevant experts rather than popular vote. Could the experts be wrong? Could scientists fall victim to groupthink? These are valid questions, as no one person (or group) is right all the time. A more relevant question would be: who is more likely to be right? The gap in understanding between scientists and the general public is great. Case in point: when the scientists were asked how much of a problem it is that the public does not know much about science, 98 out of 99 answered that it is a major or minor problem; only 1 answered “not a problem”. As Adam Savage would say, “well, there’s your problem”. On the one hand, you have a consensus reached by the smartest people, who dedicate their lives to studying the issue; on the other, a gut feeling. Assuming they differ, which one would you bet on?

Could the approach to a higher “truth” be like mountain climbing, requiring specialized tools and skills? Perhaps, like Andrew Wiles’ proof of Fermat’s Last Theorem, it requires a deep understanding of disparate fields of mathematics, with few having the wherewithal even to understand the proof given? For a rigorous examination of certain issues this may be true (some philosophical problems, for example). However, I suspect that even though many issues are complex and often intentionally confusing (example), given the right tools most people can reasonably reach higher levels of “truth”.

One of those tools is a basic understanding of how the brain works (or fails to work). The ever-growing list of cognitive biases discovered by science does not paint a pretty picture. The brain is but a delusion generator, constructing a version of reality from various input signals, as eloquently explained in How the Mind Works. One need look no further than split-brain research to realize this. A recent salient example is Dressgate, which made people question the veridicality of what they see with their own eyes. Evolutionary psychology shows that this constructed version has little to do with reality and more to do with what conferred a survival advantage, with heuristics and approximations often being good enough. Cutting through the brain’s deception requires thinking critically and scientifically, in itself an arduous and painful process, as sacred cows are slain and comforting beliefs crumble under closer examination. As the saying goes, the will to doubt requires far more than the will to believe.

The cynic in me asserts that few would even bother with the process, much less endure the unending cognitive dissonance; the optimist in me asserts that, well, false hope is still hope.

I believe that most can attain a higher “truth”, subject to our practical limitations (bounded rationality). It is probably beneficial to most people individually and collectively as a society; however, whether it is always beneficial to everyone is a question I cannot answer. After all, the curse of knowledge is a real phenomenon – look no further than my jargon-filled, needlessly abstracted, diabolical writing style, unintelligible to most, often including myself. Deeper understanding does not necessarily result in greater happiness.

Bertrand Russell once said, “The fundamental cause of trouble in the world today is that the stupid are cocksure while the intelligent are full of doubt”. The intelligent may have a better understanding, but the curse is that they lose the perspective of those less informed. It is as easy to unthink a solution as it is to unsee a hidden message or to unfind Waldo. Sadly, in the current climate, it is politically correct to “give equal representation” to the fervent and passionate Waldo deniers, metaphorically speaking. After all, the cocksure are loud, but more importantly, they can vote.