Intro to Extralogical Reasoning Part 1: The Unwisdom of Common Wisdom
When you see the “extra” in extralogical, you might think it takes logic and simply makes it “superior.” Such a system would better be described as "SUPER-logical." A super-logical system would incorrectly assume that human beings are imperfect logic machines that can be made “better.” Humans have many EXTRA imperfections, and extralogical reasoning provides EXTRA techniques to deal with them. In addition to serving as a guide for avoiding common mistakes, it takes the unyielding principles of logic and science and makes self-aware deviations to actively compensate for flaws in reasoning. The deviations from logic and science, however, are only deviations at face value. Taking all the EXTRA things into account, it’s as logical and scientific as you can get.
The three-part introduction reflects the primary factors that led me to create extralogical reasoning in chronological order: my grievances with common wisdom/thinking and the need for something else; my distaste for standard views on knowledge and learning; and my interpretations of the universal flaws in human thinking, including research in cognitive psychology.
Like most freethinkers, I grew up detesting common “wisdom.” I have not since been converted. But in the past few years, I have, to some extent, been converted to a DIFFERENT perspective.
Although I don’t agree with all societal maxims, I realized the problem with common wisdom is not so much WHAT it says as HOW it’s said and what it DOESN’T say—and, therefore, what it IMPLIES.
It’s overly absolute, treating general rules-of-thumb like social laws that take little account of varying circumstances; its lamentably small number of individual parts isn’t systematized, tangible, or consistent; common “thinking” focuses on WHAT decisions to make rather than HOW to make them and, even less, on how to THINK; it’s overly “strength-oriented” and says little about how to manage individual and universal flaws; it doesn’t distinguish between knowledge, understanding, proficiency, and wisdom; nor does it distinguish between truth and practicality or the thinking required for DECISIONS and that of OPINIONS.
Its absoluteness encourages dogmatic and simple-minded thinking in general, and its inattention to the rest implies the rest doesn’t matter.
In fine: The very fact that common thinking/wisdom is so far removed from a systematized and context-based way of thinking strongly suggests that such thinking is unnecessary at best--and at worst a mistake. Thus, my task is laid out for me: convince you otherwise, and show what common thinking lacks.
Crudely speaking, ninety percent of the human thinking organ (HTO) is primitive, unconscious, and self-inconsistent. I often call it a “chimeric thinking organ”: To some extent, you can think of it as a combination of the thinking organs of humanity’s many primitive ancestors with a small new section attached. The newer section, the remaining ten percent, was designed for the purposes of SURVIVAL—not wisdom—in a far simpler environment by a process that followed the path of least resistance, one that could only reliably build traits that were “just good enough.” Not only do people have emotions that can greatly impair their thinking, but there are a host of universal cognitive biases and what I call “reflexive fallacies”—fallacies people are all but preprogrammed to commit. Although there may be great individual variation in how they manifest themselves, these flaws are universal and only peripherally related to intelligence.
The human thinking organ is designed to oversimplify reality. Inherent to this programming is what I call the “causation bias”: the human tendency to be too quick to assume that causality can be understood and explained in simple and satisfying terms. The bias relates to the “causal defects”: the inclination to underestimate (1) the number of factors/variables involved in causation, (2) the complexity of their interactions, and (3) the ability of things to organize themselves without conscious intervention--self-organization.
Much of the Universe—from galaxies to clusters of galaxies, from solar systems to planets, from ecosystems to the human brain, from businesses to economies--comes from things organizing themselves. Things evolve, and they can evolve into highly complex things, things that often appear consciously engineered. The factors that affect the modern World are myriad, and many can’t be identified—or ARTICULATED even if they could be. Society is a complex, ever-evolving system. A complexity theory axiom states that “in a complex system, Man INFLUENCES almost everything but CONTROLS almost nothing.”
The dynamics of the weaknesses of the human thinking organ and the complexities of the modern World will be addressed in greater detail in part three; suffice it to say, life is not as simple as common wisdom would suggest.
Entirely contrary to what science has shown humanity, the dogmatic nature of common wisdom implies that life is so simple that everyone can go around labeling things (e.g., attributes, abilities, resources) as “good” and “bad” and then straightforwardly basing all their thinking and decisions on those labels. While language can’t function without these words, in the real World, all else is never truly equal; there are always at least some complicating or detracting factors at play. The all-else-being-equal viewpoint is a thought experiment or learning tool designed to help you understand a general concept, not something to be directly applied to real-life circumstances. The thing known as wisdom is necessary BECAUSE of how much of a simplification the all-else-being-equal viewpoint is. Yet this is the ONLY viewpoint of common wisdom.
Put another way: No person can possess any attribute, resource, or ability in an isolated universe. No person can possess any attribute, resource, or ability in the absence of other such things (which vary from person to person); and no person can possess any set of these things in the absence of a complex external environment (which varies from time to time and person to person). People have different interests, different strengths and weaknesses, different values, and different responsibilities. Life isn’t simple, static, and predictable; it’s complex, dynamic, and merely guessable. As a result, there’s no such thing as a good decision or conclusion without knowing the circumstances for which it was made. Once again, this is precisely the opposite of what common wisdom would have everyone believe.
One of the many flaws of the HTO is that it doesn’t actively distinguish between WHAT it observes and how it INTERPRETS what it observes. Left to its own devices, it blurs observations and conclusions together. A second problem is that many people overestimate the importance of being RIGHT and fail to appreciate the importance of AVOIDING BEING WRONG. In other words, wrong answers tend to be more dangerous than right answers are beneficial (at minimum, this is the more pragmatic belief). Avoiding being wrong includes making sure you don’t pay a price for a lack of answers, which aren’t always necessary.
These misconceptions create the illusion that opinions are mandatory, when, in fact, they’re almost always optional. They also make people prone to jump to conclusions. A belief unsupported by understanding isn’t a belief: It’s a dogma, assumption, or, at best, a speculation. Because people are so prone to come to conclusions prematurely and unnecessarily, thereby biasing themselves, and because it’s easy to be fooled into overestimating your understanding of something, extralogical reasoning advises people to be selective about the beliefs they choose to hold.
Decisions are different. Decisions are mandatory; they carry real-life consequences; and understanding isn’t always possible. There are times in life when you have to make important decisions and have almost nothing to go on; you can only work with what you have—which may be no better than dogmas, gut feelings, cheap rules-of-thumb, and wild guesses. The typical problem with dogmas and rules-of-thumb isn’t so much that they’re inherently “bad” as that people over-rely on them--and, worse, use them to form completely optional OPINIONS.
If you don’t understand something, consider the three most underappreciated words in the English language: I don’t know. The current democratic voting system is philosophically perverse because it treats having an opinion as more important than knowledge and understanding: Society would rather you vote—thereby expressing your opinion—knowing next to nothing about societal happenings than know considerably more and say with genuine humility, “I honestly don’t know what’s in the best interest of society.” (It’s not that the author doesn’t believe in the right to vote, only that people should have to pass a fact-based civics test first.) School is similarly perverse in that it tries to force students to have opinions and/or explanations for things that possibly NO ONE should have or attempt to make. Students are frequently asked, “How does it relate to today?” about topics that involve highly complex systems (more on this in part three) unfolding over the course of as much as CENTURIES.
For such assignments, readers are more than welcome to plagiarize the following: "I don't know."
Humans evolved first and foremost for SURVIVAL, not for logical or truth-oriented thinking in itself—it just so HAPPENED that practical thinking tended to be truth-oriented. A person with a slavish devotion to absolute truth and logic is in a perpetual conflict of interest that only makes them MORE prone to delusion (there’s a reason it’s called EXTRAlogical reasoning). Since the quality of decisions and conclusions is context-dependent and the HTO is more of a psychological machine than a logical one, your strategies shouldn’t reflect what’s ostensibly most important; they should reflect what requires MOST EMPHASIS.
Pragmatic emphases require belief POLICIES. Extralogical reasoning calls these pragmatic preferences: belief policies with limited correctness that are consciously adopted to suit practical purposes when the absolute truth is less useful. Pragmatic preferences are characterized by “pragmatic unwrongness”: neither wrong nor entirely right--but always USEFUL. Extralogical reasoning doesn’t just provide PASSIVE compensatory techniques in the form of references to common mistakes. Pragmatic preferences ACTIVELY compensate for individual and universal weaknesses.
Extralogical reasoning’s axiom that active critical thinking and thinking- and decision-making wisdom are more important than knowledge could, to some extent, be considered a pragmatic preference. Because people possess raw intellect in great excess of wisdom, the World has evolved to accommodate, even reward, human unwisdom and unquestioned conformity, lessening what would otherwise be the benefits of wisdom and free-thinking, etc.; but it’s still more USEFUL to treat them as more important. Extralogical reasoning asserts that everyone has the same foundational attributes with greatly varying manifestations. It’s indifferent, factually speaking, to whether people are more similar to or different from each other. Since extralogical reasoning emphasizes context-based thinking and the use (and development) of one’s ability to improvise solutions, inferences, and models of reality and deemphasizes prepackaged knowledge, it recommends the pragmatic preference of thinking of all people as more DIFFERENT than similar.
Arguably extralogical reasoning’s most controversial pragmatic preference is the rejection of the idea of “commonsense.” This isn’t as difficult as you might think, for commonsense isn’t a tangible or well-defined notion to begin with--and the failure of common wisdom to address this is yet another one of its shortcomings. Commonsense is ambiguously associated with common “knowledge,” common “wisdom,” and “natural” intuition, which themselves are intangible, ambiguous, and unreliable notions (not to mention EVER-CHANGING along with an ever-changing society). No doubt, people’s reasoning instincts are heavily influenced by genetics, and many of these are common to most people; but everyone builds their working cognitive foundations at an age before they have any appreciable sense of logic and even less ability to account for their assumptions, which tend to be based on the easily misinterpreted factoids floating around society. Many of the resulting intuitions become entangled with most--if not all--of your instincts forever after.
To whatever extent natural intuition does exist, extralogical reasoning by no means suggests that people dismiss it; on the contrary, to do so would be to dismiss extralogical reasoning itself. The methodology already accounts for these "common intuitions,” and since the system is designed to encourage students to healthily accept and compensate for their inherent flaws, it seeks to eliminate any potential hindrances.
One may ask, “If common wisdom is so bad, why have these beliefs caught on?” As mentioned, the problem rests more in its IMPLICATIONS than in its content. Another problem with common wisdom is it says nothing about what BELIEFS ARE. If people understood the nature of beliefs, they’d know that beliefs have many attractive properties other than correctness. Beliefs that are gratifying, motivating, glamorous, strategic, and complementary to other beliefs are often more robust and easier to find. They evolve in a complex system—society--based on numerous, often conflicting, criteria. This is more than sufficient grounds for skepticism.
One may also ask, “If common THINKING is so bad, how has humanity been so intellectually successful?” Few people appreciate how indispensable social and intellectual supervision and correction are in ensuring the quality of human thinking and understanding.
Science and engineering, humanity’s most successful fields, are where social and intellectual supervision and correction are highest. In other words, they’re socially validated and well-defined intellectual areas with lots of social structure and competition—which people tend to hopelessly over-rely upon for discipline and motivation—and their results are easily measured. This, along with the fact that one’s psychology is less directly involved, guides practitioners’ thinking while keeping their cognitive weaknesses in relative check. In life-related thinking, on the other hand, supervision and correction are far lower--and people’s thinking and understanding are far worse. This will be discussed further in part three.
The most ironic problem with common wisdom is that it sabotages itself. Its implications and resulting associations can cloud the fact that dogmas and societal maxims have merit—if only people sufficiently recognized and treated them for what they are. As my friend (and co-blogger) Q says, dogmas often exist not so much because people know what they’re doing—but because they DON’T. If properly arrived upon, good dogmas are the result of what extralogical reasoning calls informed ignorance and intelligent humility. The dogmas are not considered absolute IN FACT, but their emphases serve USEFUL PURPOSES. Their originators know what works WELL ENOUGH or what is the MOST LIKELY thing to work best.
While society’s pool of beliefs isn’t nearly as self-corrective as science, the fact that a belief caught on necessarily means SOMETHING. There’s often a wide gap between everything and nothing. When people exaggerate something too much in one direction, there’s often an overreaction in the other. IQ tests, the NFL combine, preseason football, and the popularity of beliefs are great examples of things that are INFORMATIVE without being DETERMINANT. They are very useful, but none mean close to everything. Anything can be misused or over-relied upon. When people encounter failure with the ostensible use of certain methods, critics are too quick to blame the methods themselves without taking a proper look at HOW THEY WERE USED. These are easy traps to fall into, and my relationship with common wisdom has helped me learn how to avoid them.
But there’s still another problem with common wisdom I haven’t expounded: its failure to distinguish between knowledge, understanding, proficiency, and wisdom. One of common wisdom’s most dreadful clichés is, “Knowledge is power.” Once again, not really wrong but highly misleading. This is the subject of the next part of the intro: “Knowledge (alone) isn’t Power.”
Comments
For sure I get the point you are making about conscious decisions, reason and evidence: mostly we avoid them and use reason and cherry-picked evidence after the fact to justify decisions we make subconsciously. But I think you miss the point that this is actually an optimization: for a vast majority of decisions, this kind of social, heuristic decision-making is not just better but absolutely necessary. The number of decisions we make in a day is staggering! Auto-pilot is essential in order to protect limited, intellectual resources so we can allocate them to making more important and novel decisions. We're tribal, social creatures, after all. For most decisions, we care a lot about how other people perceive our choices and behavior; post-facto, social reasoning is not only adequate but totally merited. Of course we allocate substantial cognitive resources to power our social performance; acceptance and belonging are absolutely a matter of survival.
The central questions ought to be: How do we decide which decisions we want to make consciously and which we prefer to leave to the subconscious? How much time, effort and reason will we allocate to social performance? And, as you know and demonstrate all the time, how much are we willing to work on understanding our own thinking? These meta-questions are also a matter of survival.
Bravo.
I always thought that ex post facto thinking was a rationalizing thing--but maybe that's at least a part of what you're talking about. To simplify, people aren't so much designed to think or make decisions rationally so much as to CONVINCE themselves and other people they're thinking and making decisions rationally. Similarly, also to simplify a little bit, people aren't so much designed to understand the world around them so much as to CONVINCE themselves and others they understand it. The innate tendency to rationalize is strong enough that even in the absence of empathy, as in the case of sociopaths, or obvious social objectives, people will still look to rationalize. Avoiding confusion and maintaining communication and "social continuity" in the pack are probably quite useful--even if not entirely rational or dignified.
Whatever amount of time people should put into their thinking isn't something I've thought about much--only because whatever the right amount is, it's a hell of a lot more than most people put into it. Given how things are, it almost can't be encouraged enough. And while I'm not necessarily saying bona fide reasoners should harangue people about their thinking and excessive conformity as Bish did, I still think people are enough off in this way, as well, that one need not worry too much about over-encouraging it. After all, central to extralogical reasoning is that policies you use regarding how you manage your life should focus on what requires MOST EMPHASIS, not what's ostensibly most important.