Neal's Puzzle of Life, A Mini-book: The First Conceit: Pragmatic Unwrongness (25 pages)
The following is the first of three “conceits,” or personal conceptualizations, of Neal’s puzzle of life, which together form a single mini-book. It begins with a few-page intro, followed by the first (twenty-page) conceit. The others are forthcoming.
Intro to the Puzzle of Life
Only in hindsight do analyses follow logical progressions. In real time, the answers
often arrive before being explicitly asked. Partly due to finite intelligence,
partly by design, the unconscious has limited communicative capacity, hoarding
a treasure trove of questions and answers.
If one’s imagination isn’t restricted to readily available frames of reference--and one can comprehend the nuances between how things are and how they could or should be--it adds variables to an already complex puzzle. True
free thinkers aren’t just inclined to doubt the answers to the popular
questions; they seek new questions. But questions don’t guarantee answers, and
seeking questions doesn’t ensure you’ll find them. When one has far more questions than answers and can’t build satisfactory models of the World, confusion inevitably ensues. Unidentified questions obscure the existence of the
confusion, exacerbating it.
An expanded imagination creates tremendous opportunity for learning, but it hardly
makes life simpler and easier. Victimized by the latter, the author was
compelled to exploit the former. The result was what became known as extralogical
reasoning (ER) and his three “conceits” of the
puzzle of life.
From a certain point of view, a person’s point of view is that person. It dominates
their perceptions and decisions, making their views relevant, however
misguided. Following a similar line of reasoning, a person’s solution to the
puzzle of life is the puzzle of life, and the puzzle is that
person, even if never wittingly discovered. If a person ever solved the puzzle,
the puzzle would change, making it fundamentally unsolvable. “Solving” the
puzzle means iterating progressively better solutions as the puzzle continually
changes. The puzzle of life, as you can see, can be viewed from different
viewpoints. Each conceit, as I call it, is one of three major viewpoints.
Like everyone else’s, my solutions are continually changing, even as I make this presentation. If you followed my blog, you could track their evolution over the past few years. A few years from now, maybe my conceits will be different; maybe there will be a fourth. Being no better at solving puzzles than the average nine-year-old, I wouldn’t have picked the term myself. It was coined by my mentor, Steve “Bish” Bishko (1946-2015). He founded his own reasoning system to solve his
puzzle of life called life engineering, the precursor to
extralogical reasoning. As my mentor, Bish is probably my most regular muse. I
often imagine going back in time to present the conceits to
him.
The first conceit is the most practical and accessible, based on “Unwrongness.”
Unwrongness emphasizes avoiding, recognizing, and learning from wrongness;
suspending judgment; holistic management of one’s beliefs; and intelligent
skepticism and humility. As I’ll show in all three conceits, management of
beliefs is the same thing as life learning and managing one’s psychology.
The second conceit digs deeper into mathematical and scientific reasoning. It presents the sentient duality as a justification for Unwrongness. Self-awareness is not merely inevitably fallible; it and self-delusion are a
consubstantial duality, two of the same essence. I explain that even in the
absence of cognitive and emotional bias, a thinking apparatus must still have
logical, epistemic, and “self-referential” bias. And if it carries the
inescapable biases of a product of evolution, self-delusion will predominate.
Self-understanding is not the absence of self-delusion, but sufficient
awareness and management of it—through Unwrongness.
The third conceit will present my philosophy of Nature based on Complexity or
Complex systems theory, which extends to human psychologies, the puzzle of
life, and the modern world. And this will lead, yet again, to Unwrongness.
The First Conceit: Unwrongness
A lot of bad thinking results from confusing related things:
Thinking for opinions (which are usually optional and should enhance learning) vs decisions (mandatory and not directly connected to learning); correctness vs practicality; truth vs beliefs (i.e., criteria/needs for beliefs other than truth); intelligence vs logicality; judgment vs competence; wisdom vs raw intellect; knowledge vs understanding; skepticism vs open-mindedness (which are the same thing); intellectual authorities as a means to learning vs ends in themselves; and preconceived ideas (dogmas, rules of thumb, etc.) intended to eliminate the need for context-based thinking vs preconceived ideas to compensate for the inability to exercise judgment when lacking information and/or knowledge.
Even if you don’t entirely agree with the forthcoming analysis of the above, if you can explain the nuances between them, you’re well on the way to “solving” the puzzle of life (POL). Had I been able to do so before seven or eight years ago, the first forty
years of my life might have been a lot easier. This conceit will attempt to
reconcile these confusions.
What most distinguishes a true solution to the POL is a foundational epistemology, or system of thinking and decision-making methods for application to context. Epistemology is the branch of philosophy that studies the dynamics of thinking, decision-making, and learning. Sometimes people call it the study of
how you know what you know. Epistemologists, as I’m now defining them, are
those who favor context-based methods and rely as little as feasible on
attachment to specific ideas, decisions, and conclusions. Dogmatists are
defined as the opposite. Because most people don’t believe in epistemology or
attempt to articulate epistemic methods, few truly discover the puzzle of life.
Confusion arises because no one can rely on just one of these approaches. Methods can only be systematized so much without telling people what to believe and do, and no set of theories and principles could fully prepare you for every situation. Life and Nature
wouldn’t be what they are without variables in a perpetual state of change.
Sometimes, as mentioned a few paragraphs above, one must rely on preconceived
ideas when context-based thinking isn’t practical, and this is confused with
attempts to eliminate the need for context-based methods. Many
freethinkers don’t like dogmas, but, often reinforced by ego, they forget that
you don’t have to agree with advice to take it (and can and should remain
skeptical).
There are many reasons to be skeptical of human judgment, both your own and others’. The exhaustive research of Daniel Kahneman and Amos Tversky, Jonathan Haidt, and Philip Tetlock, along with complexity theory, has shown this--much of which is examined in the works of Duncan J. Watts, Nassim Taleb, and Michael Lewis, author of the famous Moneyball. Bias exists in many forms other
than emotional. Skepticism of both oneself and others is an indispensable part
of life learning, which goes well beyond knowing the answers. At the same time,
this self-skepticism seems to challenge the prudence of context-based
decision-making.
A question that must be asked is: “How does one reconcile self-skepticism with life learning and context-based methods?” The answer is Unwrongness.
Unwrongness
The core of Unwrongness is suspending judgment and decisions whenever practical. It may be obvious that suspending judgment is the best means of avoiding being wrong, but the ease with which one can be wrong is apparently less so. Even less obvious is that wrong beliefs lead to more wrong beliefs. Least obvious is that
suspending judgment is also the best means of being correct even if the overall
conclusions are the same. Knowing means knowing the answers; understanding
means knowing how the facts that support them fit in with each other. If you
suspend judgment, you’ll collect more supporting facts and will have a better
chance of knowing how they fit together because analyses that suspend judgment
are purer and less circular than ones that come to conclusions prematurely.
Avoiding wrong beliefs is especially important for the mentally ill. Emotional states—depression, anxiety, and hyper-mania—are when people most want to make decisions and form conclusions, yet it’s also the worst time to do so. This is
not just because one’s thinking is impaired; it’s when beliefs most imprint on
the subconscious, corrupting the victim’s thinking. The key to minimizing corruption is a general habit of suspending judgment and decisions, which prepares one for these times and curbs both corruption and impulsive decision-making.
Unwrongness treats wrongness as more dangerous than correctness is beneficial and necessary.
Even if it’s done mostly subconsciously, and even if the results are riddled with contradictions and hypocrisies, everyone has a cognitive and emotional reflex
therefore, be assumed that the more a belief or model of reality deviates from
the truth, the more the beliefs and models that will come to support it will
also deviate from it, including future beliefs and models, which can only be
guessed. Wrongness is a positive feedback loop—wrongness leads to more
wrongness. The cognitive and emotional systems also reinforce behavior, goals,
decisions, plans, and the language of models of reality with beliefs.
The reinforcement of beliefs is strong enough that even when the subtlest elements of corrupted thinking are recognized and tossed out, their supporting beliefs
of corrupted thinking are recognized and tossed out, their supporting beliefs
can linger indefinitely. This makes it easy to mistake progress for cure and/or
improvement for competence. Extralogical reasoning calls these vestigial
beliefs, and the subconscious is rife with them. During emotional states,
harmful beliefs that are rejected by the conscious can linger as latent
beliefs, which might be accepted in one form or another following
subsequent emotional states (many latent beliefs are also vestigial). Due to the cognitive and emotional systems’ relentless belief reinforcement, ER advises
against believing in cures and guarantees: The former blinds you to vestigials;
the latter cultivates the wrong attitudes and beliefs.
Due to the same cognitive mechanisms, goals, plans, and correctness aren’t as necessary or beneficial as thought to begin with. Everyone suffers from
the causation bias, the tendency to automatically assume an
ascertainable and satisfying relationship between cause and effect. The
tendency to jump to conclusions has been shown to be more than just a mistake:
It’s what the thinking organ does by default. Combined with the
social and emotional benefits of beliefs, this makes an opinion on any given issue seem much more necessary than it actually is. Nor is correctness as informative as
believed. Correctness doesn’t necessarily tell you what requires most emphasis:
You control some factors more than others; certain mistakes, especially in
prioritizing, are more likely than others; and people have different strengths
and weaknesses, both in resources and abilities.
Reinforced by social and emotional factors, premature goal-setting and planning aren’t as beneficial as thought, either. The setting of specific goals is more an effect of motivation than a cause. In other
words, yes, many motivated people do set specific, official goals, but they set
goals mostly because they’re motivated; little if any of the
motivation comes from the goals themselves. Confusion between cause and effect
is a common example of confusion between correlation and causation, one of the
commonest fallacies.
Intelligent Skepticism, Humility, and Learning
As you might have deduced, Unwrongness, however intuitive, is not an entirely natural way of thinking. People are cognitively, emotionally, and socially programmed for opinions more than logic and truth. Epistemic beliefs, especially if coupled with ER Deism, are a viable alternative for those with the right proclivities. If you put your faith in your beliefs about beliefs, your knowledge of knowledge, your understanding of understanding, etc., it becomes easier to be objective about everything else. Among other things, full adherence to ER is not required to benefit from it, especially for mental health management. And Unwrongness can be incredibly empowering.
True Unwrongness must be supported by intelligent humility. Intelligent humility and confidence are one and the same. Confidence is not thinking and acting like
you’re great at everything; it’s about healthy acceptance,
including of one’s weaknesses and limitations, even if they’re merely human
universals. Confidence without genuine humility isn’t confidence; it’s hubris,
which is more dangerous than ignorance.
Intelligent humility requires intelligent skepticism. Intelligent skepticism is
open-mindedness, and open-mindedness is mutual skepticism with
a desire to learn. You shouldn’t just be skeptical of other people because
they’re flawed and limited; you should also be skeptical of yourself because
you, in one way or degree or another, have those same flaws and limitations,
including a limited ability to understand things based on limited information.
Skepticism without self-skepticism isn’t skepticism; it’s arrogance. Skepticism
without skepticism of others isn’t, either; it’s meekness and/or intellectual
apathy.
And one must want to learn. Learning has nothing intrinsically to do with people. People are simply a means to an end. God and Nature don’t share their monopoly on truth with mortals. And learning is about a lot more than truth: ideas, questions, explanations, understanding, proficiency, and wisdom (and counter-explanations,
questions, data, etc.; don’t underestimate the catalyzing effects of refutation
and debate). This doesn’t come from worrying who says what, what their
credentials are, or taking people’s words for things--nor blind contrarianism
or dismissing other people’s ideas. It comes from intelligent
skepticism.
Intelligence’s Role in Judgment—and Lack Thereof—and the Root of Epistemic Weaknesses
Many believe raw intellect is their judgment’s greatest asset. No doubt, it can be leveraged to one’s advantage, and this is largely what ER is. But raw intellect is more of an advantage--especially a guaranteed advantage--in matters of competence or proficiency.
Judgment, decision-making under uncertainty, is different; judgment takes a bit
more maturity and pragmatism. Above all, it takes the honed, self-conditioned discipline provided
by Unwrongness. To exercise effective judgment, one must understand how
easily their intelligence and knowledge can be nullified by cognitive and
emotional bias—and learn to act accordingly. And there is no one more
biased than those who think they’re impervious to it.
Emotional and social bias are profound, but they’re more reinforcers than root causes. ER asserts human bias is rooted in the cognitive system. The need for coherent or Resonant models
of reality led to the evolution of highly effective but no less flawed
cognitive mechanisms. Collectively, I refer to them as Artificial
Resonance. To avoid a perceived reality based on disjointed thoughts and
images, Resonance doctors subconscious and sensory inputs to create the
coherent experience people call reality. Resonance works by
connecting related events and twisting them into causational-like
relationships, creating a perceived reality based on a “harmonious narrative.”
Ninety-eight percent of the time, the coherent or “Resonant” thoughts and perceptions cognition
artificially creates are exactly what’s needed. It’s both normative and
descriptive. “Automatic” or “autopilot-like” thinking remains critical to
human function. But in the two percent, during more calculated and “reflective”
thinking, it’s only descriptive. Artificial Resonance reinforces
wrongness and encourages fallacious reasoning. A fallacy generally occurs when
someone establishes a connection between variables and makes illogical
simplifying assumptions about their relationship. This is exactly what
Resonance does, making it the origin of most fallacious reasoning. Resonance is
additionally responsible for:
The overestimation of the ease and necessity of correctness; the tendency to blur
observation and conclusions, causing one to leap to the latter; the
availability heuristic, or underestimation of the potential significance of
unknown information; the tendency to be almost infinitely more inclined to
confirm one’s beliefs than question them (which is not the same thing as
questioning other people’s beliefs; see also confirmation
bias); most generally, causation-fixated reasoning that leads to confusing correlation with causation and the resulting ineptitude at probability and statistics (the study of correlation, highly relevant to the modern World); and much more.
It doesn’t stop there. Working ex post facto, Resonance rationalizes, reinforces,
and/or simplifies prior conclusions and events, imparting
hindsight with artificial clarity and perpetuating the metastasis of
wrongness.
It is only here that social and emotional factors come in, tremendously reinforcing them.
The intellect is not independent of the older and collectively more influential cognitive and emotional systems. They’re an entangled unit unpredictably different from and more powerful than the sum of its parts. I call this the emergent system. The influences of the three nonintellectual systems are tragically underappreciated—thanks to the inevitable programming of all four. Since the thinking organ’s tangible components are mostly subconscious and the overall psychology is dominated by an emergent system, the most powerful components are the least understood, and the best-known component, the conscious, is the least powerful. The thinking apparatus is, therefore, forced to put (otherwise)
undue stock in conscious beliefs in its attempts to model itself. This is one
of many reasons a self-aware thinking apparatus, engineered or evolutionary,
will always possess inherent self-delusion, not merely fallible self-awareness.
As discussed in Jonathan Haidt’s best-selling The Righteous Mind and
in the second conceit, no one is designed for rationality and truth; they’re
designed for simulations known as rationalization and beliefs.
The need for comforting, motivating, coherent, and socially strategic models of
reality places tremendous constraints upon people’s beliefs, behavior, and
perspectives. To reach one’s potential, one must gain self-understanding
through awareness of their delusional nature—and act accordingly.
If the intellect doesn’t manage the effects of the other components, it will operate as if independent of them. That’s not because it is independent--far from it. It’s just that the intellect will manifest in areas where the others have comparatively little involvement, while in other areas, the intellect gets shut out, leading to distinct manifestations. Too much emotion
can cause problems, but too little can, as well. The intellect evolved to
enhance the more primitive cognition that was already there, not as a
completely self-contained entity (note that cognition is a
broader term than cognitive, which usually refers to the cognitive
system). When operating optimally, one treats them as a unit, always skeptical
but never ignoring their intuitions and emotions. Managing the biasing effects
of the cognitive system requires greater study than the emotions because the
system functions almost completely
unconsciously.
Personal experience and well-known research in cognitive psychology and sociology have shown that raw intellect bears little correlation with the magnitude of these biases, which manifest on a spectrum (see the works of Nassim Taleb and Philip Tetlock and Everything Is Obvious: Once You Know the Answer by Duncan J. Watts). If both intelligence and
bias are high, a person will tend to be more intelligent than logical (this might even be the case if both bias and intelligence are comparatively low, albeit much less so). Testing and correction, via training and/or experience,
are the best means of keeping bias in check. It is when one lacks information
and/or a means of rigorously testing and correcting their understanding, when
one is practicing judgment, that these biases really come into
play.
Obliviousness to the cognitive, emotional, and emergent systems is demonstrated by how laughably overrated the predictive powers of conscious beliefs are; that is, how well the intellect can predict one’s behavior, perspectives, and other conscious beliefs. Many act like the right intentions and knowledge, and
sometimes just one, all but guarantee appropriate action. Even the best of
intentions means little without an intelligent plan and the intelligent
execution of that plan. It’s wise to believe that a piece of information is only as useful as your understanding of how to use it. In fact, especially in
matters of judgment, overestimation of know-how is usually more dangerous than
having the minimal know-how.
The increased need for self-conditioning and self-imposed correction is the result of the lack of external conditioning and correction in the relevant area. Self-correction/correcting/corrective is a term in philosophy and science that refers to an activity or pool of beliefs’ ability to correct itself over time. It’s measured by the domain and its principles’
testability and trainability/teachability; incentives (usually social and
monetary); mix of cooperation and competition; and purity/clarity of objectives,
e.g., some fields have marketing, legal, and bureaucratic constraints. Math,
science, and engineering are highly self-corrective. The mental health
field--being an inexact science laden with legal, curricular, bureaucratic, and marketing constraints—has much less (don’t pretend its patients aren’t clients). Life and society’s pool of beliefs have less
still.
Self-correction in science and engineering has allowed them to become humanity’s most
successful fields. The rigors of scientific testing don’t just make it possible
to prove ideas correct, but to disprove wrong ones. They triumphed in their
pursuit of truth by exposing falsity and adjusting accordingly, often despite
great individual opposition. As I've said, don’t let the survivorship-biased nature of popular history fool you: Science rose to glory on the backs of its members’ failures far more than by standing on the shoulders of giants. As a result, aerospace engineering can put unmanned rockets into orbit around
Mars; quantum mechanics can predict the magnetic moment of an electron to one part in a trillion; cell phones and Bluetooth allow people to talk to someone halfway around the world while jogging; and automobile manufacturers pump out
dozens of cars day after day.
The “easier” social sciences are much less self-corrective, explaining their lesser success at treating mental illness and predicting wars and the economy. Psychologist Philip Tetlock, for example, gathered predictions from one hundred political pundits over a twenty-year period (1984-2004) and found the pundits were not only outperformed by the simplest statistical methods, but barely did better than random guesses. As chronicled in Michael Lewis’s award-winning Moneyball, Billy Beane, former GM of the A’s, made the best recruiting effort in MLB history relying solely on statistics.
The social sciences and pedestrian endeavors with less self-correction are
“easier” largely because of the relative lack of rigorous testing, which
readily allows for an exaggerated sense of understanding. Putting it crudely,
there’s more room for bullshit.
Personality and personal orientations are also relevant. Yogi Berra once said, “In theory, theory and practice are the same, but in practice, they are different.”
A central part of thinking, decision-making, problem-solving, learning, etc. comes down to understanding what’s important and what isn’t, and their relative importance. Often due to nature and/or training, the perspectives of highly
intelligent people are more likely to be skewed toward the abstract and
theoretical. This is all well and good if you want to be a physicist--but not
necessarily if you want to be a pragmatist, manager, or leader (if such a thing
is more than nominal).
One’s sense of importance is arguably more influenced by personality and conditioning than straightforward teaching (note that conditioning and teaching are different, despite obvious overlap). If you treat something with an artificial sense of importance, you cultivate an artificial sense of importance. Immersing oneself heavily in theory, especially the wrong type, skews one’s sense of importance accordingly.
Intelligent Skepticism
Consistent with this conceit’s thesis, part of having good judgment is knowing its
limitations—and your own. The less you rely on direct evidence, data, and
tested knowledge and solutions, the more you rely on judgment; and the more you
rely on judgment, the more vulnerable you are. It’s tricky business. On the one
hand, ER advocates context-based thinking and decision-making; on the other, it
preaches the unreliability of judgment. For all its a priori theorizing, ER
still believes in the scientific method and that direct evidence is more
reliable than indirect evidence and expert opinion. An example of indirect
evidence is using someone’s qualitative reasoning skills to assess their talent for physics; direct testing evaluates how well they actually do at learning physics.
Philosophers—and judgment itself--find niches because not everything can be/has
been tested and experimentalists and empiricists need theories to guide
testing.
It takes skill and discipline to engage in context-based thinking without unduly
relying on judgment. The key to good judgment is understanding the
contrasts between independent and dependent thinking and finding the right
balance.
Dogmatists give dogmatism a bad name. It isn’t so much that dogmatism is foolish; it’s dogmatists who are. When people fail despite the ostensible use of certain methods, critics are too quick to blame the methods without taking a sufficient look at how they were used. Something is only useful for what it is--not more, less, or something else. Reality is often a poor model for how things could or should be. Aside from ignoring context, the biggest mistake dogmatists make is failing to be properly dogmatic. They think they’re relying on sufficiently tested ideas, well-trained experts, and expert consensus but don’t understand testing, training, and expertise generally. This is unfortunate--because neither do the “experts.”
The first thing one must do to follow expert advice is find the right field, not merely one they think is similar. History shows not only that experts are often wrong, but that they barely do any better than the layperson outside their specific area of training. For example, many think bodybuilders and powerlifters are necessarily experts on athletically oriented strength training. The “sports” are completely different, and absolutely none of it is athletic. But as expected, when the advice of bodybuilders and powerlifters is solicited, four out of five times the “experts” prove as oblivious as those who depend on them (this may be less the case than it once was).
Next, don’t merely solicit the opinion of one expert. Expert consensus is hardly assured. Experts frequently disagree, and because competition catalyzes learning, a field’s self-correction and long-term progress depend upon it. Society’s pool of beliefs is laughably inconsistent, and nuanced elements of Resonance obfuscate just how much.
Because judgment isn’t as reliable as competence, neither is expertise from one field to the next. Self-correction among fields and pools of beliefs exists on a wide spectrum. None have zero; none have one hundred percent. Practiced does not necessarily mean tested, or equally tested. Some things aren’t as easily tested; for example, psychology is a much less exact science than classical physics. Skills in some fields are more trainable or teachable than others. If you’re smart enough, you can be trained to solve math, physics, and engineering problems (even if not to make novel discoveries). This is less the case for mental health professionals, who rely more on judgment (see the article on psychiatric diagnostics).
Some areas, such as sports, have problems with controlled testing--testing methods in isolation to eliminate obfuscating factors. Sports guarantee winners regardless of the quality of the competition and their methods, allowing wrong beliefs to persist. Sometimes, only packages of methods are or were tested; such tests aren’t controlled, and seeing how one method fits with other packages may be infeasible. Some areas/fields, like personal training and the mental health field, are contaminated by society’s pool of beliefs and by marketing, legal, bureaucratic, and university constraints.
University constraints should not be ignored. Ideally, training in the mental health field, where judgment is so crucial, should be based on wisdom first and knowledge second; but these constraints, along with the greater teachability of knowledge, ensure the opposite. This isn’t just a problem because of a lack of the right training; its absence conditions the wrong perspectives and working beliefs. As said, one must understand the relative importance of knowledge. Thanks to Resonance, if you treat something with an artificial sense of importance, you cultivate an artificial sense of importance. Because everything must be done in university, psychology and exercise science majors, for example, spend way too much time studying non-action-worthy theory that skews their sense of importance.
One must research and question methods and principles, especially if they want to learn. This includes questioning the experts directly, which doesn’t have to be oppositional. On the contrary, it should be educational--both for you and the expert. Remember, you can’t be open-minded without being skeptical, and vice versa. This essay focuses on decision-making, but you often can’t separate it from learning, especially since the experts themselves haven’t done enough learning. If you ask questions the right way, the experts might even thank you for it.
Finally, don’t just evaluate the field, but the specific principles. They should ideally both be testable and have been tested. Falsifiability is best explained by defining its antonym. A non-falsifiable theory is one that can at best be proven right, never wrong. If a theory is falsifiable, there are hypothetical tests that could disprove it; but if the theory is valid, these tests fail, reinforcing it (this doesn’t mean it’s necessarily proven). The longer a theory persists, the better the chances it’s valid. At the same time, lingering hypotheses left unchallenged often harden into “facts” without adequate testing, reminding you that time is merely a heuristic.
But even if you choose to take expert advice, you shouldn’t believe it unless you understand it. Again, an opinion is probably optional. A consequence of
Resonance is that it will try to turn your models of reality into (what you
perceive as) reality itself, making you vulnerable to needless wrongness.
But context-based thinking free of experts must still be understood. Nothing in
life exists in an isolated universe: Beliefs, decisions, goals, interests,
attributes, abilities, pieces of knowledge, etc.—all interact with themselves
and each other in complex and dynamic ways. Life and Nature wouldn’t be what
they are otherwise. The variables are always different, and how two variables
interact is often influenced by the other variables they interact with.
Moreover, being inexact sciences with a host of criteria and often little
supervision or direct incentive, most life-related areas involve weakly
self-correcting pools of beliefs, including society’s and those of the mental
health field.
Nevertheless,
those with the worst judgment are those who rely on it most: people
who can’t be bothered with testing, direct evidence, and the possibility that there
might be important information they don’t have. Hubris is more dangerous than
ignorance; overestimating one’s know-how tends to be more dangerous than having
minimal know-how, especially in matters of judgment. Many may think they’re
engaging in context-based thinking yet base context only on what they already
know and their accompanying assumptions (or what they think they
already know). Knowing facts doesn’t mean you know how they fit together;
knowing many facts doesn’t necessarily mean you don’t have many, if not more,
false assumptions. No one with good judgment readily turns one fact into
several. Sherlock Holmes, operating in an area that was half competence besides,
was known for turning one fact into a few more, not many.
Interpolations and, especially, extrapolations are more often fallacious than
not.
For
these reasons, information without testing is potentially dangerous.
A
Summary of Thinking and Decision-making Steps in Judgment
Since
this conceit’s an overview of Unwrongness, I will merely summarize ER’s
principles of judgment below, leaving exposition for upcoming articles and the
eventual book.
The
first principle of judgment is Suspend judgment as much as practical. The
second, Don’t get burned by what you don’t know; i.e., don’t fail
to acquire attainable information, and account for what you can’t feasibly attain. Third, Be
wary of things you think you know that haven’t been adequately tested.
Fourth, Don’t get burned by personal flaws. Fifth, Apply
logic.
In
extralogical reasoning, the extra comes before logic.
Recognizing
and Learning from Wrongness
One
learns, or truly learns, when they discover flaws and/or
incompleteness in their understanding of something and adjust accordingly.
This first means learning the specific lesson, then integrating the lesson into
your general understanding of the relevant topics. Thus, one turns knowledge
into understanding. The second requires recognizing symptoms and connecting
them to “disease,” so to speak. If done well, this will likely include epistemic learning—i.e.,
what mistakes did I make in my thinking; what can I do to prevent future
mistakes; what fundamental misconceptions might this relate to? The
“reflex” to connect symptoms to disease is the symptom reflex, and
honing it is the key to successful life learning.
Lacking
this reflex is one of the defining symptoms of insanity, which, in turn, is
almost always accompanied by an inability to turn intellectual into working
beliefs (the lack of symptom reflex covers the clichés of repeatedly doing the
same thing after repeatedly getting undesirable results and/or not learning
from mistakes). Such failures are sometimes dubbed “disconnects.” Victims struggle
to connect symptoms to disease, and they suffer from disconnects between
intellectual and working beliefs. The symptom reflex is often suppressed by
mismanaged emotions, including an inability to compensate for a lack of
emotional response to mistakes.
The
best learning occurs when you recognize and respond to wrongness. This is why
failing in life is so integral to understanding it. Failure reveals more flaws
and incompleteness in your understandings than success while provoking a
greater emotional response; and conscious beliefs insufficiently supported by
emotions have little effect on people’s decisions and perspectives. I don’t
believe in doing more than discouraging failure. Barring extreme cases, parents
shouldn’t stop kids from failing by force, manipulation, or pressure to conform
to the “guidance” of those who supposedly know better. And yet parents do, even
in cases where their attempts inevitably cause way more immediate harm
than good. However powerful, the human intellect is mostly an appendage to a
social and emotional psyche, one that, ironically, is all but defined by its
need to believe exactly the opposite.
But
the precious few who excel at life learning don’t just connect symptoms to
disease and turn knowledge into understanding: They perpetually rehabilitate.
They know it’s unwise to believe in cures and keep their eyes peeled for vestigial
beliefs, the supporting beliefs lingering after the elimination of major
misconceptions. Belief networks are in perpetual flux, and transitory
beliefs in the form of gut reactions and the like constantly arise,
micro-corrupting networks (discussed in the third conceit). Life is about being
a work in progress; that’s the journey. The destination is the asymptotic
approach to solving its puzzle.
Many
respect people with strong beliefs. I respect those who are smart about
their beliefs. But someone who’s smart about their beliefs
knows that this goes beyond having beliefs that are smart. Sometimes, being
smart about your beliefs may mean boldly proclaiming them at the expense of
offending people, but most of the time, it’s about knowing what not, or
not yet, to have an opinion on. Ultimately, it’s about
holistic management of beliefs, which is synonymous with life learning and
managing one’s psychology. You’re charged with the operation of a network you
have limited control over; be careful what you allow into it.
In
the end, you don’t exercise selectivity because you don’t care about your
beliefs: On the contrary, it’s because you care
about them—and the puzzle of life.
The
next conceit presents the sentient duality--that self-awareness is
always accompanied by self-delusion--as an alternative means of understanding Unwrongness.