Talk on Holism of Life

 

 

The following is based on a talk I gave on Tuesday, September 17th.


(Holistic thinking applied to life and beliefs) Tonight’s talk is about the holism of life management. It’s holistic because it applies holistic thinking to all areas of life. More than anything else, however, it’s about the holistic management of beliefs.

 

(Tonight’s theme: avoiding wrong beliefs). An increasingly important theme in my works is the idea that it’s at least wise to believe that ascertaining correct answers is harder and less necessary than most people think--but that wrong beliefs are more dangerous. I’ll explain why this is the case and why it’s not an entirely unnatural way of thinking. This is one of the major ways ER reengineers people’s thinking. I act on it through concerted efforts to minimize the number of premature and unnecessary beliefs I hold, which I’ll elaborate on. The general message is not that you shouldn’t have or care about your beliefs; it’s to have fewer total beliefs but work harder on the ones you have. There are more than enough things in the world to be opinionated about; you can afford to be picky. And just as other people shouldn’t decide for you what your opinion is, they need not decide for you what you choose to have an opinion about. When I need to be correct, I look to data first and expert opinion last, in accordance with the scientific method.

 

(Pragmatic preferences): When it comes to strategies and modeling practical situations, although ER always recognizes the truth, it posits that the truth isn’t as informative as often believed: It doesn’t always reflect control or factors involving the avoidance of mistakes. You control some factors more than others, and accounting for the tendency to make certain mistakes, including conditioning yourself to be less apt to make them in general, is an incredibly important part of life management, learning, and judgment. Although, as I said, the truth is always acknowledged, I’ve come, to some extent, to eschew beliefs altogether in favor of belief policies called pragmatic preferences. Pragmatic preferences reflect what requires the most emphasis, which may or may not be what’s ostensibly most important or beneficial. Relevant prior learning, for example, however advantageous, tends to be over-relied upon; critical thinking skills are underappreciated. I’m not going to discuss pragmatic preferences tonight, but my last post is based on them, if you want to know more. I’m hoping for suggested pragmatic preferences because I need more specific and practical examples.

 

(Misunderstandings from Last Time) There were some misunderstandings last time about what Extralogical Reasoning is. Q thought ER was something more akin to “super-logical reasoning”—“logic only better.” This is not the case. 

 

(What a super-logical system would be) A super-logical system would imply that humans are merely imperfect logic machines that can be made “better.” Crudely speaking, the human thinking organ is ninety percent primitive, unconscious, and self-inconsistent, and designed for practical ends—namely, survival in a simpler environment. A slavish devotion to absolute truth, correct answers, and the straightforward application of logic puts the thinking organ in a perpetual conflict of interest—one ER is specifically designed to avoid. Extralogical reasoning takes this extra consideration into account and provides extra methods to deal with it—methods designed to avoid wrongness and mistakes and manage weaknesses in thinking. But if you take all these extra things into account, it’s about as logical and scientific as you can get.

 

 

(Holistic thinking vs. reductive thinking). As I said, you can’t perform holistic management of your life without holistic thinking. Reductionistic analysis, the opposite, is incomplete analysis: it assumes--usually but not always falsely--that both the parts and the whole can be understood just by looking at the parts, or individual variables. Life has too many variables undergoing complex interactions to understand the parts and the whole just by looking at the parts—you must understand how they fit in with each other. This makes life a complex system: many unknown and unidentifiable variables undergoing complex interactions. You have to look at the whole. Holistic analysis is analysis by the whole.

 

(Holistic thinking in more layman’s terms). Nothing in life exists alone. Beliefs, decisions, pieces of knowledge, attributes, resources, goals--all interact with themselves and each other in complex and dynamic ways. Regardless of how “good” or “bad” certain things may be in principle, it’s wise to believe that they can’t be “good” or “bad” by themselves--only in terms of how they fit in with each other.

 

(Holistic thinking isn’t natural due to evolving in a less complex environment) Complexity and nonlinearity—or nonlinear change—are not entirely natural to people. That includes exponential change. People evolved in a simpler environment: life was at least much less of a complex system, and/or pre-sentient cognition didn’t have the evolutionary time and/or sufficient incentive to make human thinking more holistic in the relevant ways. As Q pointed out, there are limits to how well people can think holistically, but either way, they ought to be doing a lot better.

 

(A theoretical explanation for why reality seems easier to understand than is the case) Confidence or belief in understanding is sometimes almost as important as understanding itself. Confusion, distractions, and inhibition can be almost as harmful to a person or animal’s dealings with reality as fully understanding it is beneficial. To minimize confusion, animal thinking organs reflexively simplify reality by twisting related events into causational relationships, helping fit reality into a “harmonious narrative.” This probably includes, in one way or another, discarding stimuli or information that could be relevant to the truth. In the talk, my co-blogger, Q, said that the thinking organ seeks coherence more than correctness; it just so happens that correctness very often increases coherence. In other words, humans, and to some extent other animals, see reality through an artificial lens of simplified cause and effect. This might make animals’ perception of reality more “followable” and, therefore, minimize confusion; but, in the case of humans, it gives rise to the tendency to confuse correlation with causation and observation with interpretation. The human thinking organ is a causation-fixated thinking machine, unerringly designed to turn “the what,” “the how,” and “the correlated” into “the why.” Ultimately, this results in what ER calls the causation bias: the natural tendency to be far too quick to assume that the relationship between cause and effect will be ascertainable and satisfying.

 

(Theoretical explanation for why opinions seem more mandatory than is the case) The reflexive and fallacious “harmonizing” of potential causes and effects gives rise to the tendency to jump to conclusions, which thinking organs are designed to do by default. This, in turn, leads to the illusion that having an opinion on any given issue is always mandatory (see part three of the ER intro or the article this talk is based on). Both the causation bias and the fallacious “need” to have an opinion are heavily reinforced by a democratic society with a media that panders to people’s causation-fixated thinking.

 

(Why wrong beliefs are harmful) Bish talked a lot about how your conscious and unconscious minds are bent on reinforcing their beliefs by putting them in mutually supporting packages, however imperfectly. If the subconscious ensures that beliefs support each other in packages, it should be assumed that the further a belief or model of reality deviates from the truth, the more the beliefs and models of reality that come to support it will also deviate from the truth, including future beliefs and models, which can only be guessed at. Wrong beliefs lead to more wrong beliefs. Moreover, premature conclusions can unnecessarily bias you, leading you to twist facts to suit explanations rather than vice versa. Wrong beliefs, in sum, have a way of metastasizing into other areas of your thinking, impairing your thinking, models of reality, and judgment. Since plans and goals are always reinforced by beliefs, premature planning and goal setting leads to similar problems. More on this in a bit.

 

(Correctness is obviously advantageous, but don’t pay a price for lack of answers) Don’t get me wrong: Correct beliefs are certainly desirable and often perfectly possible, but don’t let yourself pay a price for lack of answers. “Forced” psychiatric diagnoses and forced explanations for mental health problems are prime examples of this. While I very much empathize with the desire for diagnoses and explanations, they may not be available, even if your thinking organ tells you otherwise. Many people who clearly have psychiatric issues don’t neatly fit into a category, and forcing a diagnosis can lead to the pernicious effects described above.

 

(Other problems with diagnoses) As I’ve said before about diagnoses, it’s very easy for them to become “pet explanations”: explanations people resort to as a matter of course without sufficient regard to relevance and applicability. As a general phenomenon, pet explanations are contrary to Extralogical Reasoning’s axiom that it’s wise to believe that avoiding wrongness is more important than being correct.

 

(Be smart about your beliefs). Some might ask, “Don’t you respect people with strong convictions?” I respect people who are smart about their beliefs. Anyone who’s smart about their beliefs knows that this is about more than simply having beliefs that are smart. Sometimes, being smart about your beliefs may mean passionately proclaiming them. But usually, it means knowing what not, or not yet, to have an opinion on. Ultimately, being smart about your beliefs is about disciplined, holistic management of and reflection upon your beliefs.

 

(Control your system of beliefs by not making it unnecessarily large) Management 101: If you’re charged with the operation of a system you have limited control over, don’t make the system unnecessarily large. Focus on what’s important to you (the beliefs you care about) and make the most of them. As I said, there are more than enough things in the world to be opinionated about; you can afford to be picky. An additional benefit of subscribing to something like ER, where you put your pride and faith (not blind faith) into your beliefs about beliefs, your knowledge of knowledge, and your understanding of understanding, is that it becomes easier to be more objective about everything else. You feel less need, in other words, to have other beliefs, and this makes you less dependent on your chosen beliefs, allowing you to be more objective.

 

 

Examples of poor management 

 

Failure to distinguish between decisions and opinions. 

  (Decisions are mandatory): Decisions are mandatory; understanding, though obviously desirable, is not always possible; and there could be serious consequences if you’re wrong. Everyone at some point will have to make a tricky decision and not know what to do; dogmas and cheap rules of thumb may be wonderful relative to the alternatives. 

 

(Making wrong decisions by demanding understanding that may not be necessary). When I was younger, I didn’t like taking advice I didn’t understand. But when tricky judgment calls need to be made, it may be necessary to take the advice of an authority, including common “wisdom,” that you’d rather not acknowledge as such. You can take someone’s advice without adopting their opinion. Don’t let your ego get in the way.

 

Believing in cures for epistemic and psychological problems 

   When you identify a tangible misconception, it doesn’t necessarily mean your “rehabilitation” is over. Since most beliefs (including biosap) are unconscious, just because a tangible misconception goes away doesn’t necessarily mean all the beliefs that supported it go away with it; they can linger for quite some time. I call these vestigial beliefs. This is why it’s easy to mistake progress for cure and/or improvement for competence. Q pointed out that even improvement is often hard to establish. Obliviousness to vestigial beliefs can curb rehabilitation. Because of this, ER thinks people, whenever possible, should check in with other people for input. As mentioned in other posts, such as the third part of the ER intro, you only have a simulation of proving right answers—convincing yourself. But however helpful other people can be in theory, they’re likewise limited, and it remains wise to believe that there is no such thing as a cure for any mental problem at all.

   

Premature goal setting. 

   (Goal setting is more effect than cause): First off, the setting of goals is much more an effect of motivation than a cause. Yes, a person who’s highly motivated does have a strong tendency to set specific, official goals, but they set goals because they’re motivated; very little, if any, of their motivation comes from the goals themselves. Confusing cause and effect is a common example of confusing correlation with causation, one of the commonest fallacies.

 

(Reminder of why wrong beliefs are dangerous) If you understand the nature of beliefs, it’s easy to infer that setting premature goals and plans is effectively the same thing as holding premature beliefs and making premature decisions. Goals lead to plans, plans to decisions, and these things are all reinforced by beliefs. This leads to distortions that impair, rather than enhance, people’s models of reality--the opposite of what goals are supposed to do.

 

(Encouraged goal-setting despite missing information): Kids are encouraged to set goals when their mental databases are full of missing information. There’s no such thing as a good decision or conclusion without knowing the circumstances to which it applies. Knowing the wrong combination of important pieces of information about something can be very misleading, easily leading to wrong answers.

 

Gaining knowledge without regard to understanding or wisdom. 

   Knowing means knowing the facts; understanding means knowing how they fit in with each other; proficiency means knowing what to do with them, which often means knowing how they fit in with other facts; and wisdom means knowing what they all are and how they fit together. Well, if you know facts but don’t know how they all fit together, don’t know what to do with them, and don’t know what any of them are—you have a self-inconsistent understanding of something that’s naturally prone to misuse, especially if knowing the facts gives you an artificial sense of confidence. Hubris is more dangerous than ignorance. There’s nothing unusual about a person incapable of learning one fact without falsely extrapolating another three. But since knowledge is “good,” people act like you can’t possibly go wrong by pursuing it. They don’t take a larger view; that is to say, the holistic one.

 

Failing to see the bigger picture following mistakes

   (Everyone has misconceptions that come in mutually supporting packages, and most are intangible): Again, beliefs come in packages, and most of these beliefs are intangible, which includes delusions, bad instincts, and bad attitudes. Everyone has these things in one way or degree or another—and would have more if they weren’t doing certain things right.

 

(See the bigger picture by connecting individual flaws/mistakes to systemic problems) Every time you make a mistake, there’s a decent chance it will offer you a glimpse of what some of these misconceptions might be. A person who’s good at learning from their mistakes knows this, and they can connect individual flaws and mistakes to greater problems with their thinking and general understandings. In other words, they don’t just learn the face-value lessons from their major mistakes; they also, when applicable, can extract larger lessons from mistakes that are in themselves trivial and/or excusable. They ask the question, “What does this mistake or misconception say about my general thinking and understandings?”

 

(Someone reveals their insanity by their inability to connect individual problems to systemic problems): Viewing individual flaws and mistakes at face value without seeing how they fit into the bigger picture is a hallmark of insanity. If the definition of one who’s irrevocably insane is someone who makes the same big mistakes over and over again and can’t learn from them, then one who never asks what their mistakes say about them in general is insane. They will view mistakes, including great ones, at complete face value. They might think, “My mistake doesn’t matter in this instance,” or, “I’m already aware of these weaknesses; therefore, I don’t need to worry about it.”

 

Confusing skepticism and falsity and failing to be self-skeptical   

   Skepticism and falsity are not necessarily one and the same. Skepticism is healthy—but that includes self-skepticism. Pseudo-open-minded people lack skepticism of others, and pseudo-skeptics lack self-skepticism. The former confuse open-mindedness with apathy, the latter critical thinking with arrogance. The insistence on opinions is reinforced by the desire to impress others and reassure oneself, desires which, of course, are especially pronounced in younger people. Few appreciate how insidious these opinions can be, and the potential benefits of the emotions responsible only complicate matters.

 

Information addiction 

(Information addiction is not true learning or consistent with ER) Information addiction is not true learning, nor is it broad and intensive studying. Remember: Disciplined holistic management of your beliefs requires direct and indirect efforts to avoid premature, wrong, and contradictory beliefs—and information addiction makes this impossible. In fact, the very idea of being so interested in knowing all this stuff largely contradicts ER’s axiom that avoiding wrongness should be treated as more important than being correct.

 

(What’s wrong with information addiction) Although it can be managed and compensated for, the reflex to jump to conclusions makes it impossible not to consider a conclusion each time you receive information. When getting information in modest quantities, especially if it’s relatively reliable and consistent, the biasing effect should be small relative to the benefits of the information. But it’s another matter in enormous quantities, when the information is speculative, contradictory, and designed to appeal to people’s causation-fixated thinking; this causes false and contradictory ideas to pervade your system of beliefs. Government and business executives who are, or could be, exposed to astronomical amounts of such information have been known to refuse a certain amount of it to avoid this type of informational overload. Informational overload also impairs the ability to discern which things are important and how important they are relative to one another, a central part of thinking, decision-making, learning, and problem-solving.

 

 

Pretending not to care what other people think

 

A major impetus behind the evolution of intelligence was the need to ascend the social ladder to gain better access to resources, including mates. To one degree and in one way or another, everyone cares what other people think. Obviously, it’s not as simple and straightforward as everyone being a social climber who wants to be cool with the cool kids. It’s not even that simple when you’re a kid. But in the end, everyone cares. In my experience, it’s generally better to pursue something questionable in a way that wisely minimizes damage than to pursue it while pretending you’re not. Again, wrong beliefs lead to more wrong beliefs; delusions lead to other delusions. Healthily accept that you care what other people think and you’ll pursue it in a more dignified way—and you may even end up caring less what they think in some ways.

 

 

ER says that all people have the same foundational attributes but vary a great deal in how those attributes manifest themselves. Understanding people means understanding both. When I was younger, I thought being wise was all about how I was different from other people. I was a freethinker; I liked analyzing things; I liked applying my intelligence to life; and I hated dogmatism and blind conformity. Naturally, these things did, and do, set me apart, and they are requirements for wisdom. But this is not what makes someone truly wise. What it comes down to is being aware of the universal flaws in human thinking and re-engineering your thinking accordingly. In the language of life engineering, this is an epistemic puzzle of life, and ER is my solution.

 

ER, as I think you can see, has a low opinion of human judgment, including that of extralogical reasoners. But the worst thing about human judgment is not that it’s prone to be bad; it’s that few realize it’s prone to be bad and reengineer their thinking accordingly. Tonight, I hope I’ve shown that reengineering your thinking to make it more holistic, and to think more in terms of avoiding wrongness than being correct, is both necessary and fairly involved.

    
