The Epistemology of Irrationality 2.0
The following is the second edition of The Epistemology of Irrationality; the first edition has been removed. The two articles are very similar, but the current version is longer, posits more signs of irrationality, and has additional theses.
The pinnacle of irrationality is the idea that a person could ever be fully rational. Thus, it could be said that a definition of irrationality is the failure to ask, “Am I irrational?”
Rationality, to whatever degree a human can possess it, is largely the ability to learn from mistakes and misconceptions (and/or success at doing so), including those of other people. All misconceptions should be considered irrational traits, however common or trivial. A person’s biology plays a major role in how they reason, and some people will manifest characteristic traits early. But in the end, no three- or four-year-old is rational; they gain rationality because they LEARN. To learn, one must ask the question—is this irrational?
Naturally, it’s good to have right answers, but it’s a much smaller part of good judgment than generally assumed. Being good at avoiding WRONGNESS, on the other hand, is vastly underappreciated. It’s harder to be right than most think, as reality is less comprehensible than usually believed, and it’s also less necessary. A competent thinker is good at avoiding being wrong in ways that matter, and they’re good at correcting their mistakes—or wrongness—when they inevitably occur. Misbeliefs should generally be considered more dangerous than correct answers are beneficial, and hubris more dangerous than ignorance.
This post has three theses undergirding a fourth major thesis:
One, rationality and proficiency/success at INTUITIVE life learning are nearly one and the same. Though obviously productive, articulating and systematizing them as I have is different: Doing so is more of an intellectual process, doesn’t say as much about one’s WORKING understanding of life learning/philosophy as most think (especially Bishko), and involves more intelligence, which is only an ADVANTAGE in cultivating rationality. Two, irrationality/insanity tends to be strongly related to a lack of integrated understanding of (or serious weak-links IN) one’s thinking, psychology, and life. Three, as a result of the first two, life learning is largely the ability to connect individual mistakes and misconceptions (including those of others) to one’s general thinking, psychology, and life. Finally, as a result of the previous theses, life management and learning are holistic in nature.
Life, like Nature, is a complex system: It has countless known, unknown, and unidentifiable variables undergoing complex interactions. Economies, ecosystems, and societies are other examples. And complex systems can ONLY be understood with HOLISTIC analysis—analysis of the WHOLE. Unlike reductionistic analysis, or analysis of the PARTS, holistic analysis assumes neither parts nor whole can be accurately assessed by a straightforward analysis of the parts—only by seeing how they fit TOGETHER.
Nothing in life exists by itself in an isolated universe: Beliefs, decisions, models of reality, goals, attributes, pieces of knowledge—all interact with themselves and each other in complex and dynamic ways. It’s unwise to think you can reliably understand the effects of any one of these things by looking at it by itself; you must see how it fits in with everything else.
In fact, an INTEGRATED understanding is tautological: Any TRUE understanding IS an integrated understanding. UNDERSTANDING, as opposed to KNOWING, necessarily means knowing how the facts or parts fit together. Even in systems that can be ANALYZED reductively, such as what I call COMPLICATED systems like automotive systems, experts very much know how the parts fit in with each other.
If life can only be understood and properly managed by a holistic approach, irrationality must be the failure to do so.
An integrated understanding, however, is far from just intellectual; it requires a comparatively high consistency between one’s intellectual and working beliefs, which, in turn, requires great consistency on a conscious and unconscious level (at least by a reasonably attainable standard; having a fully consistent overall belief set is a total physical impossibility). Although some issues are more important than others and priorities vary from person to person, one must understand all aspects of their life and thinking and how it all fits together. For example, if someone is bipolar, their individual bipolar dynamics may be paramount, but their rehabilitation must extend to--and be integrated with--all areas of their life, thinking, and psychology generally. While a holistic approach to these things is hardly novel, it remains underappreciated and will be addressed at length.
Extralogical reasoning axiom: Insanity/irrationality is never entirely local, nor total. Irrationality is always supported by other irrationality, but save for extreme cases, delusional people will still be rational in some ways, even if only to rationalize their delusions. Thus, irrationality is always found alongside rationality. This makes delusions harder to find.
However impulsively and illogically it may often be done, the conscious and unconscious minds are constantly trying to reinforce their beliefs in mutually supporting packages. It should, therefore, be assumed that the more a belief or model of reality deviates from the truth, the more the beliefs and models of reality that come to support it will also deviate from it, including future beliefs and models of reality, which can only be guessed at. Wrongness, in other words, can metastasize into other areas of your thinking, leading to more wrongness. Allowing yourself to be delusional about one thing causes delusional reasoning to spread into other areas of your thinking, impairing your models of reality and ability to make decisions.
As you should be able to infer, holistic management of BELIEFS is a major part of holistically managing your thinking, decisions, and psychology—and, ultimately, your LIFE.
Extralogical reasoning axiom: The “understanding machinery” (the cognition involved in understanding things) of the human thinking organ and the “delusionary machinery” (the cognition involved in delusion) are probably more alike than different; and the more powerful the former, the more powerful the latter. All else being close to equal, smart people might be better at apprehending truths, but they’re also better at creating packages of beliefs that protect themselves from truths they want to be oblivious to.
Extralogical reasoning axiom: It should be assumed that the more powerful and (perceived to be) necessary a delusion, the more beliefs, true or false, are required to support it.
Note that all the axioms above involve inconsistencies in beliefs/packages of beliefs.
The Six Major Red Flags of Insanity/Irrationality
Any form of bad thinking could be a sign of insanity. Bishko once said, “To say bad thinking is a form of insanity is almost the same thing as saying bad thinking is a form of bad thinking.” Something similar could be said of immaturity. Therefore, one must keep the number of red flags and signs of insanity/irrationality to a relative minimum.
One: Doing the same thing over and over, failing to get what you want, and continuing to do that thing while expecting different results
This is a cliché, but it’s no less true—and it’s one of the very few I enjoy hearing. What I’ve never heard anyone say, however, is that epistemically, it’s usually the result of a failure to heed the scientific method—or look at the data. Many people keep doing something because they think it “SHOULD” work, or err by ignoring what’s happening or failing to act because it “SHOULDN’T” be happening.
In my experience, few use the scientific method. In fact, important studies have shown that SCIENCE doesn’t use it nearly as thoroughly as believed. The overwhelming majority prefer speculations. Some speculations may even have a basis, but speculation can only be intelligent speculation if it’s recognized for what it is.
Based on the number, intangibility, and complexity of the interactions of the potentially relevant factors/variables involved in most things, predicting results and explaining reality are more speculative than most believe. The natural world is filled with phenomenologies: readily observed phenomena that can’t be adequately explained based on existing theories/explanations.
Not all successful salesmen are flashy and glib, and many flashy and glib salesmen aren’t successful; many different types of guys have been successful with women, and many who SEEM like they should be are not; many different types of athletes and techniques have been successful at the same sport/position; not all people who seem smart and logical—or who ARE smart and logical—with an INCLINATION for mathematics are actually talented at it; not all personable and attractive teenagers who want to be socially successful are (or are as much as assumed), and not all socially successful teenagers are personable and attractive; many people who are clearly mentally ill don’t fit a particular diagnosis; plenty of married couples with similar philosophies can’t agree on how to apply them to their relationship and, especially, their children; having the right beliefs, intentions, and strategies by no means guarantees success; many medical phenomena don’t fit contemporaneous medical theories; and physics is largely based on attempts to explain phenomenologies.
The scientific method exists every bit as much due to human limitation as to brilliance. Science couldn’t be successful without the latter, but its method exists because of the former. A priori theorizing has limited reliability regardless of the human performing it. You have to look at the data. If you think the theorizing of geniuses is always successful, then you haven’t looked at the data from science: Ninety-nine percent of its true history is the story of geniuses who dedicated their lives to developing theories that ultimately proved to be ninety-nine percent WRONG.
The phenomenological fallacy occurs when someone denies data or its default conclusion because it can’t be explained in satisfactory terms. Deliberate and unconscious cherry-picking of data is common throughout all human endeavors. In addition to people having preferred beliefs, this is due to what extralogical reasoning calls the causation bias: the natural tendency to be far too quick to assume the relationship between cause and effect is ascertainable and satisfying, even amid randomness.
You don’t always have to have satisfying explanations to make decisions and model reality.
Diagnosis is not necessarily a prerequisite for physical or psychological treatment; you don’t necessarily need to know why a relationship does or doesn’t work to know whether it does or not; you don’t necessarily need to know why a strategy does or doesn’t work; you don’t necessarily need to know why you are or are not good at something; and you don’t necessarily need to know the root cause for irreconcilable differences.
Yes, things need to be given their due chance, and things tend to progress, change, and improve nonlinearly, making sudden and dramatic change or improvement common. However, people are programmed to believe the opposite. ER calls this the linear illusion, and it’s a common reason why people tend to overestimate what they can achieve in the SHORT term and underestimate what they can achieve in the LONG term. Talent can be dormant: There can be psychological problems and/or miscommunications between teacher/coach/boss and student/athlete/employee that can lead to certain types of ruts, and some natural abilities take longer to develop than others. But in the end, without confirmatory data, prognoses, proposed strategies, and the like are just hypotheses. This is especially so given that they’re often non-falsifiable, meaning at best they can be proven right but never proven wrong; and a claim that can’t even in principle be disproven should only carry so much weight.
Should someone permanently quit or move on when they reach what APPEARS to be the precipice of committing red flag one? No. But certain questions must be asked:
"Is there something to lose by continuing?" "Am I enjoying it or possibly still getting something out of it, even if not what I'm ostensibly supposed to?" In other words, are there things going well or are there auspicious signs, even if they're not what would be most satisfying? "Am I getting frustrated?" “Is there a sufficiently NEW strategy I can try?” And ultimately, "Am I just going to be banging my head against the wall by continuing?"
If the answer is "yes," then at least a prolonged break is in order. And usually, especially if this juncture has been reached before, one should probably only return in a casual, low-risk scenario and/or with a sufficiently different approach.
Theoretical analyses can fail or partly fail for any number of reasons: failure to account for the complexity and unknowability of reality; failure to account for unknown variables; confusing correlation with causation; the fallacy of indications, i.e., confusing the indications of a property with the manifestation of the property itself (e.g., confusing the symptoms of a disease with the disease, or the symptoms of talent with the manifestation of talent itself); etc. When it comes to analyses of talent, for example, especially those performed by young people, there tends to be an excessive focus on the most ostensible and glamorous attributes and a shortage of consideration of potential weaknesses and the importance of mundane and nuanced abilities (see the article on judging talent). Potentially relevant weaknesses include “potential weak-links”: areas where one doesn’t necessarily need to be strong, but where being weak is a serious problem. Many things are “weak-link-oriented,” laden with potential weak-links.
One of the most naïve and dangerous beliefs a person can possess is the idea that, just because they WANT to help a situation and MIGHT have some of the HYPOTHETICAL abilities or resources to help, trying to help can’t hurt. Some things are better left alone. Naïve interventionism—that is, naively interfering in a situation to help at the risk of causing harm—can be a manifestation of this red flag. Naïve interventionism has ruined relationships, wrecked economies, and literally killed people. Trying to reconcile irreconcilable differences that don’t need to be reconciled, for example, is one of the most socially dangerous things a person can possibly do.

Because of the primitive, unconscious, and self-inconsistent nature of the human thinking organ, it’s impossible, even for a philosopher such as I, to catalog all of one’s beliefs. Any belief that’s of significance to a person will be part of a package (or packages). Packages may be riddled with contradictions and illogicalities, but important beliefs and pieces of knowledge will always be inextricably entangled with others. Thus, it’s almost always easier to agree in principle than in practice—and this assumes that people are making a genuine effort to communicate and get on in the first place.
Naïve interventionism is symptomatic of insanity because it’s often a manifestation of the inability to distinguish between one’s beliefs and intentions and one’s actual behavior (discussed further under red flag four). Naïve interventionists are so caught up in their emotions they can’t distinguish between their beliefs and intentions and REALITY.
One can also be guilty of red flag one BY ASSOCIATION. If someone close to you continues to make the same mistake again and again in a way that affects you, continuing to interact with them in the relevant area(s) means you’ve now taken on their insanity. I’ve been in situations like this, and the results were absolutely disastrous. While I was aware of how insane the involved parties were, I continued to interact with them because I thought I had almost nothing to lose. Wrong. If nothing else, one always has their SANITY to lose (assuming, of course, they have some to begin with).
When demonstrating red flag one, rationalizations are usually found to justify continuing to do the same thing. Packages of beliefs are disentangled and re-formed, leading to inconsistencies in one’s understanding of their thinking, psychology, and life. In such cases, people’s understandings of the involved parties, activities, and/or facets of reality will likely undergo repeated iterations to justify past failures, making questionable beliefs in the relevant areas a sign of red flag one.
Moreover, as touched upon, when people commit red flag one, they often allow themselves to take not-insubstantial risks because they insist that ascertainable and satisfying answers are available. In other words, they do the same thing again and again without desirable results because they’re convinced that the actual results can always be EXPLAINED. Obviously, any good problem-solver will make efforts to understand why things go wrong, but they’re also good at recognizing when such efforts have been exhausted or should be suspended. Part of avoiding wrongness is avoiding paying a price for a lack of answers.
Forcing answers, such as psychiatric diagnoses, is not necessarily without cost. I don’t dismiss any of the motivations for satisfying explanations, including social and emotional ones, but thinkers must heed the ER axiom stated earlier: the more a belief or model of reality deviates from the truth, the more the beliefs and models of reality that come to support it will also deviate from it. Wrongness can metastasize into other areas of your thinking, leading to more wrongness.
Clearly, someone committing this red flag is not learning from their mistakes.
Two: A disturbing tendency to overestimate what one can achieve in the short term and underestimate what can be achieved in the long
This is the counterpart to red flag one. Impatience, lack of life experience, and the linear illusion (the illusion that Nature changes linearly rather than nonlinearly) can be causes—but so can irrationality. Mania is addictive, and when mania becomes an end in itself rather than a means to an end, mistakes such as the above become all but inevitable. Many manic depressives have difficulty imagining motivation or progress in the absence of success-induced mania—or failure in the absence of depression and despair.
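To make the linear illusion concrete, here is a minimal numerical sketch; the 1%-per-day compounding rate is an invented assumption for illustration, not a figure from ER:

```python
# Minimal sketch of the linear illusion (illustrative numbers only).
# A hypothetical skill compounds at 1% per day; linear intuition
# instead expects the daily gains to simply add up.

DAILY_GAIN = 0.01  # assumed rate, chosen only for illustration

def compounded(days: int) -> float:
    """Actual multiplier when improvement compounds."""
    return (1 + DAILY_GAIN) ** days

def linear_guess(days: int) -> float:
    """What linear intuition predicts: gains just add."""
    return 1 + DAILY_GAIN * days

for days in (7, 30, 365):
    print(f"{days:>3} days: linear ~{linear_guess(days):.2f}x, "
          f"compounded ~{compounded(days):.2f}x")
# At 7 days the two are nearly identical, so short-term hopes disappoint;
# at 365 days compounding yields ~37.8x vs. the intuited ~4.65x,
# so long-term potential is drastically underestimated.
```

The sketch directly captures the “underestimate the long term” half; the “overestimate the short term” half follows when expectations, manic or otherwise, are pitched above either curve.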
Three: Similarly minded people with sufficient information are coming to drastically different overall conclusions about your rationality
The third largely speaks for itself and is mostly a symptom or manifestation of the fourth red flag. However, it’s worth noting that this can be used as a reference point for one’s rationality and rehabilitation. When one is trying to manage certain issues—such as mania or anxiety generally—it can often be difficult to establish how much progress one is making: After all, you’ve only been yourself, and it’s hard, if not impossible, to have perspective.
An application of this is what I call the blowhard test. To gauge one’s rationality, a person can ask, “Minus my claims, SHOULD my likeminded friends think my claims are sincere, or should they think I’m a blowhard?” All else being anywhere near equal, non-blowhards will tend to make a point of operating to ensure they pass. For example, if they have to delay working on a project, they'll be careful not to talk (and especially brag) too much about it, because the combination of talking and not doing is characteristic of a blowhard. Self-conditioning the right thinking, attitudes, and behavior is important, and all else being anywhere near equal, it's better to maintain good credibility with your peers.
Four: A disturbing disparity between one’s KNOWLEDGE of issues related to their life and rationality and their UNDERSTANDING.
This is a very straightforward manifestation of the definition of irrationality proposed in the introductory paragraphs—a disturbing lack of an integrated comprehension of one’s thinking, psychology, life, and related issues. Such a person may have a sense of what their weaknesses are but have little sense of how they FIT TOGETHER--or how MUCH they affect their ability to run their life.
The above explains what people sometimes call “disconnects.” Disconnects are caused by thought disorders or cognitive forms of mental illness, unresolved psychological issues, the immediate and long-term effects of mental illness, bad epistemic instincts, and/or blivots—beliefs/packages of beliefs created to protect people from truths they’d rather not be aware of.
Remember, beliefs come in packages, including delusions and bad instincts. Blivots have a way of disentangling packages of otherwise well-integrated beliefs. In the “restructuring” of packages, blivotal logic will metastasize through a person’s thinking, distorting their models of reality and impairing the victim’s thinking and decision-making capacities. But tread lightly, for not all the blivotal packages will be illogical; some may even make perfect sense, even if their origins are questionable.
Every time you make a mistake or demonstrate a bad instinct, there’s a good chance there’s a GENERAL lesson to be learned. Typically, the bigger the mistake, the greater chance it’s the result of more general issues. A disinclination or obliviousness to how individual mistakes indicate more general problems is a serious red flag.
Extralogical reasoning believes that just as your attitudes and thinking affect your behavior, your behavior affects your thinking and attitudes (albeit less so). Self-conditioning through practicing the right thinking and behavior in all areas of your life is essential to your development. In addition, good management of your thinking requires avoiding being wrong. That victims of red flag four fail to see this as bad for their self-conditioning and life management is another red flag.
The sub-ultimate cause fallacy occurs when someone confuses a trigger, catalyst, or immediate (sometimes called proximate) cause for an ultimate cause. A trigger is a stressor that sets into motion events already on the edge of being set into motion; a catalyst is a stronger, more “involved” stressor that provides a path or context for setting into motion events prone to occur; an immediate cause is a combination of a catalyst and a secondary cause; and an ultimate cause is the “true” or primary reason why something happens.
Usually, psychological episodes are accompanied by sub-ultimate factors. People with unresolved psychological issues often blame episodes on sub-ultimate causes. This can be a sign of rationalization and denial (of the issue itself and/or the need for certain treatment) or, in some cases, just plain bad thinking. But it’s not as simple as it sounds. Because of the causation bias and the HTO’s tendency to try to make sense of reality and rationalize its actions and feelings resulting from artificial resonance (the cognitive mechanism in animals responsible for minimizing confusion at the cost of understanding oneself, reality, and causality), the mind will reflexively “latch onto” whatever factors can hypothetically explain the emotions accompanying the crisis; and stress increases a person’s need for satisfactory answers. Perhaps in part because of this, you can find people who acknowledge the MAGNITUDE of the irrationality of their responses to events without acknowledging that there is a distinct psychological issue.
Failure to discern an ultimate cause is a failure, or sign of a failure, to see the bigger picture and find a common origin for various behaviors.
Just as being an “emotional wreck” can be correlated with the above red flag, so, too, can too LITTLE emotion. Evolution doesn’t integrate. It ASSIMILATES. It only adds functions to the extent that they complement or enhance what’s already there. What was already there prior to the rise of sentience was mostly primitive and reflexive. Emotions and intellect are meant to work together—but not too much, not too little. Ideally, a person should have a negative gut reaction to making a mistake, just enough to prompt them to perform the necessary rehabilitation and reflection. Just as you don’t necessarily need to feel hungry to get adequate nutrition, you may not NECESSARILY need a negative gut reaction to correct your mistakes; but if you don’t tend to have one, more wisdom and discipline are required. Such people often don’t care about being wrong—another sign of irrationality.
Oftentimes, victims of red flag four have little inclination or ability to distinguish their beliefs, intentions, and plans from their behavior and actions—itself a red flag. Having the right beliefs and intentions is obviously important and should be accounted for, but they don’t guarantee anywhere near as much as most think—and what matters most in life is what you actually DO. Victims can’t see the “disconnects” in their thinking and psychology—otherwise, they’d be “connected.” Bishko called precognitive entrenchment the curse of thinking one’s reality is ALL reality. All else being close to equal, the greater the emotions, the greater the precognitive entrenchment. Sometimes emotional wrecks get so caught up in their emotions they’re blinded to the external world.
The defining attribute of simplemindedness is the inability to imagine things in any way other than exactly the way they are or the way they’re supposed to be. If the presumed results of certain intentions and feelings are considered automatic—that is, if the RESULTS are necessarily what they’re supposed to be—this makes red flag four a red flag for simplemindedness and the definition of simplemindedness a red flag for insanity. As ER has rammed down its readers’ throats, no matter how good your resources and nominal plans, intentions, abilities, etc., it’s unwise to believe there’s such a thing as a guaranteed benefit or a guarantee of success. Yet almost everyone insists otherwise.
Sometimes, it’s difficult or impossible to even have a conversation with the victim about the inconsistencies. Unwillingness to face certain issues and/or acknowledge the unconscious reliance on blivots requires denial, usually resulting in a tendency to refuse to answer pertinent questions (a secondary red flag). As a general rule, disinclination to answer simple questions is a clear-cut sign a person is hiding the illogic in their thinking.
Five: A disturbing tendency to plan way too late and/or way too early—especially alternating between the two.
Most would agree that planning too late is a sign of irrationality and indiscipline.
Recall that wrongness leads to more wrongness.
There’s no such thing as a good decision or conclusion without knowing the context in which it was made, and decisions and plans are always reinforced by beliefs. Planning too early is, therefore, basically the same thing as having premature beliefs and making premature decisions, which are biasing and easily lead to wrong answers. As mentioned, holistic management of one’s beliefs is central to holistically managing one’s life. Excessive planners are probably failing to see the holistic nature of beliefs, which alone makes their management of everything else suspect.
Even disregarding ER’s belief that the setting of goals is more of an EFFECT of motivation than a CAUSE, don’t commit the fallacy of assuming that, just because setting goals might indicate ambition, the goals themselves will be useful. A good symptom may ONLY be a good symptom; it may or may not have any intrinsic value. Goals are also set by blowhards and those seeking excitement and motivation without regard to consequence, which naturally makes goal-setting symptomatic of the pursuit of mania as an end in itself (which isn’t an official red flag only because it’s too obvious). This means excessively early planning can be symptomatic of impulsivity as well, especially if plans frequently change and are combined with too LITTLE planning.
Six: Failure to see any of the others as red flags.
In other words, failure to ask, “Am I irrational?” and, therefore, to learn from your mistakes. Or, in more nonclinical cases, “Am I wrong about something?” If you deliver someone an onslaught of arguments and criticisms that they’re helpless to defend against and they won’t consider the possibility they’re wrong about ANYTHING, it’s an instance of the fourth and probably third red flags, in addition to the sixth. In some cases, people try to create the appearance their thinking isn’t fucked by admitting they’re wrong, but only about something that’s mostly “self-contained”: a fault they can view in isolation that is less likely to lead to a deeper inquiry.
A good life learner will not only learn the essential lessons from their most consequential mistakes, but, when applicable, will extract general or “larger” lessons from mistakes that are in themselves inconsequential and/or excusable. Good thinkers can do this because they have a good enough sense of how their beliefs are connected to figure out where they AREN’T. This is not to say that prioritizing isn't important. Some issues will require more attention than others and these priorities can vary a great deal from person to person; but effective management of your thinking and psychology is always holistic.
Additional Signs of Insanity
Inconsistencies in BEHAVIOR too disturbing to be explained by indiscipline, indulgence, and/or hypocrisy.
This is usually the result of red flag four (disturbing inconsistencies in beliefs).
The manic person who alternates between planning way too early and late (red flag five); the paranoid conspiracy theorist who thinks the government is constantly spying on everyone but still threatens to bomb the White House on the phone; the person who, in some ways, goes to great pains to protect their reputation but still makes a habit of committing horrible social blunders.
Irrational evaluations of OTHER PEOPLE’S lives.
Bishko said, “It’s impossible to apply any thinking to your own life without thinking it’s at least to some extent applicable to everyone else’s”; delusions are always supported by other delusions, which often pertain to the external world and other people; and beliefs require reinforcement.
Examples include the addict who’s always justifying other people’s addictions; the grandiose person who encourages other people’s grandiosity.
An inability to manage goals, standards, and expectations.
This is often a sign of mania, obsessiveness, and delusions of grandeur. Mild or hypomania tends to be useful for motivation—but not when aggrandizements start becoming delusions. Mania is addictive, and people who suffer from its immediate and long-term effects make decisions and hold views of happiness and success excessively based on it. And once again, it’s a dangerous end in itself.
The psychological dynamics of “feeling good and productive” and pursuing long-term goals are easily effective enough for people to be successful, but they’re far from perfect. A large part of happiness is not so much the TOTAL amount of success or gratification a person has achieved or how good their life is NOW, but how quickly they perceive it to be IMPROVING—greener pastures, as some say. Secondly, it’s believed that people receive more joy from the NUMBER/FREQUENCY of successes than the sum TOTAL of their success—they usually get more joy winning ten grand a week in the stock market for a year than winning it all at once, for example. Note that perceived improvement is heavily influenced by the frequency of success. Thirdly, the same is true of failures and disappointments—most people get more disappointment from LOSING ten grand a week for a year than from losing the same total amount at once. Finally, failure causes a disproportionate amount of grief compared to the happiness success brings.
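These dynamics line up with a concave, loss-averse value function of the sort prospect theory describes. Here is a minimal sketch, assuming a square-root value curve and a 2x loss-aversion factor (both invented for illustration), simplified to ten wins of ten grand versus one win of a hundred grand:

```python
import math

def felt_value(outcome: float) -> float:
    """Hedonic impact of a single gain or loss: concave in gains,
    steeper (loss-averse) in losses. Both shapes are assumptions."""
    if outcome >= 0:
        return math.sqrt(outcome)
    return -2.0 * math.sqrt(-outcome)  # losses assumed to weigh ~2x

# Many small wins vs. one lump-sum win of the same total:
print(10 * felt_value(10_000))   # ~1000.0  (ten separate $10k wins)
print(felt_value(100_000))       # ~316.2   (one $100k win)

# Many small losses vs. one lump-sum loss of the same total:
print(10 * felt_value(-10_000))  # ~-2000.0 (ten separate $10k losses)
print(felt_value(-100_000))      # ~-632.5  (one $100k loss)
```

Under any such concave curve, frequent small wins feel better than one big win of the same total, frequent small losses feel worse than one big loss, and losses outweigh equal-sized gains—the disproportion in grief noted above.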
The conscious may not be aware of these things, but the subconscious is—or at least enough to allow them to wreak havoc on your life. Manic people often pursue gratification to meet the above criteria at the expense of optimal decision-making, causing them to put off doing certain things while over-focusing on others, etc. They may alter their beliefs about the methods they need to achieve their goals if certain measures bring more gratification than others. For example, an athlete might overemphasize SUPPLEMENTAL training methods if he thinks he can improve more at them than from his sport.
Bishko called an Inversion an instance in which a means to an end becomes an end in itself. In some cases, an inclination toward obsessiveness manifests by showing how easily it’s DIVERTED. A wrestler I know became so obsessed with supplementary endurance training that he started regularly skipping wrestling practice to run six-hundred-yard dashes, even though wrestling, not middle-distance running, was his sport, and wrestling itself is still the best form of conditioning FOR wrestling (supplementing it with other forms of conditioning allows an athlete to get more TOTAL exercise without getting overtrained).
Everyone has a natural tendency to overestimate what they can achieve in the short term and underestimate what can be achieved in the long term. Whatever maximizes gratification today or this week may or may not be what maximizes results over the long run. Bad days—or even weeks or months or years—are inevitable, and progress is usually nonlinear. There may not be such a thing as “trying too hard,” but there’s certainly such a thing as trying stupidly. “Trying too hard” is a good example of an explanation that’s wrong in WORDING but often correct in INTENDED meaning. In actuality, the problem is a flaw in one’s understanding of what’s required for proficiency/success at the relevant activity, and this, in turn, is usually the result of a flaw in mindset or mismanaged expectations. For weightlifters, powerlifters, and athletes performing strength training, fixating on daily record-breaking can lead to overtraining, a lack of total training volume, and, especially, debilitating ruts. Since there are limits to everyone’s potential, having excessively high standards and goals is a problem, but it’s less problematic (or at least less common) than having excessively high expectations.
Believing in Cures and Guarantees
ER axiom: Just because something’s possible doesn’t mean it’s likely, and just because it’s likely doesn’t mean it’s something you should RELY on. ER axiom: It’s extremely unwise to think something will happen TO YOU; think you’ll have to make it happen.
Mania and delusion can obscure the possibility of things going wrong. You may not always know HOW things can go wrong, but it should always be assumed they can. The benefits of having the right beliefs, plans, and intentions aren't as readily realized as most think; and anyone who thinks you must believe in guarantees and cures to be confident is confusing hubris with confidence and obliviousness with optimism. Among other things, optimistic solutions and outlooks, like simple ones, should still sufficiently explain the facts. Even if in many cases the benefits of simplicity or optimism outweigh a certain deviation from the facts, this is something that should still be handled in a deliberate and self-aware way; one should not simply pretend inconvenient facts don't exist. These misconceptions are common, but they're pronounced in those suffering from delusions. Obliviousness to the ways things can go wrong includes denial of or obliviousness to one's weaknesses and inattention to mundane details/tasks.
In other words, believing in guarantees isn't just a SIGN of insanity; it IS insane. Believing things are all but guaranteed when they aren't leads so easily to indiscipline, complacency, delinquency, etc., that even if there's only a tiny chance you're wrong, it isn't worth it. Usually, there’s everything to gain by denying guarantees and nothing to lose. If you deny a guarantee, you don’t need to be right, but if you believe the guarantee, you do. Genuinely ambitious people seek out ways to motivate themselves; belief in guarantees encourages the opposite. If you point out the ills of believing in a guarantee and someone responds by squabbling that it really is guaranteed, they’re probably protecting the FANTASY of success.
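The asymmetry can be put in toy expected-value terms; the probabilities and payoffs below are invented purely for illustration:

```python
# Toy payoff comparison: believing vs. denying a 'guarantee'.
# All numbers are invented; only the asymmetry matters.

p_success = 0.9  # even a very likely outcome is not a guarantee

# Believer: complacency shaves a little off the good outcome
# and leaves them fully exposed when the 'sure thing' fails.
believe = p_success * 90 + (1 - p_success) * (-100)

# Denier: keeps acting as if success must be MADE to happen,
# so they capture the full upside and cushion the downside.
deny = p_success * 100 + (1 - p_success) * (-20)

print(f"believe the guarantee: {believe:+.0f}")  # +71
print(f"deny the guarantee:    {deny:+.0f}")     # +88
```

On these (invented) numbers, denying the guarantee does better in both branches, which is the sense in which there’s everything to gain and nothing to lose by denying it.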
Believing in cures is, likewise, a sign of obliviousness to ways things can go wrong.
Extralogical reasoning principle of habilitation: When you're trying to correct certain types of problems in your thinking and psychology, it's almost always way easier to know when you're making PROGRESS in correcting or lessening the problem than it is to know whether or not you're cured or whether you've become competent in ways you weren't previously: You've only been one species and only one MEMBER of that species; it's hard, if not impossible, to have perspective.
Since beliefs come in packages, just because the primary or articulable misconceptions go away doesn't necessarily mean all the beliefs that supported them also go away; these can linger indefinitely. Extralogical reasoning calls these vestigial beliefs. The subtle nature of vestigial beliefs makes it all too easy to mistake progress for cure and/or improvement for competence; and this mistake, in turn, often leads to relapses. Thus, extralogical reasoning recommends the policy of never believing in cures for ANY form of psychological or epistemic problem, even if it's as relatively innocuous as a bad decision-making instinct. Because red flag four is often a failure to see the extent to which bad instincts and delusions can infect one's GENERAL thinking, failure to see the folly in believing in cures is, in turn, a sign of red flag four.
Lastly, being too quick to think a problem is cured can be a sign of denial. For example, someone might engage in a certain irrational behavior on a weekly basis, then stop for two months and start acting like the problem’s gone. However, it may not say AS much about the rationality of a younger person, since the young are often too quick to think current trends are permanent—then again, this is, likewise, a sign of irrationality.
Assuming things are always the way they tend to be or the way they’re supposed to be
This could be considered the definition of simplemindedness, but, naturally, it’s not terribly rational, either. Since no one would actually say or admit they think this, it can be a sign of red flag four (disturbing inconsistencies in beliefs). Its most obvious manifestation is the quick assumption of rational agency: that people and events will always be directed by rationality and human organization. The Sham is the widespread con/delusion that rational human agency predominates in society.
Disinclination to answer simple questions
Unwillingness to answer simple questions is a sign that a person’s hiding illogical thinking. This often comes up in conversations about someone’s irrationality, and if it does, it indicates an unwillingness to accept or deal with the problem. Sometimes people get so fixated on certain things you can’t have the same conversation—which is the next sign of irrationality.
A tendency to get so fixated on certain topics they can’t have the same conversation with the person they’re (supposedly) speaking to
Emotional wrecks or people in an irrational state often get so fixated on topics, arguments, ideas, etc. that they can’t get off them—even when they’re peripheral to the nominal topic. This often manifests when someone can’t accept counterfactual arguments. For example, sometimes you might want to make a specific point and you’ll say something like, “Well, even IF A were true, B would still be true.” You might even be fully AGREEING with them that B is true and A is not; you’re merely reinforcing the truth of B. If the person is so fixated on A’s falsity that they can’t get off it, or they start accusing you of believing A when you’ve made clear you don’t, they’re probably too irrational to have the conversation.
Inability and/or refusal to see things from another’s point of view, especially during disputes
The precognitive entrenchment of emotional wrecks often makes it impossible for them to see things any other way. Deniers of irrationality might not want to take the other view if they think it leads to facing their illogic.
Being too quick to think present trends are permanent
As noted, this is frequently demonstrated by young people, but it could also be a sign of impulsivity, impatience, mania, and/or pessimism in adults.
While there may be great variation in how and how much they affect individuals, everyone’s been guilty of most of the irrationalities discussed herein. Since having a fully integrated belief set is impossible, one can only be so rational. However, if a person really can’t learn from their mistakes and/or CAN’T acquire sufficiently consistent beliefs, they’re likely suffering from a more deep-set form of irrationality, such as clinical mental illness or a cognitive issue.
A famous writer once wrote, “Good judgment is a question of experience, and experience is a question of bad judgment.” Similarly, rationality requires experience at irrationality—and learning from it. Success may indicate greater learning, but if you avail yourself of the lessons, you learn more from failure: Beliefs unsupported by emotions and unconscious beliefs don’t have much impact on people’s actions and general perspectives; people are disproportionately affected by failure; and, as discussed in the introductory portion, good judgment usually has more to do with avoiding harmful wrongness than with ascertaining correct answers. This means you can also learn more from the failures of OTHER PEOPLE than you can from their successes—such as mine, expounded over the many pages above.