Diagnosing vs. Understanding: The Epistemology of Psychiatric Diagnostics as a Societal Phenomenon
There’s a difference between diagnosing people and understanding them. Many don’t know the difference. Often, it seems, the more one learns of the former, the less perspective one has on the latter.
KNOWING means knowing the facts; UNDERSTANDING means knowing how they fit together. Diagnosing is implicitly treated as synonymous with understanding but actually means knowing how the facts fit a preconceived model in a highly inexact science. PSYCHOLOGICALLY, understanding is all that’s required. At best, diagnosing is a means to that end. Even PSYCHIATRICALLY, precise diagnostic specificity is rarely if ever necessary. Basic reasoning alone tells you diagnostics are catalysts for confusion and bias--that is, for twisting the facts AWAY from each other.
That said, extralogical reasoning (ER), or at least its creator, does not wholesale “reject” psychiatric diagnoses, nor deny that there may be “good” reasons they exist. More than anything else, they’re inevitable, and ER has no ambition to change the system, or delusion that it or any arbitrarily large body of philosophers ever could. Bureaucracies and communications within the field(s) and the health care and legal systems are simply too constraining (and may lead to problems not treated herein). But this doesn’t mean the general population needs to use them, nor members of the field(s) be married to them.
In accordance with its revised name, Diagnosing vs. Understanding: The Epistemology of Psychiatric Diagnostics as a Societal Phenomenon, this article addresses EPISTEMIC, not bureaucratic/legal issues with diagnostics as a general societal phenomenon. Before I move on to that, however, I’d like to address general problems with the field, most of which are also inevitable.
Self-correctivity is a pool of beliefs’ ability to correct itself, especially from its mistakes. It can be used to assess the reliability of a pool’s beliefs, individually and collectively. Self-correctivity depends on the quality of incentives (especially social), clarity of objectives, accuracy of testing, mix of competition and cooperation, and age of the field and/or specific principles within it. Classical physics and mechanical engineering are highly self-corrective. Because psychology and psychiatry are inexact sciences, their self-correctivity would be low even if the psychology-related fields were “intellectually pure” and not heavily constrained by the health care and legal systems and the need to appeal to clients. Their pool of beliefs is influenced by those of society, as well. Due to the complex criteria human psychology places on beliefs, society’s pool isn’t terribly self-corrective, either. These entanglements lead to quite an epistemic mess.
Even science and engineering’s self-correctivity is limited. As I’ve said several times, ninety-nine percent of the true history of science is the stories of geniuses who spent their entire lives developing scientific theories that ultimately proved to be ninety-nine percent wrong. And I assure you, few lacked for confidence in their ideas. While I’m not suggesting people can’t be interested in the truth, an inescapable part of the human condition is that you will always be more interested in beliefs than truth--both in simply having strong beliefs and in holding particular beliefs that suit specific purposes.
Right off the bat, the mental health field(s)’ low self-correctivity makes the field’s principles suspect.
Owing to rampant knowledge idolatry, or blind worship of knowledge, the field treats knowledge as more important than wisdom and critical thinking skills. Worshiping knowledge, as opposed to merely admiring it and treating it as important, exacerbates people’s already-excessive attachment to preferred and preconceived ideas. Although most people, including knowledge idolaters, would concede wisdom is more important, because most of your learning will be dominated by fact-based knowledge, the inexorable result will be to condition yourself to prioritize it. This is epidemic in academia (see Current Thoughts on Knowledge and Society) and is especially problematic in fields with low self-correctivity, where judgment is so crucial. Knowledge idolatry and diagnostics are a match made in hell.
Problem One: Diagnoses too easily become “pet explanations”
Pet explanations are explanations people resort to as a matter of course without sufficient regard to relevance and applicability. There are great misconceptions about beliefs and reasoning. Good reasoning is context-based. Life is not so simple that everyone can go around labeling things as “good” and “bad” and straightforwardly base all their thinking and decisions around those labels. This is why wisdom is necessary. At the same time, judgment is the ability to make decisions despite incomplete information. A system must have a means of COMPENSATING for a lack of available facts, and general beliefs, rules of thumb, and even dogmas are among the potential solutions. But such compensating beliefs are often confused with beliefs intended to ELIMINATE the need for context-based thinking rather than merely deemphasize it.
In fact, this confusion is essentially preprogrammed. Evolution follows the path of least resistance. The idea of quickly figuring out what works, selecting for it, then proceeding to way over-rely on it, at the price of mediocrity and vulnerability, is such a common theme of the path of least resistance you could almost call it the path of least resistance itself. Overreliance on prior learning and preconceived ideas worked well enough (in simpler evolutionary environments) and became inherent to the thinking organs along humanity’s ancestral line.
The result: Prepackaged models necessarily precondition conclusions, biasing their users.
Problem Two: As a general phenomenon, diagnostics treat correctness as more important than avoiding being wrong.
Because confusion, inhibition, and distractions can be almost as detrimental to an animal or person’s dealings with reality as understanding it is beneficial, sometimes the feeling or belief in understanding and rationality can be almost as important as the real thing. Following the path of least resistance and not caring for truth for its own sake, evolution selected for cognitive traits that default to comfortable solutions and models of reality so long as they roughly approximate the facts. Better, it supposed, to doctor reality a little to make a tolerably simple perception of it than allow for a more "correct" one at the cost of disjointed thoughts and images. Artificial Resonance twists correlated events into causational relationships, constructing a reality based on a “harmonious narrative.” This may include suppressing information relevant to the truth. Resonance, in short, is the cognitive editing process that doctors unconscious and sensory inputs into the coherent experience known as “reality” (see summary intro for better explanation).
Resonance also works ex post facto. When new information comes in about past events, Resonance works reflexively and unconsciously to assimilate it, imparting the events with an artificial simplicity and predictability (and making a son’s Asperger’s all the more pronounced).
Resonance is a brilliant compromise that’s been wildly successful in vertebrates, but it remains responsible for most fallacious reasoning. Resonance twists correlation into causation. Since correlation usually isn’t causation and statistics only directly measure correlation, correlational and statistical fallacies are among Resonance’s most devastating logical consequences. Resonance is also responsible for the causation bias, the natural tendency to be too quick to assume the relationship between cause and effect will be ascertainable and satisfying; for confusion between observation and interpretation and, thus, jumping to conclusions; and for the tendencies to rationalize (which isn’t just emotional, ethical, or social), reinforce beliefs, harmonize the language of models of reality with the models, and harmonize models of reality with reality itself.
The causation bias and the tendency to jump to conclusions make correct beliefs seem easier to obtain and more necessary than is actually the case.
Resulting from Resonance and the need for comforting and motivating beliefs, everyone has cognitive and emotional reflexes to reinforce their beliefs in mutually supporting packages, however illogically and inconsistently. It should, therefore, be assumed that the more a belief or model of reality deviates from the truth, the more the beliefs and models of reality that will come to support it will also deviate from it, including future beliefs and models, which can only be guessed. Everyone has a similar reflex to harmonize the language of their models of reality with the models and another to harmonize their models of reality with reality itself. It should, therefore, be assumed that the less the language of models reflects the truth, the less their accompanying beliefs will reflect it.
In sum, wrong beliefs, the wrong wording of models, and undisciplined thinking and use of language result in more wrong and inconsistent beliefs.
“Forcing” diagnoses of questionable correctness leads doctors and patients to twist facts to support them, leading to more wrongness. Often a diagnosis is sought for emotional and social reasons. I don’t fail to empathize with this—I’ve been there myself—but the greater the desire for the diagnosis, the greater the tendency to reinforce it.
Resonance makes humans underestimate the dangers of wrongness and overestimate the ease and necessity of correctness. In life-related judgment, it’s at least wise to believe that avoiding wrong beliefs is more important than correctness. Diagnostics, on the other hand, is premised on the opposite, especially as a societal phenomenon.
A habit of avoiding needless wrongness cultivates disciplined thinking, and avoiding needless wrongness requires letting go of needless CORRECTNESS. There are more than enough things in the World to be opinionated about; you can afford to be picky. You won’t lose IQ points by missing a successful diagnosis or be ostracized for eschewing diagnostic vocabulary (in fact, you’ll become less glib). And employing it isn’t necessarily harmless--which brings me to the next issue.
Problem Three: As a general phenomenon, diagnostics are not well-defined and are subject to the whims of society’s ever-evolving lexicon and pool of beliefs
Labels are often misused and over-relied upon. Obviously, you don’t necessarily need words to describe the obvious (or even describe it at all), and people often become unduly attached to labels and apply them without sufficient regard to relevance and applicability. This has been discussed. But people may misinterpret this to mean language doesn’t matter. Actually, it matters a great deal.
Talking and communicating aren’t synonymous. Business is conducted using certain terminology, but people don’t always know the terms’ real effects (or if business is being conducted at all). Diagnostics is subject to the same quasi-aimless evolution as society’s other “laymen understandings.” People define the terms differently, and you convey information by the very fact you choose to say or not say something. ER and Bishko’s life engineering call the information you convey by your CHOICE of what to say/not say concomitant information. It’s generally better to use words with broad meanings than ones with DIFFERENT meanings, and words that are vague but are supposed to be specific can be dangerous. Diagnoses have different meanings that are SUPPOSED to be specific but aren’t. This presents problems for those attempting to communicate.
Until almost 40, I was weak, powerless, and confused because I couldn’t systematize and explain the things I needed to in order to function. Having undergone the necessary transition, I can tell you how much language shapes how people think. This is reinforced by the analysis mentioned in problem two.
Another issue is whether diagnostics is more centered on truth or practicality. Truth is not the only criterion for the utility of a belief. Obviously, both are important, but which diagnostics prioritizes isn’t (and probably can’t be) sufficiently clear. It becomes especially problematic when you factor in the potential effects of living in an environment hugely different from the one the species was programmed for. It may not always be apparent whether someone’s biochemically off in a way that would present the symptoms REGARDLESS of environment or whether the symptoms result from their individual response to the CURRENT one. The answer may or may not affect whether or how a person’s treated, but at least as a societal phenomenon, diagnostics don’t adequately address this.
Finally, there are additional problems with Resonance and how language can perniciously influence your beliefs and models of reality. For this reason, you should not assign familiar terms to yet-to-be-identified observations. People love attaching diagnostic labels to symptoms: e.g., “I’m so OCD like that.” In effect, ADD and OCD have become colloquial synonyms for lack of focus and obsession. This isn’t just sickeningly glib. Resonance will try to fit the words to your understanding of your psychology. You can have the symptoms of an ailment without the ailment, and this is always possible with spectrum disorders, which most disorders are. In sum, prematurely attaching diagnostic language to symptoms easily leads to false diagnoses.
Recommendations (and additional consequences for not following them)
As said, labels aren’t always necessary.
You don’t need a diagnosis to apprehend symptoms, and you don’t need a diagnosis to know what you can and cannot do. I don’t know if ADD exists, but I do know that a deficit of attention prevents me from following a lecture and reading at a reasonable speed, making it infeasible for me to be a historian.
You don’t need a diagnosis to make decisions. Whether you have the best attention span in the World or the least, if you read better in the early evening but write better in the afternoon, operate accordingly. If you focus best when there’s instrumental music in the background, play it. If you study best after a light twenty-minute jog, jog first.
You don’t need a diagnosis to connect symptoms to other symptoms. If you have trouble picking up on “social cues,” it shouldn’t be too surprising (though it’s not guaranteed) that you’re prone to offend people by speaking impulsively. When you discern symptoms, it’s wise to ask if there are similar ones you can connect them to. Life learning is learning from flaws and mistakes, and that, in turn, is connecting individual flaws to problems in your general thinking, life, psychology, etc. and adjusting accordingly.
When lengthy analysis of these disorders isn’t decision-altering, it can lead to poor epistemic conditioning, distractions, and unintentional excuse-making and justifications. Analysis of ADD, for example, might be a leading cause of attention deficit and failing to complete important tasks. However benevolent teachers’ and professionals’ intentions, this is probably an epidemic problem in special education.
Life engineering uses the term make-believe analysis (MBA) for any analysis that can’t (even hypothetically) affect a decision or improve your understanding of a concept in a meaningful way. Like sloppy use of diagnostic terminology, MBA isn’t just glib and pseudointellectual (or, in the author’s case, a source of suicidal ideations); it promotes a false sense of importance/priority. A major part of thinking and decision-making is understanding which things are important and their relative importances. Treating something with an artificial sense of importance FOSTERS an artificial sense of importance. This conditioning isn’t necessarily brainwashing, nor chiefly social and emotional—but cognitive. Due to this and Resonance, immersing in the wrong things, using the wrong language, and coming to premature conclusions compromise one’s understanding of the relevant topics, especially as relates to applications.
Ultimately, navigating a complex World with an imperfect thinking organ is more about reasoning skills than knowledge of general psychology, which is the exact opposite of what the field’s methodologies and pedagogy emphasize. This is not to say that no INDIVIDUAL professionals understand that wisdom is more important, or that ER can replace knowledge of psychology, but diagnostics and the fields’ emphasis on knowledge create problems, nonetheless.
This might be an inescapable consequence of university education. Fact-based knowledge is far more abundant and much easier to teach and assimilate into university structure than ER would be, and humans are more influenced by conditioning than teaching. If students spend far more time immersed in something less important than in something more important, then regardless of what authorities SAY—or even what the students end up believing INTELLECTUALLY--the conditioning will encourage the opposite WORKING belief. In other words, there’s often a large difference between people’s intellectual beliefs and their WORKING beliefs, and the former is more influenced by teaching, the latter by conditioning.
But just because these problems are inevitable doesn’t mean they’re inevitably YOURS (e.g., you’re not destined to diagnose every offbeat person as “on the spectrum”). Your thinking is blighted by societal conditioning and an imperfect thinking organ operating in an environment it wasn’t designed for; epistemic rehabilitation requires self-conditioning and reengineering. This article is merely an introduction. If you’re serious about knowing how to think, measures must be taken that go beyond the intellectual. For that, I refer you to the summary intro.