Current Thoughts on Knowledge and Society
As an epistemologist, a student of knowledge, my understanding of learning and thinking is ever-evolving. I realized recently that there are a few issues I haven’t adequately addressed and wanted to improve extralogical reasoning’s treatment of them. For one, I’ve been so busy railing about what’s wrong with people’s understanding of knowledge that I haven’t spent ample time discussing its benefits, which I’ll do in the beginning. Though, as is characteristic of my other works, my viewpoint will be more ideological than practical, my next objective is to stretch my analyses a little further into larger societal issues relating to learning. Given how poor most people’s understanding of knowledge is, the following will inevitably focus on failures in comprehension and application and their respective causes. But don’t underestimate the educational benefits of negative examples; most of extralogical reasoning’s understanding of knowledge comes from them.
Most of the following has been discussed in other posts, but if nothing else, it will be a good review for those who haven’t read any of my epistemological posts in a while. One important issue that won’t be discussed in detail is universal flaws in human thinking. Since extralogical reasoning (ER) posits that its existence and modes of thinking are necessary to compensate for them, the following finds much support in the third portion of the intro.
Knowledge can inform your or someone else’s actions directly by providing specific solutions/answers and indirectly through understanding/intuition, allowing you to improvise and innovate solutions/answers. Gaining knowledge informs your knowledge of knowledge and understanding of understanding—your WISDOM. This can further inform your or someone else’s actions by improving your understanding of decision-making and learning in general.
People’s overall learning is increased by being wrong about fewer things and identifying what they DON’T know; much of their ability to use knowledge and make decisions is about NOT MISUSING knowledge and NOT making WRONG decisions. When you learn one thing is right, you necessarily learn that at least one thing is wrong. One or more of these are likely something you yourself were wrong about or could have been wrong about. The presentation of much knowledge is accompanied by how the discoverers disproved previous theories that were often premised on fallacious reasoning. As mentioned in other posts, ninety-nine percent of the true history of science, for instance, is the story of geniuses who spent their entire lives developing scientific theories that ultimately proved to be ninety-nine percent wrong. This teaches you about your weaknesses and vulnerabilities, including your emotions and a host of universal cognitive flaws (see part three of the intro). Though it’s foolish to ignore your instincts and gut reactions, especially when making tricky decisions, it’s just as foolish to think they’re infallible.
Learning truths and falsehoods and how human flaws can mislead decreases your vulnerability to the machinations of the people, institutions, and dogmas that threaten your thinking and beliefs. Being more familiar with the fallibility of your gut reactions also informs you that just because something or someone provokes an undesirable gut reaction doesn’t necessarily mean it’s inherently “bad”--“different” doesn’t always mean “bad” or “less.” This COULD be partially responsible for decreases in bigotry.
In sum, while few people sufficiently understand and believe in science and humans will always be human, their thinking has become much LESS unscientific, making them less vulnerable to corrupt leadership and leading to a better (or LESS BAD) world—or so people thought.
Finally, learning is healthy and can bring great satisfaction. Something tends to be a more effective means to an end when it’s also an end itself. An Inversion is when a means to an end becomes an end itself (from Bish’s life engineering). Learning was central to human evolution, and it’s unlikely if not impossible that evolution would select for a trait or ability without a corresponding inclination to ACT upon it. Thus, people are programmed to want to learn. The “power process” is another evolutionary Inversion: the need to demonstrate to oneself that he/she is good at surviving (see the works of Ted Kaczynski). Just as wanting to be good at math in general is useful if you want to do well in a given math course, wanting to be good at surviving in general is obviously advantageous if you want to survive your current circumstances. Today, since survival is all but guaranteed, people pursue the power process through “surrogate activities.” Learning is a common and potentially effective surrogate activity.
However, two of the most important lessons of life:
One, no matter how good your resources and nominal plans, objectives, and intentions, it’s unwise to believe there’s such a thing as a guaranteed benefit or guarantee of success. Put another way, it’s profoundly stupid to believe that something can’t be poorly pursued just because it’s admirable in itself. Two, there’s no such thing as a good decision or conclusion without knowing the context in which it was made. People have different interests, strengths, weaknesses, resources, values, and responsibilities. Life is not so simple that everyone can go around calling things “good” and “bad” and straightforwardly basing all their thinking and decisions on those labels. This is an all-else-NEVER-being-equal world. This viewpoint is for LEARNING, not direct application to real-life circumstances. Wisdom is necessary BECAUSE of how much of a simplification it is. Life’s complicated by the fact that everyone is different--especially since they’re also all the same. Don’t let society’s heavily survivorship-biased dogmas mislead you.
Arguably the most important scientific discovery of the past several decades is that people’s beliefs play a greater role in how they EXPLAIN their decisions than in DETERMINING them. People greatly underestimate the role psychology plays in determining one’s actions and general perspectives. This, in addition to the fact that it tends to be much easier to gain knowledge than understanding and proficiency, makes it much easier to change someone’s intellectual beliefs than their WORKING beliefs. Most academics and proponents of learning would on some level acknowledge this; but just as the research would have predicted, this is not consistent with their own working beliefs (this paragraph is heavily supported by part three of the intro).
Sadly, too few have sufficiently learned these lessons.
So beware: since all knowledge can be misused, it can increase your knowledge of WRONG solutions, potentially leading to poorer actions. Since anything can be misunderstood, attempts at learning can lead to mislearning: either “misknowledge” or, even if the piece of information is understood in itself, false assumptions and, thus, misunderstanding.
Knowing means knowing the facts; understanding means knowing how the facts fit in with each other; proficiency means knowing what to do with them, which often requires knowing how those facts fit in with other facts; and wisdom means knowing what all these things are and how they, in turn, fit together. Well, if you know facts but you don’t have proportional knowledge of how they fit in with each other, you don’t have proportional knowledge of what to do with them, and you don’t know what these things are in general--you have a self-inconsistent understanding of something that is naturally prone to misuse, especially if knowing the facts gives you an artificial sense of confidence. There’s nothing unusual about a person incapable of learning one fact without falsely extrapolating another three. Hubris is more dangerous than ignorance. And the fact is, very few people care about wisdom and understanding. There’s a difference, after all, between learning and information addiction.
As said, learning is a joyful and healthy experience, one recommended by extralogical reasoning, but this can be a pernicious trap if certain things aren’t sufficiently recognized and treated for what they are. One consequence is an excessive attachment to certain ideas, “pet explanations,” causing people to apply them without due consideration of their applicability to specific situations; and this, in turn, can lead to an excessive attachment to preconceived ideas IN GENERAL. A belief is by definition a prejudice and a prejudice by definition a bias. Extralogical reasoning is making significant efforts to expose the unspoken fallacy of what it calls the “all-applicability” of knowledge: the idea that there’s little danger in unconsciously ADOPTING the belief that a piece of knowledge is applicable to every situation to which it is HYPOTHETICALLY applicable. Any piece of knowledge can be misused, and it’s easier to forget that than generally assumed. This doesn’t mean a person can’t get a significant net gain from a piece of knowledge, of course--but not if the resulting biases aren’t kept in check.
Even if you could argue that this isn’t always the case, that some concepts can be applied without much thought, it is still unwise to think otherwise. After all, if something is so useful and you understand it so well, you shouldn’t have any difficulty applying it. On the contrary, if you like the concept so much and you’ve TRULY comprehended it, it’s usually harder to NOT apply it. This might not always seem so because people often fail to truly learn the lessons they’re exposed to. Many people are trained in certain concepts but don’t appreciate them enough to apply and/or remember them. Ultimately, people are designed to over-rely on preconceived ideas (see the intro to extralogical reasoning); the human thinking organ doesn’t need any more help to act on its favorite concepts.
A contender for the greatest flaw in thinking and learning is the idea that principles should only reflect the truth, or what’s OSTENSIBLY most important. Rather, they should reflect what requires the MOST EMPHASIS; and to this end, one must account for psychological factors and potential weaknesses in thinking, individual and universal. Your use and management of specific and general knowledge, for instance, should optimize your exploitation of them, not simply reflect how "good" or beneficial they are at face value; and this must account for factors that might hinder your exploitation.
You can’t cultivate a type of learning without cultivating a type of MINDSET. The theory-oriented mindset usually doesn’t lend itself well toward practical things. It’s important to recognize the difference between the decision-altering and explanatory powers of a piece of knowledge—or how its practical benefits compare to its ability to explain things. Although suggesting anything with explanatory power has at least some direct or indirect practical benefit might work as a loose rule-of-thumb—it ONLY works as a rule-of-thumb. Explanatory and decision-altering power are easily confused, especially with concepts you take great pride in understanding. Theory-oriented people focus too much on the WHY rather than the WHAT WORKS. For example, learning factoids about neurology and neurochemistry beyond what’s necessary to understand basic biology distracts from the learning and application of more enlightening and useful concepts like those found in extralogical reasoning or cognitive psychology (the word “dopamine” has come to elicit suicidal ideations in the author).
As Yogi Berra once said, “In theory, theory and practice are the same, but in practice, they’re different.” I’m a philosopher; I should know.
Knowledge idolatry is rampant. Make no mistake: Idolatry is not respecting, admiring, treating as important, nor is it optimism or idealism—but blind worship. Regardless of something’s potential benefits, if you worship it blindly, you necessarily condition yourself and others to have a poorer understanding of it. One would think that if something is so wonderful and useful, you wouldn’t want to compromise your understanding. Idolatry is an enemy of learning: It distracts from learning; it can lead to dogmatic learning and over-reliance (i.e., blind zealotry); and the acts of raving and worshiping are often confused with the act of acting or effectively acting on the lessons that do manage to get learned.
To reiterate, it’s not just about the specific concepts, but one’s GENERAL thinking and how he/she CONDITIONS it. Regardless of how much wisdom someone has, if they’re a serious general learner, their learning is going to be dominated by fact-based knowledge. Wisdom and critical thinking skills are less abundant, less tangible, and harder to communicate. Thus, worshiping learning indiscriminately inevitably leads to the cultivation of a working reasoning system that devalues wisdom and context-based thinking and decision-making.
The stereotypical knowledge idolator cares little about mundane specifics. When giving advice, he/she will make a beeline for whatever preconceived ideas seem to explain the situation at hand, and if they can’t match the baseline facts with one of their pet explanations, they’ll tend to default to simple-minded cliches. However mundane, mundane specifics are no less important, and knowledge idolatry obscures their importance. And advisors should not just consider mundane specifics relevant to the situation at hand: They can provide information and context for future advice, and your interest in them can gain you credibility with the advisee.
The problem with what knowledge idolators say about learning isn’t so much WHAT they say so much as HOW they say it and what they DON’T say and, therefore, what they IMPLY. Knowing the wrong combination of important pieces of information about something can be highly misleading. It’s very easy to mistake a piece of knowledge for the whole picture—even if that piece of knowledge is over half the picture.
Desire for knowledge is often social—to show off, make conversation, make friends, etc. Extralogical reasoning accepts that people—ALL people—have these sorts of desires, and it has no ambition to completely eradicate them. But once again, if you want to be good at using it, these things must be sufficiently recognized and treated for what they are. And if you don’t care about applying your own learning, at least don’t pretend otherwise and miseducate other people.
Knowledge idolators tend to be quite bent on having opinions, even when they’re potentially harmful. In fact, it’s quite common for a person to be more interested in having an opinion than being correct. Do I think opinions are “bad”? Of course not. I respect people who are SMART about their beliefs, but part of being smart about your beliefs is knowing what NOT, or not YET, to have an opinion on--and that opinions, unlike decisions, are usually OPTIONAL.
An opinion you don’t understand isn’t an opinion; it’s a dogma, assumption, or, at best, a speculation. Coming to conclusions prematurely can unnecessarily bias you, making you prone to twist facts to suit explanations rather than vice versa. A design flaw of the human thinking organ is that it doesn’t actively distinguish between WHAT it observes and how it INTERPRETS what it observes. Left to its own devices, without your deliberate intervention, it will blur observation and interpretation together. This obfuscates much of the above and makes people prone to jump to conclusions. Extralogical reasoning asserts that being particular about the opinions you choose to take on lessens your tendency to jump to conclusions and improves your ability to identify your assumptions.
Similarly, extralogical reasoning asserts that it’s wise to think wrong beliefs are more dangerous than correct beliefs are beneficial. Though right answers are obviously desirable, they’re less necessary and harder to find than generally believed, as reality isn’t as readily explained as most think. The illusion is attributable to the causation bias, the natural human tendency to be way too quick to assume that the relationship between cause and effect will be ascertainable and satisfying.
The causation bias and related phenomena are consequences of evolving in a simpler environment with fewer variables, and they are responsible for numerous misconceptions about the modern World. Science and engineering are exceptional in that it’s easy to test one’s understanding and there’s less need to account for unknowns, making them unusually self-corrective. It’s the availability of rigorous testing that makes science and engineering difficult, not that they’re complicated or unnatural modes of thinking. Don’t let the success of science and its idolatrous admirers fool you: Without self-correctivity and a social structure to test, motivate, and manage them, the smart and learned don’t enjoy half the advantages they claim to. It’s not that extralogical reasoners have the answers so much as they’re aware of and know how to compensate for the lack of them.
While never seeking explanations for life-related matters would be tantamount to having a policy against problem-solving, don’t let yourself pay an unnecessary price for lack of answers. People suffering from mental illness and their doctors, for example, often twist facts to suit diagnoses, leading to other problems. Extralogical reasoning wouldn’t be necessary if the scientific method were the only set of epistemic principles that mattered, but ignoring it in life is at least almost as stupid as ignoring it in science.
Moreover, don’t confuse opinions and decisions. The thinking behind them has much in common, but, unlike beliefs, decisions are mandatory; understanding, though obviously desirable, is not always possible; and there could be serious consequences if you’re wrong. In some cases, dogmas, cheap heuristics, and gut feelings might be wonderful relative to the alternatives.
As you can see, extralogical reasoning considers knowledge and understanding more important than opinions. There’s no shortage of things to be opinionated about, and if you’re so bent on having an opinion on a given issue, do your fuckin’ homework. And if you aren’t willing to do that, well then it’s evidently not terribly important to you. However inclined people may be to acquire opinions before they have sufficient knowledge and understanding, even in the best of cases, it’s still not something to be encouraged. Just because a problem can’t be cured doesn’t necessarily mean it can’t be lessened, compensated for, or managed—and it can certainly be made WORSE. And ultimately, the only standard in life that really matters is how you compare to your own personal potential, not to other people.
School, especially at the middle school and high school levels, and the voting system solicit the opinions of ill-informed people to “engage” them, which has heavily dumbed down socio-political thinking and aggravated other societal issues. The basic idea is that opinions catalyze learning, participation, and constructive debate. Greater democratic participation gives society a better sense of what the People want, and more competition between contrasting views can, in theory, increase the self-correctivity of society’s beliefs. But the latter won’t make much difference if the additional participants aren’t doing their homework and are talking out of their asses, and dumbing down society’s socio-political ideologies and their epistemic foundations inevitably corrupts and dumbs down the thinking of the country’s leadership.
Middle and high schools tacitly encourage dumbed-down thinking. Large-scale socio-political events don’t happen in an isolated universe; they happen in a large, complex, and dynamic world--which, in turn, has a large, complex, and dynamic history. Naturally, it’s wise to assume that understanding such events has additional prerequisites: familiarity with economics, the theory of government, the legal system, the relevant histories, etc.—few of which students have. They may not even have enough of the baseline facts. All this promotes reductive, assumptive, and hubristic thinking. Critical thinking skills are more important than knowledge, and hubris is more dangerous than ignorance.
The proper priority is wisdom, understanding, knowledge, then opinion. Even if there are some legitimate justifications, academia too often has this ass-backwards. A more specific consequence of the mis-prioritization of learning, lack of independent learning, and the relevant idolatries is the exacerbation of the availability heuristic bias: the natural human tendency to overestimate the importance of the information one has about something and vastly underestimate the potential significance of the information one doesn’t have. A variation of this bias is what I call the quantity heuristic bias. A small percentage of a sufficiently large quantity is still a large quantity. This has reinforced a misguided belief in dirigisme, the central planning of economies (or portions of them) like that found in socialist countries. The amount of information in economies is astronomical, too large for any arbitrarily large central body to possess; but because the information such a body does have IS so large, it tends to lead to a gross overestimation of its percentage of the TOTAL. Though some government oversight is obviously necessary, only the economies themselves have enough information to be their primary overseers. But real-life economies don’t always have ascertainable answers and explanations, whereas school’s models do; and economies aren’t idolized, whereas academics and their knowledge are.
As stated, information addiction should not be confused with true learning—and the pursuit of the former is not just a questionable use of one’s time. Excessive immersion in data and information impairs a person’s ability to distinguish “signal from noise”—or their understanding of what’s important and what isn’t. It’s impossible to receive a piece of information without having an impulsive interpretation of it, as mentioned (see the works of Nassim Taleb). When receiving information in relatively small quantities, the upside usually outweighs whatever biasing effect may result—but not when it’s received in enormous quantities, especially when the information is speculative, contradictory, poorly confirmed, sensationalized, and cherry-picked to appeal to the prejudices of the masses.
Though there may be some ways that school encourages critical thinking, academia still treats knowledge as more important than wisdom—a direct violation of the axioms of extralogical reasoning. To some extent, this is unavoidable, as knowledge exists in greater abundance and is easier to teach; and unfortunately, people are more influenced by what they’re CONDITIONED to believe or do than by what they’re ostensibly taught, perpetuating the misconceptions. Whether or not this is entirely the fault of academics themselves, the misconceptions persist unchecked. Nowhere is knowledge idolatry more inappropriate than in the mental health field, where life-related critical thinking skills are crucial. There are many reasons why psychiatric diagnoses exist—some may be good, some bad, some merely inevitable—but they remain the epitome of pet explanations, leading to a great deal of circular analyses. Among the absent epistemic principles is the application of the universal flaws in human thinking to the DOCTORS’ thinking, despite their discovery by related fields.
Academia’s preference for “well-rounded” students over the past few decades is a manifestation and promulgation of knowledge idolatry. The well-roundedness they THINK they’re referring to isn’t broad learning so much as BROAD WISDOM. In other words, this doesn’t come from the CONTENT of learning so much as the comprehension of the spirit or philosophical essence of knowledge, thinking, and decision-making. But while school and many other things can HELP you acquire wisdom—enhance it, accelerate the process, provide essential corrections to one’s knowledge, etc.—this is still something you must discover for yourself. This requires independent or autodidactic learning, learning outside the confines and constraints of taking courses.
If true learning comes from knowing how things FIT TOGETHER, then one must understand things from different points of view. I don’t know if I’ve ever met a person who’s good with computers who isn’t primarily self-taught. Limitedly supervised inquiries are the purest and least biased way to explore a topic. The beauty of learning math and physics is not in knowing how to manipulate equations to get right answers, but that it’s a means of attaining an absolute understanding of something from first principles—much of which you discover for yourself.
Thinking about things from different points of view, such as doing math and physics problems different ways, can catalyze what Bishko called rampant self-doubt. According to Bishko’s life engineering, since beliefs, including unconscious ones, come in packages, it’s often difficult to truly question your understanding of one thing in a given area unless you’re able to question the entire package. But merely trying to do so can usually only get you so far. One must be put into a state where they FEEL doubt, like they can’t HELP but question their beliefs. In math and physics, a great catalyst is what I call “disproofs”—why WON’T this method work? how is this true but that is not? The best opportunity for learning occurs when confusion arises after a learner encounters multiple concepts or facts that APPEAR entirely at odds with each other but are, in fact, entirely consistent—that is, if one truly understands how the facts fit in with each other. The more of these situations that are remedied, the better you understand a topic.
While the pursuit of rampant self-doubt and discovery-based learning can certainly occur during the semester, students must inhibit themselves to avoid and/or minimize debilitating ruts, fostering a general aversion to rampant self-doubt. Moreover, school must have homework assignments with ascertainable answers and explanations provided by the course itself, instilling an oversimplified sense of reality and discouraging the use and development of one’s intuitions (I admit this oversimplifies school itself a little bit). Thus, an over-reliance on school-learning can be intellectually emasculating.
This is not to say autodidactic learning is infallible--far from it; it’s a slower and less reliable learning process that can lead to pernicious holes in a learner’s knowledge. And though much rarer, it, too, can be over-relied upon, as evidenced by Bishko himself (and an indirect cause of much of his friends’ irritation). The point is that ALL forms of learning are necessary for one to reach their potential, and academic idolatry and people’s partially resultant lack of autodidactic learning are among the reasons why most people don’t.
I’ve found that few proponents of learning, especially academic ones, appreciate how great a role APPRECIATION plays in determining the benefits of learning. Every year and at every grade level, students demonstrate that one need not appreciate a subject to get an “A,” but when it comes to whether the material affects other areas of their thinking and whether they remember it in the months or years afterwards, appreciation becomes much more important. Once again, something tends to be a more effective means to an end when it’s also an end in itself. Having requirements to make people more “well-rounded” cultivates misconceptions about learning. Although there may be other factors involved, like the need to ensure sufficient participation to keep academic departments active, a school COULD make courses AVAILABLE for those who are interested without making them REQUIRED—and, in doing so, avoid promoting knowledge idolatry and engendering the dogmas and fallacies that have allowed it to flourish (however, majors in those departments would take more courses in them, which would at least partially make up for it).
Despite all the science education people receive in school, few understand science and even fewer apply it to life. Despite all the history courses people take, few people know much of history or its nature, either. Why? Because they don’t appreciate them or the nature of knowledge itself.
There’s a big difference between obscurantism—that is, the suppression of knowledge--and throwing it in people’s faces without qualification or explanation. The former is immoral, and extreme ignorance is dangerous and would, indeed, lead to a much less productive society. But diminishing returns are realized much sooner than most think. Helpful though it may be, there were plenty of opportunities for learning outside of school before the Internet. Learning is healthy and there’s nothing wrong with taking pride and pleasure in it, but those feelings along with social factors appear to have created the illusion that its practical merits are greater and more guaranteed than the reality.
History often credits trends to calculated measures when the trends would have happened on their own. It’s the old post hoc fallacy: Just because A preceded B doesn’t necessarily mean it caused B (although when it involves politicians, it might sometimes be better described as the post hoc LIE). The causation bias, once again, is the natural human tendency to be way too quick to assume the relationship between cause and effect will be ascertainable and satisfying. With the causation bias come “causational defaults”: God, the government and government executives, parents, academia, and measures created for the nominal purpose of reaching certain ends, etc. How did this seeming miracle happen? God. Why is this kid so screwed up? He has bad parents. Why is the economy in shambles? Poor governance. Why are people less racist? Civil rights movements, government initiatives, and education have made them less ignorant in general. What is chiefly responsible for technological innovations? Academia and formal education.
Obviously, these are possibilities (except God, who I don’t think directly interferes in the World), and in many cases, they are at least partial answers. But because of the ensnaring effects of the causation bias, causational defaults, and the like, extralogical reasoning warns to be wary of explanations with “big shining lights on them.” Are the reductions in racism just the result of increases in education, government initiatives, and civil rights movements, or would enough contact with other races alone cause people to see they aren’t as inferior as once thought? Probably less of the former causes than most would prefer to believe, and there is evidence that rises and falls in racial equality don’t coincide with civil rights movements and related governmental measures as much as usually assumed (see the works of Thomas Sowell). Are academia and formal education primarily responsible for technological innovations? Probably not--industry has played a much greater role than generally assumed in BOTH technology and science (see the works of Nassim Taleb). Is science education fully responsible for reductions in superstitious thinking, or is at least some of it due to a lower death rate and less anxiety over death, which tends to engender superstitious and religious thinking? People in those days were more acclimated to death, yes, but that adjustment is unlikely to have fully compensated for it.
I could invoke the information age’s role in fake news and the rise of Trump--but I shouldn’t need to. None of the philosophy espoused in this post—or blog, for that matter—is especially sophisticated; it’s obvious to any educated person who isn’t too blinded by idolizing knowledge and intellectualism to objectively contemplate them. Extralogical reasoning respects people who are SMART about their beliefs. Being passionate about one’s beliefs is often smart, but, as mentioned in the beginning, it’s profoundly stupid to assume something can’t be poorly pursued simply because it’s admirable.
Just as people often want to have beliefs more than they want to be correct, passion is often an end in itself. Once again, IN ITSELF, this is okay--but not if it unduly impairs your pursuit of the thing you’re passionate about. Even if the gratification and social advantages of being passionate about learning are the most important things in your life, there’s still any number of ways your pursuit could go wrong. Knowledge idolatry, for example, is only a better means of showing off from the “short-sighted view.” If you’re willing to put in a little extra work and patience, true learning is a better means of showing off, too, at least to anyone worthy of impressing, including the better of the knowledge idolators. There are other objectives that can be impaired, as well. Nobody has only one priority, and rarely does anyone do anything for a single reason.
Similarly, extralogical reasoning has no ambition to eradicate people’s desire to put faith in something greater or outside of oneself. Knowledge idolatry may serve this purpose, but extralogical reasoning is at least a partial alternative, especially if combined with extralogical Deism (posts on this forthcoming). And if you place your passion and gratification in your beliefs ABOUT beliefs, your knowledge OF knowledge, your understanding OF understanding—it becomes easier to be objective about everything else.
ER asserts that if you’re going to be passionate about something, do it the courtesy of pursuing it wisely. But doing so in this case requires an understanding that threatens the social and sentimental benefits of worshiping knowledge. If you can look past the differences in rhetoric, you’ll see only ER truly appreciates learning. Some knowledge idolators may want to have FAITH in knowledge more than understanding, a hallmark of idolatry. Though faith in the general sense is not necessarily ill-advised or mutually exclusive with understanding, faith requires belief in something tangible and belief despite a less than complete understanding. Wisdom and critical thinking lack tangibility, and in ER’s relentless pursuit to understand learning, it seeks to minimize any incompleteness in understanding, compromising its religious benefits. And it sure as shit doesn’t have the social currency and glamor enjoyed by knowledge idolatry. The author reserves his faith for something that he’s made useful in spite of being incapable of understanding it: God. But Deism’s beliefs about God aren’t bold enough to risk being especially WRONG, either. I’ve only added that the Universe is God’s experiment in self-organization, to see what phenomena would arise from entities doing nothing other than obeying certain unyielding laws. Unlike with knowledge idolators and knowledge, there isn’t any understanding that’s sacrificed.
Sadly, you can’t win credibility and propagate your ideology among the masses without packaging your beliefs into rhetorical dogmas—e.g., “knowledge is power.” In the packaging, the words “supported by wisdom and understanding” were redacted for conciseness. To some extent, you have a binary choice: Be a bona fide intellectual or seek gratification and social viability. I choose the former.