Advice to not think things through can be harmful. Here’s why.
In a dark basement, illuminated solely by the light of a laptop, sits the overthinker. His goal is malicious; he seeks to ruin your buzz, man, and he is a self-styled “philosopher,” wasting our time with navel-gazing, unanswerable questions like “what is the meaning of life?” He argues semantics on the internet, demanding that undefinable words be defined before actually addressing any arguments. Internet debate is a game of throwing poorly-thought-out opinions at each other, and, like a coward, he refuses to commit to a side.
The overthinker thinks deeply, but not clearly. The solution: we should stay away from harmful thinking, especially when it produces no immediate, concrete benefits to our personal lives, and fight our natural programming to think too much.
That’s the motivation behind a recent piece by Zat Rana, entitled “The Philosopher’s Problem: When and Why Thinking Can Be Harmful”. Unfortunately, Rana’s piece is written in a way that can be interpreted as crossing the line into anti-intellectualism. I’m going to argue that broadly urging people to think less, especially when that urging is done incautiously, is enormously harmful. If YouTube and Facebook comments on political articles are any indication, two things are obvious: (1) generally speaking, most people are not “programmed to think deeply”, and (2) our society is desperately in need of people to think more, and encouragement to do so, even when it leads to thinking that some might consider unproductive.
Rana’s advice is to distinguish between “deep” and “clear” thinking, quoting Nikola Tesla:
“One must be sane to think clearly, but one can think deeply and be quite insane.”
Absolutely, Mr. Tesla. Who hasn’t heard of the deep thinker who goes off on unchecked tangents and ends up believing the entire government is out to get him? We’ve all seen A Beautiful Mind. We all know that intelligent people, especially when working outside of their fields of expertise, can say some pretty unintelligent things.
When it gets in the way of time-sensitive action, overthinking can indeed be harmful. Buridan’s ass comes to mind — the hilarious example of a donkey, exactly equidistant from two bales of hay, who has no obvious way to decide between them. As a result, it starves to death. (What an ass.) Now, this sort of example has been used toward multiple ends, but the moral we’ll derive from it here is that sometimes, you just need to stop thinking and make a choice.
Again, absolutely agreed. If you’re a commander of a submarine that’s being attacked, you don’t have time to think about all of the motivations of the attackers, or the emotional states of your subordinates. You need to take action to defend yourself and fight back.
But honestly, how often are any of us in situations where hesitation can cost lives? Everything is so fast-paced these days, it may seem like you absolutely have to write that Reddit comment now, but nobody is going to die if you take a few extra minutes to really think about what you’re writing (or at the very least, a few extra seconds to make sure you used “your” and “you’re” correctly).
Pulling back from extreme examples, there is another instance where overthinking can be harmful: when it causes overwhelming distress to the thinker or those around them. This may be a subclass of “time-sensitive action” scenarios, but it deserves mention nevertheless. I’ve been in this situation many times myself. Overthinking in relationships, for example, can be harmful. Particularly when the topic is of significant emotional importance to you, it can be very helpful to step away from it for a while. Meditation, particularly mindfulness meditation, is often good advice in such cases.
However, in these cases, I would argue that what’s going on is not actually something that’s usefully described as ‘thinking.’ Instead, it’s often better described as ‘rumination,’ ‘worrying,’ or other activities that should be distinguished from what philosophers do on a daily basis.
Furthermore, sometimes we need to push through uncomfortable thoughts and tolerate discomfort. Difficult questions require difficult thinking to get through. Think of students trying to learn how to do long division for the first time. If they gave up every time they were faced with any discomfort, they would never learn anything complex. So there are two types of scenarios here: Scenario A, where the amount of discomfort caused by thinking is unlikely to lead to anything productive, or is likely to lead to overwhelming distress or anxiety, and scenario B, where the discomfort is a necessary evil.
That’s where Tesla’s quote is most useful. Clear thinking is, indeed, preferable to deep thinking — but only when that deep thinking is unnecessarily deep. And when we’re in heightened emotional states, our ability to think clearly is impaired. But none of that is of any use if we can’t tell the difference while we’re in the midst of thinking. How can we ever know whether we are in scenario A or B?
One practical tip is to ask a trusted friend, who is not emotionally invested in the topic that is troubling you, to help verify your reasoning. At times like this, we may be tempted to ask those who tend to have the same beliefs as we do. While such friends may be able to empathize with you more, it’s also important to ask those who might offer different, even unsympathetic perspectives. Otherwise, we risk invoking confirmation bias.
Overthink more often
So there are, indeed, scenarios where overthinking things can be harmful. But what we have to realize is, scenario A is much less common than scenario B. And this is where articles like Rana’s cross the line into being harmful. Without specific, actionable guidelines for when it is advisable to stop thinking, they breed an anti-intellectualism of which our society already has too much. Again, I am not denying that there are situations where action must be prioritized over thought, or that there are people who suffer from very real anxiety when made to think about certain topics. I am arguing that the balance between action and thinking, in the general public, is shifted too far in favor of the former.
If we want to make our ideas clear, perhaps a good place to start is a piece called “How to Make Our Ideas Clear,” by C.S. Peirce. Here, Peirce introduces what came to be called the ‘pragmatic maxim’:
It appears, then, that the rule for attaining the third grade of clearness of apprehension is as follows: Consider what effects, that might conceivably have practical bearings, we conceive the object of our conception to have. Then, our conception of these effects is the whole of our conception of the object.
Although this maxim can be interpreted in different ways, one reading is characteristically consequentialist: if we want to be clear about something, then we should only concern ourselves with things that actually have consequences. “A-ha,” says the scientist, “I’ve got you there! Silly philosophers spend too much time on things that are of no consequence.”
But how do we know what those consequences are in the first place, particularly when we don’t have the luxury of performing experiments? The answer, unfortunately, is that someone has to think about it. The decision to think about whether a certain type of thinking is worthwhile is itself the kind of deep thinking that may or may not produce results — the “loops of thought” that Rana says we fall into in “day to day life”, and the deep thinking he says we are “programmed” to engage in.
But that’s wrong: we are not, for the most part, programmed to engage in deep thought. It’s more accurate to say that we’re “programmed” to stop once we find arguments we agree with. Many of us already know about confirmation bias, but a body of scientific literature is emerging around the closely related myside bias. As Mercier and Sperber argue in their recent book “The Enigma of Reason” (which I highly recommend reading), the fact that we reason so poorly is a feature, not a bug: it is a product of our evolution as social creatures, and when we engage in a specific type of group reasoning, we tend to produce better results.
Some might see Mercier and Sperber’s work as a justification to stop thinking so much. If our natural ability to reason is flawed, why should we keep using it? Instead, what we should take away from their work is this: if you carry out facile reasoning and reach a conclusion that you are happy with, you shouldn’t stop there. Your satisfaction with that conclusion may just be the myside bias at work. Instead, you should constantly try to find counterarguments and evaluate your position in light of them. In other words, think more.
The solution to widespread myside bias is not to discourage thinking. The solution is to encourage thinking, even when it’s not clear whether that thinking will produce immediately gratifying results that are beneficial to our daily lives — and particularly the kind of thinking that helps us break free of these biases and produce better, truth-oriented argumentation.
“Oh, you must be fun at parties”
Okay, what about this stereotype: You’re having a fun, lighthearted conversation about a guilty-pleasure show you enjoy watching (full disclosure: I watched Jersey Shore). Then, out of the shadows, emerges the overthinker. Like a moth to a flame, he senses your joy and vows to stop it, seeing himself as a hero who will bring truth to the masses.
So overthinking and being right is bad, right? Actually, the takeaway here is not that the overthinker thought too much. It’s that he didn’t think enough. If he had, he might have come to the conclusion that he can believe Jersey Shore is somewhat…less than intellectual, but it does not follow that interrupting the conversation was necessary, appropriate, or productive. It’s entirely possible, in almost every scenario, to hold different beliefs from a group of people and still not be a jerk to them. You don’t always need to correct others.
Again, how does one know the difference? Here are some quick rules of thumb, inspired by Peirce’s maxim. Ask yourself what your goal is in taking this action. Perhaps it is to change their minds. Then ask: will this action actually achieve that goal, or am I just doing it for some sense of self-satisfaction? Walking into the middle of their conversation, guns a-blazin’, may not actually win their hearts and minds. And finally, is this goal, weighed against its chances of success via this action, worth the cost to achieve it? The overthinker’s mistake is that he failed to consider all of the consequences of his action: he might indeed change their minds about Jersey Shore, but he might also be disinvited from the next house party. Is it worth it?
To conclude, now for some self-reflection: Is this article itself an instance of an overthinker trying to ruin someone’s fun? Why couldn’t I just let Rana’s article exist without writing this monstrosity?
The answer is simple. I believe that shaming overthinkers, and the anti-intellectualism behind it, is extremely harmful and one of the root causes of many of the problems we face today: the current state of American politics, the underfunding of public education, and countless others. So, even though I may get some hate for this, the point is worth making. And in the interest of overthinking, I encourage further discussion on this topic, especially if it is in the form of sound arguments that force me to revise my view.