You oughta know | the shortened version
A shortened version of a paper forthcoming in Episteme (an epistemology journal)
Please like, share, comment, and subscribe. It helps grow the newsletter and podcast without a financial contribution on your part. Anything is very much appreciated. And thank you, as always, for reading and listening.
It is difficult to get a man to understand something when his salary depends upon his not understanding it.
—Upton Sinclair
The Bible is very easy to understand. But we Christians are a bunch of scheming swindlers. We pretend to be unable to understand it because we know very well that the minute we understand, we are obliged to act accordingly.
—Søren Kierkegaard
Some people act immorally and know it. Others know what morality requires and manage to follow through. But there's a third group — the focus of this essay — who suspect that learning more would undercut their conscience. So they stay in the dark. Not because they're indifferent, but because they know themselves too well. They don't want to know what they suspect would obligate them to change their behavior and accept that their past actions were wrong. Philosophers and psychologists call this (a kind of) strategic ignorance.
This post distills the main argument of a forthcoming research article of mine1: there are situations where individuals are morally obliged to acquire knowledge. Not simply for the virtue of being better informed, but because gaining that knowledge would morally motivate them to act better, and acting better would be costly. In such cases, they know the salient knowledge would be motivating in just this way; that awareness is partly the source of their culpability, as we'll discuss in a bit. Let's call this possibility: morally mandatory knowledge.
The Ethics of Avoidance
Consider Erica. She is fond of her designer heels — made by Jimmy Choo — but suspects that unethical sweatshops are involved in producing them. She avoids the articles, tosses the pamphlets, and changes the subject when friends bring it up. She's not ignorant by accident. She's strategically ignorant. Why? She (rightly) suspects that if she really learned what went into making her favorite shoes, she'd feel obligated to stop buying them and to recognize the role her financial support plays.
Erica isn't a sociopath. She's not indifferent. In fact, it is precisely because she has moral sensitivities that she wants to remain ignorant. She is trying to avoid the internal discomfort — the cognitive dissonance and subjective ambivalence (Connor & Armitage 2008) — that results from knowing too much while doing too little.
Erica is not alone. Studies show that many consumers — especially in affluent societies — avoid learning about things like meat production or labor exploitation precisely because they worry that such knowledge will compel costly moral change on their part (Onwezen & van der Weele 2016). These 'caring consumers' are distinct from indifferent ones. They're sensitive to the suffering they cause. And that's the problem. Because they're so sensitive, they know that certain facts could be morally transformative, at a cost in time, money, and convenience. So they look away whenever they suspect they might become aware.
Two Conditions for Moral Obligation
The heart of the argument is simple. Some people have a moral obligation to acquire certain knowledge, even if they haven’t acquired it yet. But that obligation only arises under two conditions:
Awareness Condition (AC) – The person suspects (correctly or reasonably) that acquiring the knowledge would psychologically motivate them to act morally.
Control Condition (CC) – The person has access to the knowledge. They're capable of getting it — even if it requires effort or self-overcoming. And there is no genuine consideration, moral or practical, that overrides the obligation.
If both conditions are met, strategic ignorance isn't just a passive failing. It becomes a form of moral negligence: lacking knowledge one has a moral obligation to acquire, and refusing to confront what one ought to know precisely because it would motivate moral action.
Take Winston. He enjoys meat but suspects that eating animals is wrong. He avoids vegan friends. Skips documentaries. Doesn’t want to hear about factory farming. He’s psychologically constituted such that if he let the arguments in, he’d probably stop eating meat. So he doesn’t let them in.
Winston satisfies both AC and CC. He’s aware that learning more would change him. And he has access to the relevant moral knowledge. That’s enough to say he has a moral obligation to investigate. His ignorance isn’t neutral. It is self-serving and morally indefensible. There are many such examples—choose your own!
How Much Can You Know and Still Deny?
One challenge here — the too much knowledge objection — is that people like Erica and Winston seem to know enough to suspect their actions are wrong, and enough to avoid the facts. But doesn’t that undermine their plausible deniability? Doesn’t it make them culpable already, even if they lack full knowledge?
Not quite. Strategic ignorance is an epistemic and psychological gray zone — one where a person suspects just enough to avoid full commitment. Simler and Hanson (2018) call this the ‘sweet spot’ of self-deception: people can harbor suspicions without fully confronting them, especially when further investigation would create costly obligations.
This is what makes strategic ignorance morally sneaky. It preserves just enough distance from knowledge to maintain plausible deniability — both to others and to oneself. But if the person has both access and awareness, then the ignorance is chosen — and choosing ignorance, when knowledge would morally motivate you, is blameworthy.
Why We Protect Our Ignorance
Why are people like Erica and Winston moved to avoid morally salient information? There are at least two main reasons:
Moral Reputation – People care what others think of them. Strategic ignorance helps preserve a public image. And if you didn’t know, you’re harder to blame (Zimmerman 2008).
Moral Identity – People want to see themselves as good. They avoid morally disruptive knowledge because they want to maintain a positive self-conception. If I don’t know, I can keep telling myself I’m a decent person.
These motives explain why people sometimes go out of their way not to know things they could, and often should, know.
Is Ignorance a Moral Defect?
The philosopher Duncan Pritchard argues that ignorance is more than a lack of knowledge — it results from epistemic failure (2021). If someone should have known something, and they didn’t, we rightly say they’re ignorant. It is not a compliment.
In the same way, strategic ignorance can be a moral failure. If someone should have known that their actions were wrong, and chose to stay in the dark — especially knowing, or at least reasonably suspecting, that learning would morally motivate them — then their ignorance is a moral defect. Barring some moral exception, they have an obligation to acquire the morally efficacious knowledge.
Conclusion—What You Oughta Know!
There are many people like Erica and Winston, who appreciate that knowing more would morally obligate them, and who avoid that knowledge for precisely that reason. They are not indifferent. They are sensitive but evasive, and they want to avoid the costs of reform and reflection. Their incentives create a demand for just enough ignorance to serve strategic purposes.
When someone meets the two key conditions — when they are aware that knowledge would move them morally, and when they have the ability to acquire it and lack a good reason not to — they have a moral obligation to acquire the salient knowledge. So, in some cases, you oughta know. (Even Alanis Morissette agrees btw👇)
References
Altay, S., Hacquin, A.-S., & Mercier, H. (2020). Why Do So Few People Share Fake News? New Media & Society, 24(6).
Connor, M., & Armitage, C. J. (2008). Attitudinal Ambivalence. Psychology Press.
Dana, J. (2005). Strategic Ignorance of Harm. In D. Moore et al. (eds.), Conflicts of Interest.
De Brigard, F., & Stanley, M. (2021). Moral Memories and Identity Protection. Psychological Inquiry, 32(4).
Grossman, Z. (2014). Strategic Ignorance and Social Preferences. Management Science, 60(11).
Onwezen, M. C., & van der Weele, C. N. (2016). When Indifference Is Ambivalence. Food Quality and Preference, 52.
Pritchard, D. (2021). Ignorance and Inquiry. American Philosophical Quarterly, 58(2).
Simler, K., & Hanson, R. (2018). The Elephant in the Brain. Oxford University Press.
Smith, H. (1983). Culpable Ignorance. Philosophical Review, 92(4).
Zimmerman, M. (2008). Living with Uncertainty. Cambridge University Press.
‘You oughta know: on the possibility of morally mandatory knowledge’: Episteme (forthcoming).