Charles C.W. Cooke opened up a can of worms at NRO when he opined, on 7 May, that asking 8th graders to determine for themselves whether the Holocaust had actually happened was a perfectly good exercise in academic honesty. He certainly has no doubts about the reality of the Holocaust, he stressed. But he also thinks we shouldn’t fear the outcome of an empirical investigation into the question, which can be compared with debates over value questions like whether “war is always wrong,” or whether “sharia law would be good for the West.”
Jonah Goldberg responded promptly, making the main points that occurred immediately to me. First, 8th graders are not college students; it’s the rare 8th grader who is up to the rigors of an Oxford Union debate. Second, debating a factual point that’s subject to empirical disproof is a different game from debating what our moral values should prescribe for us (or, indeed, what those values should be).
Cooke acknowledged both points. He still thinks we shouldn’t reflexively object to a research-and-reasoning exercise on the reality of the Holocaust. In his response, he compares it to a similar exercise on the reality of the Declaration of Independence. In his view, it would be an opportunity to stress the value of empiricism and the use of critical thinking skills.
And while he doesn’t state it explicitly, an implied advantage of doing this with a topic like the Holocaust is that a student’s success would be measurable. Since we know there was a Holocaust, and we know a great deal about it, a student’s performance can be measured by whether he concludes that the Holocaust most certainly happened.
Right? Well, that’s where the elevation of abstract reasoning over all else breaks down for me. What about the student who doesn’t conclude that the Holocaust happened, but can make a compelling argument – in the abstract – that doubt, at least, should be cast on it; and can even “show his work,” very possibly running logical rings around the instructor?
The question about this scenario is not whether we should “fear” it. The question is what good it has done to put time and effort into it. (There’s also a question of whether putting effort into it is actively harmful. More on that in a minute.)
Not every intellectual freedom that we have an inherent right to is a priority for education. A 13-year-old could congratulate himself, for example, on demonstrating that there’s no objective proof of his existence; and from the standpoint of intellectual freedom, we’d all say, “Knock yourself out, buddy.” But whether we want him to devote his educational time to that effort, and what we expect him to get out of it, depend on our scale of values. Education is intrinsically a process in which we make value-based choices about what to emphasize and what to do. No such thing exists as abstract, non-value-weighted education.
There’s thus a big value difference between giving children the tools of thinking, and selecting things for them to approach with skepticism. The latter can never be a neutral action, in any context.
In recent decades, in fact, much of academia has made something of a fetish of it for very specific purposes, like casting doubt on whether America’s Founders were Christians. The upshot has been that “skepticism” itself is turned on its head: middle- and high-school students are encouraged to think they’ve been endowed with a refreshing skepticism on one topic or another, when in fact they’ve simply been indoctrinated with an arguable falsity.