One of the points of his defense is that post-modernists are infected with (among other things) emotion, and that their intellectual positions are therefore subjective, whereas scientists are objective and their positions therefore are not.
Then comes this blog post from Mother Jones, in which the writer, in an attempt to explain why many people "don't believe science," argues that the reason is that many people's reasoning is inextricably bound up with emotion and prejudice:
... an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, "death panels," the birthplace and religion of the president (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

Now there are several interesting things about this supposed "finding." The first is that it is yet another example of what I call "Duh" Science, which consists of studies that tell us what we didn't need a study to tell us. In this case, we are told that people are not completely objective in the process of forming their opinions.
The theory of motivated reasoning builds on a key insight of modern neuroscience (PDF): Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.
We're not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.
Gee, who'd o' thunk it?
But applying multi-syllabic Latinate words to this truth does yield that warm, fuzzy science feeling, I'll have to admit: "The 'cultural cognition of risk' refers to the tendency of individuals to form risk perceptions that are congenial to their values."
Then there is the way the discussion of this study itself displays how its finding applies to those touting the study:
So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr.) and numerous Hollywood celebrities (most notably Jenny McCarthy and Jim Carrey).

This is really the only case they can find among the lefties? Isn't that convenient. This problem of emotional and prejudicial opinion formation, it turns out, is mainly a problem on the right. "No one here but us objective liberals." Add this to the list (Popperian falsifiability, the positivist verifiability criterion) of principles that apply to everyone but those who spout them.
But, getting back to my friend's defense of positivism, it seems to me that, if the findings of this study are correct, they explode his hypothesis that scientists are objective, since the finding that there is a "tendency of individuals to form risk perceptions that are congenial to their values" should apply to positivists as much as to everyone else. And if the positivist types who promote such studies then exempt most of their own beliefs from the findings of their own studies, they have unwittingly provided further evidence for the studies' thesis, making the positivist position even more untenable.