Tyler Cowen Never Fails to Disappoint: His Evolving Views on Climategate
For those of you who don’t know, Tyler Cowen is an economics professor at George Mason University. When I realized he had a popular blog, I was very excited because GMU students (whom I liked) had told me, “Tyler is the smartest libertarian alive” and things of that nature. So I was preparing to be dazzled.
Well, just like Star Wars Episode I, Tyler did not live up to my impossible expectations. That, I think, explains why I get so frustrated with him.
Anyway, one of the cute things about Tyler is that his opinions are as unstable as global mean temperatures. Take Climategate for instance. When the scandal first broke Tyler said:
I’ve had many readers emailing me, asking what I think of the “trove” of emails unearthed from climate change researchers. I’ll admit I haven’t read through the rather embarrassing revelations, I’ve only read a few media summaries and excerpts. I see a few lessons:
1. Do not criticize other people in emails or assume that your emails will remain confidential, especially if you are working on a politically controversial topic. Ask a lawyer about this, if need be. “Duh,” they will say to you.
2. The Jacksonian mode of discourse, or mode of conduct for that matter, can do harm to your cause, especially if you are otherwise trying to claim the scientific high ground.
The substantive issues remain as they were. In Bayesian terms, if it turns out that many leading scientists do not practice numbers one and two, I am surprised that you are surprised. It’s very often that the scientific consensus “sounds that way.”
In other words, I don’t think there’s much here, although the episode should remind us of some common yet easily forgotten lessons.
I should add that this episode will seem very important to you, if you conceive of the matter in terms of the moral qualities of “us vs. them.”
OK so the Bayesian response is that this shouldn’t affect your opinion of the underlying science of climate change.
Oops, that’s not quite right. After further review, Tyler now says:
Good vs. evil thinking causes us to lower our value of a person’s opinion, or dismiss it altogether, if we find out that person has behaved badly. We no longer wish to affiliate with those people and furthermore we feel epistemically justified in dismissing them.
Sometimes this tendency will lead us to intellectual mistakes.
Take Climategate. One response is: 1. “These people behaved dishonorably. I will lower my trust in their opinions.”
Another response, not entirely out of the ballpark, is: 2. “These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for. I will revise upward my estimate of the seriousness of the problem.”
I am not saying that #2 is correct, I am only saying that #2 deserves more than p = 0. Yet I have not seen anyone raise the possibility of #2. It very much goes against the grain of good vs. evil thinking: Who thinks in terms of, “They are evil, therefore they are more likely to be right”?
(Which views or goals of yours would you behave dishonorably for? Are they all your least correct views or least important goals? With what probability? Might it include the survival of your children?)
I do understand that this line of reasoning can be abused: “The Nazis went to a lot of trouble, etc.” The Bayesian point stands.
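Tyler’s two responses amount to a disagreement about a likelihood ratio: is dishonorable behavior more likely when the underlying case is weak, or when the scientists believe the stakes are enormous? A minimal Bayes’-rule sketch makes the mechanics explicit (every number here is hypothetical, chosen purely to illustrate how the same evidence can push the posterior in opposite directions depending on the assumed likelihoods):

```python
# Bayes' rule applied to the two possible readings of Climategate.
# All probabilities below are made up for illustration only.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(hypothesis | evidence) via Bayes' rule."""
    joint_true = prior * p_evidence_if_true
    joint_false = (1 - prior) * p_evidence_if_false
    return joint_true / (joint_true + joint_false)

prior = 0.5  # prior that the problem is as serious as claimed

# Response 1: dishonorable behavior is more likely when the case is
# weak (you fudge because the data won't do the work), so observing
# it lowers the posterior.
down = posterior(prior, p_evidence_if_true=0.2, p_evidence_if_false=0.4)

# Response 2: only people convinced the stakes are huge would risk
# their reputations, so the behavior is more likely when the problem
# really is serious, and the posterior goes up.
up = posterior(prior, p_evidence_if_true=0.4, p_evidence_if_false=0.2)

print(round(down, 3), round(up, 3))  # prints "0.333 0.667"
```

The evidence itself is identical in both branches; everything turns on which likelihood you assign, which is exactly the assumption Tyler quietly swapped between his two posts.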
So now, if you were a sloppy reader, you would come away thinking the Bayesian response is to raise your estimate of the severity of manmade climate change.
Now clearly something must have happened to move Tyler from his initial reaction (“this shouldn’t affect your opinion of the science at all”) to his new one (“this could very well mean you were previously underplaying the significance of the threat”).
I wonder which of the CRU emails made him change his mind? (A Bayesian can only change his mind when new evidence comes in to make him update his priors.) Remember, in his first reaction the sophisticated Tyler thought people were naive for not realizing how saucy academics could be.
So what method of dishonorable behavior, which he read about only after his first post, shocked even Tyler into realizing that this episode was unusual indeed, and that maybe the problem was more severe than the average person would have initially thought?
I cannot rule out with 95% confidence the null hypothesis that Tyler wakes up every morning and says, “How can I annoy Bob Murphy today?”
UPDATE: Okay I put my finger on something else that really bothers me about Tyler’s latest. It’s not as if these scientists (especially the RealClimate guys) in the spotlight have been meek about telling the public how serious the problem is. So for Tyler’s story to work, the Bayesian would have to reason like this:
“Wow! Before when these scientists were telling me it was urgent that we stabilize CO2 concentrations at x ppm or else our grandchildren were going to drown, I thought they were lying. But now that I see they are willing to delete emails, bully journal editors, and tinker with graphs to remove doubts in the mind of the reader, I no longer think they were lying about how serious the problem is.”
Last point: What is also annoying is that I think Tyler plays favorites. He is a really sharp guy and he uses his creativity to invent (what are in my opinion) absurd rationalizations for things that often increase State power. That per se doesn’t make him wrong, it just annoys me because I think he’s inconsistent and flippant on things that have serious consequences. For example, suppose after the waterboarding stuff broke someone on Fox News had argued, “You see, the critics think this has all been about Bush funneling money to Halliburton, but it really is about protecting Americans. Do you really think they would waterboard someone 100+ times, if they didn’t really think they were saving lives?” Now would Tyler have blogged about what a great epistemologist this hypothetical commentator was? I seriously doubt it.