Wednesday, November 25, 2009

The Data Is Right There, In The Correction Coefficients!

Nothing to see here:

Eric S. Raymond is a software developer and advocate of the open source software movement. He wrote a seminal paper called The Cathedral and the Bazaar, which explained why open processes are more effective than top-down ones. He has been studying the code used by the scientists at the Climatic Research Unit at the University of East Anglia, code that, along with the leaked emails, has raised serious questions about the quality of the research being used to underpin the proposed $1 trillion Cap'n Trade bill stalled in Congress. Here's what Eric found in the computer code:

From the CRU code file osborn-tree6/briffa_sep98_d.pro, used to prepare a graph purported to be of Northern Hemisphere temperatures and reconstructions.

;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)

This, people, is blatant data-cooking, with no pretense otherwise. It flattens a period of warm temperatures in the 1940s -- see those negative coefficients? Then, later on, it applies a positive multiplier so you get a nice dramatic hockey stick at the end of the century.
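If you want to see the shape of that adjustment for yourself, here is a rough NumPy translation of the quoted IDL. It is only a sketch: the annual 1400-1994 time axis for timey is my assumption (the original file defines it elsewhere), and np.interp stands in for IDL's interpol, both of which do linear interpolation.

import numpy as np

# Year grid: 1400, then 1904, 1909, ..., 1994  (IDL: [1400, findgen(19)*5.+1904])
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))

# The "fudge factor": additive adjustments, scaled by 0.75
valadj = 0.75 * np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                          0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6])

if yrloc.size != valadj.size:
    raise ValueError("Oooops!")   # mirrors the sanity check in the IDL

# Interpolate onto an annual axis (assumed 1400-1994 here)
timey = np.arange(1400, 1995)
yearlyadj = np.interp(timey, yrloc, valadj)

# Roughly zero before the 1920s, a dip of about -0.22 in the mid-1930s,
# then a steady climb to +1.95 from the mid-1970s onward.

Plot yearlyadj against timey and the shape Eric describes is plain: a curve sitting at or below zero through the mid-20th century, then climbing to about +1.95 by the 1990s.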

All you apologists weakly protesting that this is research business as usual and there are plausible explanations for everything in the emails? Sackcloth and ashes time for you. This isn't just a smoking gun, it's a siege cannon with the barrel still hot.

Or as one of the commenters on Eric's post put it:

They didn’t just cook the data; they marinated it for a week, put on a rub, laid it in the smoker for a day and a half, sliced it up, wrapped it in bacon, dipped it in batter, rolled it around in flour, and deep fried it.

American Thinker has more here.
