Monday, November 23, 2009

The code

This is a new thread for updates on the analyses of the data and code freed from CRU.

Everybody, I'm sinking under the weight of things to do here. I need you to post one- or two-line analyses of what you are finding in which bits of code. I'll transfer these to the main post as they come in. Each needs to be in layman's language and to include a link to your work.

CRU code

  • Francis at L'Ombre de l'Olivier says the choice of coding language is inappropriate, and notes inappropriate use of hard-coding, incoherent file-naming conventions, subroutines that fail without telling the user, and so on.
  • AJStrata discovered a file with two runs of CRU land temperature data which show no global warming in the data laid out by country, and another CRU file showing their sampling error to be +/- 1°C or worse for most of the globe. Both CRU files show there has been no significant warming in the post-1960 era.
  • A commenter notes the following comment in some of the code: "***** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********"
  • A good layman's summary of some of the coding issues with a file called "Harry". This appears to be the record of some poor soul trying to make sense of how the code for producing the CRU temperature records works. (Rude words, though, if you're a sensitive type.)
  • Some of the annotations in the Harry file are priceless - "OH **** THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm hitting yet another problem that's based on the hopeless state of our databases. There is no uniform data integrity, it's just a catalogue of issues that continues to grow as they're found."
  • CRU's data-collation methods also seem, ahem, amusing: "It's the same story for many other Russian stations, unfortunately - meaning that (probably) there was a full Russian update that did no data integrity checking at all. I just hope it's restricted to Russia!!"
  • Borepatch discovers that CRU has lost its metadata. That's the bit that tells you where to put your temperature record on the map and so on.
  • Mark in the comments notices a file called resid-fudge.dat, which he says contains, believe it or not, fudged residual figures!
  • Mark in the comments notes a program comment - "Apply a VERY ARTIFICAL correction for decline!!" - followed by the words "fudge factor". See briffa_sep98_d.pro, and the sketch after this list.
  • From the programming file combined_wavelet.pro, another comment, presumably referring to the famous Briffa truncation: "Remove missing data from start & end (end in 1960 due to decline)".
  • From the file pl_decline.pro: "Now apply a completely artificial adjustment for the decline (only where coefficient is positive!)"
  • From the file data4alps.pro: "IMPORTANT NOTE: The data after 1960 should not be used. The tree-ring density records tend to show a decline after 1960 relative to the summer temperature in many high-latitude locations. In this data set this "decline" has been artificially removed in an ad-hoc way, and this means that data after 1960 no longer represent tree-ring density variations, but have been modified to look more like the observed temperatures."
  • From the Harry readme: "What the hell is supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)... So with a somewhat cynical shrug, I added the nuclear option - to match every WMO possible, and turn the rest into new stations (er, CLIMAT excepted). In other words, what CRU usually do. It will allow bad databases to pass unnoticed, and good databases to become bad, but I really don't think people care enough to fix 'em, and it's the main reason the project is nearly a year late." (See Harry readme para 35.)
  • James in the comments says that in the file pl_decline.pro the code seems to be reducing temperatures in the 1930s and then adding a parabola to the 1990s. I don't think you need me to tell you what this means.

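For readers without IDL, here is what the much-quoted adjustment in briffa_sep98_d.pro actually does, re-expressed as a short Python sketch (numpy standing in for IDL's findgen and interpol; the year axis timey is a stand-in, since the surrounding program isn't reproduced here):

import numpy as np

# Knot years: 1400, then 1904, 1909, ..., 1994
# (IDL: yrloc = [1400, findgen(19)*5. + 1904])
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))

# The "fudge factor" array, scaled by 0.75 exactly as in the original
valadj = np.array([0.0, 0.0, 0.0, 0.0, 0.0, -0.1, -0.25, -0.3, 0.0, -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

timey = np.arange(1400, 1995)  # stand-in for the series' year axis

# IDL: yearlyadj = interpol(valadj, yrloc, timey)
# (np.interp holds the end values outside the knot range, where IDL extrapolates)
yearlyadj = np.interp(timey, yrloc, valadj)

# The program then adds this to the density series: densall = densall + yearlyadj
for yr in (1904, 1939, 1959, 1979, 1994):
    print(yr, round(float(np.interp(yr, yrloc, valadj)), 3))

Run it and you can see the shape of the correction at a glance: zero in the early century, slightly negative around the 1930s-40s, then climbing to +1.95 by the late 1970s.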


References (3)

References allow you to track sources for this article, as well as articles that were written in response to this article.
  • Response
    Charming chap, by all accounts.Ably aided by UKian the Baroness Ashton. "Last week she was unknown in Britain. Today she is unknown all over Europe". Call Me Dave's previous "we will not let matters there" now turns out to be "I don't want an in or out...
  • Response
    From the file pl_decline.pro: check what the code is doing! It's reducing the temperatures in the 1930s, and introducing a parabolic trend into the data to make the temperatures in the 1990s look more dramatic. - Recycled to a separate posting today by ClimateGate blogstar Bishop Hill from among the comments ...
  • Response
    (Cross-posted at The North Cave) In the long week since the now-famous CRU data came to public light in the blogosphere, the 160 megabytes of data have already been explored considerably, thanks to the efforts of dozens...

Reader Comments (99)

Climate (or, the author of this blog):

You're a sub-intellectual with just enough pretensiousness to think people want to read what you have to say; though I doubt some braindead sheep DO want to read what you have to say.

Nov 23, 2009 at 10:01 PM | Unregistered Commenterhfj

Their comments say they lost metadata and couldn't find it:

Bear in mind that there is no working synthetic method for cloud, because Mark New
lost the coefficients file and never found it again (despite searching on tape
archives at UEA) and never recreated it. This hasn't mattered too much, because
the synthetic cloud grids had not been discarded for 1901-95, and after 1995
sunshine data is used instead of cloud data anyway.

This is from README_GRIDDING.TXT.

Nov 23, 2009 at 10:31 PM | Unregistered CommenterBorepatch

@ hfj

"You're a sub-intellectual with just enough pretensiousness to think people want to read what you have to say; though I doubt some braindead sheep DO want to read what you have to say."

Your AGW shill grant coming up for renewal, perchance?

Nov 23, 2009 at 10:31 PM | Unregistered CommenterJabba the Cat

Just wanted to say you are doing a great job. If I knew anything about code I would gladly spend all my free time helping.

Nov 23, 2009 at 10:34 PM | Unregistered CommenterVinnster

I've worked in computational chemistry for a while... this is not surprising to me. They are scientists, not software engineers (and they think they are above asking a software engineer for help.)

Nov 23, 2009 at 10:38 PM | Unregistered Commenterlukas

My Dear Bishop, I imagine you will leave the opening comment up. It is perfectly illustrative of the cast of mind of one strand of AGW opinion: shrill, adolescent anger. I suspect it is there, half buried, in some allegedly scientific types.

Nov 23, 2009 at 10:46 PM | Unregistered CommenterJeff Wood

Jeff

I wouldn't dream of snipping a work of art.

Nov 23, 2009 at 11:30 PM | Registered CommenterBishop Hill

This comment pops up in a few of the source files:

;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

:-D

Nov 24, 2009 at 3:58 AM | Unregistered CommenterThortung

Very good. I discovered a file with two runs of CRU land temp data which show no global warming in the data laid out by country, and another CRU file showing their sampling error to be +/- 1°C or worse for most of the globe. Both CRU files show there has been no significant warming in the post-1960 era.

Link: http://strata-sphere.com/blog/index.php/archives/11420

Nov 24, 2009 at 4:23 AM | Unregistered CommenterAJStrta

"You're a sub-intellectual with just enough pretensiousness to think people want to read what you have to say;..."


Would that be pretentiousness, Einstein?

Nov 24, 2009 at 4:36 AM | Unregistered CommenterSteamboat McGoo

is "hfj" Frank O'Dwyer gone bad?? Whoever it is, thanks for the self-parody. You can't buy laughs like that.

Nov 24, 2009 at 8:30 AM | Unregistered CommenterSebastian Weetabix

I dunno about you guys, but I'm gonna open a used hockey stick shop... just imagine when all this Al-Gore-rithm shit hits the fan, the lines of angry Americans lining up to buy one!

Wonder who would get pucked.

Nov 24, 2009 at 12:34 PM | Unregistered CommenterJonesy

Wikipedia articles that should record this event in a neutral and balanced manner, with journalist-written sources (best to discuss on the talk page, don't just start editing directly):

http://en.wikipedia.org/wiki/Climatic_Research_Unit_e-mail_hacking_incident
http://en.wikipedia.org/wiki/Phil_Jones_(climatologist)
http://en.wikipedia.org/wiki/Michael_E._Mann
http://en.wikipedia.org/wiki/Hockey_stick_controversy

Nov 24, 2009 at 12:38 PM | Unregistered Commentergravamen

See this email conversation involving Ian (Harry) Harris, Tim Osborn, and Phil Jones -- another great example of "consensus" science:

http://www.eastangliaemails.com/emails.php?eid=1009&filename=1252090220.txt

If this isn't cooking the books, then I don't know what is. Since when does scientific data have to "look good", and since when do scientists need to "be happy with the version we release"? Also, what the hell is an "IDL thingummajig"? Some magic toaster used to make climate change guano?

Just report the facts, that's all we ask.

Nov 24, 2009 at 12:56 PM | Unregistered CommenterPaul Z.

When the whole truth finally comes out, I'll bet we find that "hacking" wasn't involved.

It had to have been either an insider (with a couple of login IDs) swiping (or even collecting and zipping the contents of) the 62 mb zipfile, or someone accidentally finding it on an unprotected server. Subsequent posting onto RC and the Russian server was done through an anonymous server - a simple technique widely used by lots of (non-hacker) folks to obscure their identity.

But CRU and RC will continue to spin it as "hacking" because it's to their advantage to do so...

Nov 24, 2009 at 1:01 PM | Unregistered CommenterSteamboat McGoo

Gravamen, it's really hard to edit Wikipedia articles related to climate change, because the AGW side has an army of wiki-editor gatekeepers who will come up with some bullshit reason to remove what you've written. I've experienced this so many times. Just try writing something on the IPCC or Al Gore pages; you'll see what I mean.

A better strategy is to write a Google Knol on the subjects you have outlined, at:
http://knol.google.com/k

No editorial censorship from AGW zealots.

Nov 24, 2009 at 1:03 PM | Unregistered CommenterPaul Z.


But CRU and RC will continue to spin it as "hacking" because it's to their advantage to do so...

I believe you are absolutely correct. I have been working in the computer science field for almost 30 years now, and this does not strike me as the work of "hackers" (I used to be one).

I am not buying a word that Gavin Schmidt says about it. I do not believe RC was ever "hacked", or that there was even an attempt made. Hacking it would have been a waste of time, with unnecessary exposure and risk; it simply doesn't make sense.

RC is simply trying to play the victim card in a broad CYA attempt. It's simply a diversionary tactic.

Nov 24, 2009 at 2:56 PM | Unregistered CommenterSquidly

Experimental results of running the CRU code* here.

*Tongue firmly planted in cheek.

Nov 24, 2009 at 4:44 PM | Unregistered CommenterBorepatch

More gems from the code:

From the programming file called "briffa_sep98_d.pro":

yyy=reform(compmxd(*,2,1))
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)
;

November 24, 2009 | mark
From the programming file "combined_wavelet.pro":

restore,filename='combtemp'+regtit+'_calibrated.idlsave'
;
; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)

November 24, 2009 | mark
From the programming file "testeof.pro":


; Computes EOFs of infilled calibrated MXD gridded dataset.
; Can use corrected or uncorrected MXD data (i.e., corrected for the decline).
; Do not usually rotate, since this loses the common volcanic and global
; warming signal, and results in regional-mean series instead.
; Generally use the correlation matrix EOFs.
;

November 24, 2009 | mark
From the programming file: "pl_decline.pro":

;
; Now apply a completely artificial adjustment for the decline
; (only where coefficient is positive!)
;
tfac=declinets-cval

November 24, 2009 | mark
From the programming file "olat_stp_modes.pro":

;***TEMPORARY REPLACEMENT OF TIME SERIES BY RANDOM NOISE!
; nele=n_elements(onets)
; onets=randomn(seed,nele)
; for iele = 1 , nele-1 do onets(iele)=onets(iele)+0.35*onets(iele-1)
;***END
mknormal,onets,pctime,refperiod=[1922,1995]
if ivar eq 0 then begin
if iretain eq 0 then modets=fltarr(mxdnyr,nretain)
modets(*,iretain)=onets(*)
endif
;
; Leading mode is contaminated by decline, so pre-filter it (but not
; the gridded datasets!)
;

November 24, 2009 | mark
From the programming file "data4alps.pro":

printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
;

November 24, 2009 | mark
From the programming file "mxd_pcr_localtemp.pro"

;
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don't do special PCR for the modern period (post-1976),
; since they won't be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are "manually" removed because
; they are isolated and away from any trees.
;

November 24, 2009 | mark
From the programming file "calibrate_mxd.pro":

;
; Due to the decline, all time series are first high-pass filter with a
; 40-yr filter, although the calibration equation is then applied to raw
; data.
;

November 24, 2009 | mark
From the programming file "calibrate_correctmxd.pro":

; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.

November 24, 2009 | mark
From the programming file "mxdgrid2ascii.pro":

printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'
printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'
printf,1,'will be much closer to observed temperatures then they should be,'
printf,1,'which will incorrectly imply the reconstruction is more skilful'
printf,1,'than it actually is. See Osborn et al. (2004).'
printf,1
printf,1,'Osborn TJ, Briffa KR, Schweingruber FH and Jones PD (2004)'
printf,1,'Annually resolved patterns of summer temperature over the Northern'
printf,1,'Hemisphere since AD 1400 from a tree-ring-density network.'
printf,1,'Submitted to Global and Planetary Change.'
;

November 24, 2009 | mark
From the programming file "maps24.pro":

;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;
if n_elements(yrstart) eq 0 then yrstart=1800
if n_elements(doinfill) eq 0 then doinfill=0
if yrstart gt 1937 then message,'Plotting into the decline period!'
;
; Now prepare for plotting
;

November 24, 2009 | mark
From the programming file "calibrate_correctmxd.pro":

;
; Now verify on a grid-box basis
; No need to verify the correct and uncorrected versions, since these
; should be identical prior to 1920 or 1930 or whenever the decline
; was corrected onwards from.
;

November 24, 2009 | mark
From the programming file "recon1.pro":

;
; Computes regressions on full, high and low pass MEAN timeseries of MXD
; anomalies against full NH temperatures.
;
; Specify period over which to compute the regressions (stop in 1940 to avoid
; the decline
;
perst=1881.
peren=1960.
;

November 24, 2009 | mark
From the programming file "calibrate_nhrecon.pro":

;
; Calibrates, usually via regression, various NH and quasi-NH records
; against NH or quasi-NH seasonal or annual temperatures.
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
;
perst=1881.
peren=1960.

November 24, 2009 | mark
From the programming file "briffa_sep98_e.pro":

;
; PLOTS 'ALL' REGION MXD timeseries from age banded and from hugershoff
; standardised datasets.
; Reads Harry's regional timeseries and outputs the 1600-1992 portion
; with missing values set appropriately. Uses mxd, and just the
; "all band" timeseries
;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
;

Nov 24, 2009 at 6:37 PM | Unregistered Commentermark

Hmm.

It's so difficult to make judgement calls on this issue, but I do welcome the debate. I am rather sick of the high-profile deniers, and to be honest I am occasionally revolted by green scientists not being able to discuss the remote possibility of some (minor or possibly major) issues with the data.

Surely having more information in the public domain cannot be a bad thing. We paid for the research; I'd like to see a little more than a graph at the end of it.

Nov 24, 2009 at 7:35 PM | Unregistered CommenterLed

Surely the most damning single line of code in the whole archive:

valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor

Adjustments applied to 5-year-averaged data, I take it. Push down the '40s, boost the latest.

Amazing.

Nov 24, 2009 at 7:36 PM | Unregistered CommenterMolon Labe

There's a data file at mbh98-osborn.zip\mbh98-osborn\TREE\COMPARE called, believe it or not,

resid-fudge.dat

which indeed appears to be fudged residuals from tree ring data by year.

Nov 24, 2009 at 8:39 PM | Unregistered Commentermark

Thanks to everyone devoting time to the files.

Nov 24, 2009 at 8:43 PM | Unregistered CommenterSpartan

Molon Labe

Can you point us to the file please. Looks important.

Nov 24, 2009 at 9:00 PM | Registered CommenterBishop Hill

FORTRAN????

I haven't seen FORTRAN for at least 30 years! When I did that kind of programming it was on punch cards. Obviously, "Climate Playstation Software Designers" use it. This makes no sense, unless you are trying to be difficult to understand. While working on my advanced degree I could not even find a Fortran compiler on campus. After digging my old programming books out of the mothballs I started looking at this cryptic spaghetti code, and it makes my head hurt. But the comments in the code make me very angry. I guess that those comments were just off the cuff remarks, right.

To the AGW true believers: please find a less measurable fixation to latch onto. I might suggest nicely shaped quartz crystals, alien abduction, phrenology, or maybe Scientology. Your belief system is no longer viable as an intact, cogent system. It is similar to the Enron fiasco: once the door is open and the light let in, it will never be the same.

Nov 24, 2009 at 9:02 PM | Unregistered CommenterRichard Percifield

To Bishop Hill: the file mentioned by Molon Labe & in my earlier post is "briffa_sep98_d.pro"

BTW, the same code appears in the follow-up program "briffa_sep98_e.pro".

Nov 24, 2009 at 9:12 PM | Unregistered Commentermark

Bish, in case Molon Labe has gone to bed, Google gives:

\FOIA\documents\osborn-tree6\briffa_sep98_d.pro

Nov 24, 2009 at 9:16 PM | Unregistered Commenteradamskirving

@Richard: Meh. Sigh. Don't bash the use of FORTRAN, which is still alive and well, thank you. I even have a compiler running at the moment. Instead look at the enormous programming atrocities found in the code.

@Molon Labe: Yep. I found that too.

And if somebody wants to run IDL scripts: there is a GNU version called GDL; see http://gnudatalanguage.sourceforge.net/.
F77 compilers are in all Linux distros; I could not find a free F90 compiler which works. I have a DEC F90 compiler running, but that one isn't free and runs only under WIN2K.

Pjotr

Nov 24, 2009 at 9:16 PM | Unregistered CommenterPjotrk

Thanks Mark.

Nov 24, 2009 at 9:20 PM | Registered CommenterBishop Hill

I'm sorry. I was not trying to take credit for finding that. I culled it from the earlier post that Mark made.

Nov 24, 2009 at 9:24 PM | Unregistered CommenterMolon Labe

Hello Pjotrk,

I apologize if it appeared that I was bashing Fortran; I have fond memories of the language. However, I do not have a compiler for Fortran, and have not rebuilt my Linux box since the surge hit (splat). Unlike others I have only taken a cursory look at the code, and may take a harder look over the holidays. I am a hardware engineer, and generally stay away from code (unless it is assembly; I know, I am a bit-level control freak). What is even scarier is that I was a meteorology student 30 years ago.

I would have thought that, with the processor-intensive simulations to be done, more advanced coding options would have been used. So to all who have reviewed the code, I say thanks.

Nov 24, 2009 at 9:33 PM | Unregistered CommenterRichard Percifield

I'm about to leave the house for a violent session of "hide the beer", so no details, sorry - but a quick run of:
> grep -iR artificial FOIA/documents/

is extremely enlightening.

I dunno what you windows users use for recursively searching for case-insensitive strings, but I truly loves me my /usr/bin/grep

Nov 24, 2009 at 9:43 PM | Unregistered CommenterThe Dread Pirate Neck Beard

Why do they hate Nepal??

mkregions.pro
;
; Remove Nepal from them all too, individually **********************
;
;keeplist=where(ylat ge 30.,maxtrees)
;xlon=xlon(keeplist)
;ylat=ylat(keeplist)
;
; Define arrays
;
maxtrees=n_elements(xlon)
regname=['ALL','NORTH_OF_60','SOUTH_OF_60','WNA','ENA','CER','NEUR','SEUR',$
'ARCTIC','EXTRA-ARCTIC']
nreg=n_elements(regname)
ntree=intarr(nreg)
treelist=fltarr(maxtrees,nreg) & treelist(*,*)=!values.f_nan
;
; 1) All (except Nepal)
;
i=0
keeplist=where(ylat ge 30.,nkeep)
ntree(i)=nkeep
treelist(0:nkeep-1,i)=keeplist(*)
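For what it's worth, the exclusion appears to work by latitude: Nepal spans roughly 26-30°N, so the "ylat ge 30." test drops its tree sites along with everything else south of 30°N. A minimal Python sketch of the same selection (numpy standing in for IDL; the site coordinates here are invented for illustration):

import numpy as np

ylat = np.array([27.7, 28.2, 45.0, 62.3, 35.1])    # hypothetical site latitudes
xlon = np.array([85.3, 83.9, -110.0, 30.0, 90.0])  # matching longitudes

# IDL: keeplist = where(ylat ge 30., nkeep)
keep = np.where(ylat >= 30.0)[0]
nkeep = keep.size
xlon, ylat = xlon[keep], ylat[keep]
print(nkeep, "sites kept")  # the two sub-30N (Nepal-latitude) sites are dropped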

Nov 24, 2009 at 10:30 PM | Unregistered CommenterHPX

"I guess that those comments were just off the cuff remarks, right."
In frustrato, veritas.

Nov 24, 2009 at 10:58 PM | Unregistered Commenterdearieme

I am rather sick of the high-profile deniers, and to be honest I am occasionally revolted by green scientists not being able to discuss the remote possibility of some (minor or possibly major) issues with the data.

Led, this might be a good time to expunge the term "denier", which was always used pejoratively, and now appears to have been used largely for people like McIntyre who seem to have been proven right.

Nov 24, 2009 at 11:09 PM | Unregistered CommenterCharlie Martin

@hfj

Do you have a proper response or just name calling?

[snip - let's not let this get out of hand]

Nov 24, 2009 at 11:37 PM | Unregistered CommenterCorrugated Soundbite

Fox News is not really picking up the story. Sure, they had it on Beck, but that's only part of the day. Are so many of the news outlets that WOULD pick up on this story not running it because they simply don't understand it yet? Or have they decided to drop the ball too?

Nov 25, 2009 at 12:48 AM | Unregistered Commenterslayer

Another gem from briffa_sep98_decline1.pro:

;
; On a site-by-site basis, computes MXD timeseries from 1902-1976, and
; computes Apr-Sep temperature for same period, using surrounding boxes
; if necessary. Normalises them over as common a period as possible, then
; takes 5-yr means of each (fairly generous allowance for
; missing data), then takes the difference.
; Results are then saved for briffa_sep98_decline2.pro to perform rotated PCA
; on, to obtain the 'decline' signal!
;
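In outline, that per-site computation (normalise the two series over a common period, take 5-year means of each, difference them) is straightforward. A Python sketch under those assumptions - not CRU's code, and with random stand-in data:

import numpy as np

years = np.arange(1902, 1977)
mxd = np.random.randn(years.size)    # stand-in for a site's density series
temp = np.random.randn(years.size)   # stand-in for Apr-Sep temperature

def norm(x):
    return (x - x.mean()) / x.std()  # normalise over the common period

def five_year_means(x):
    n = x.size - x.size % 5          # trim to a whole number of pentads
    return x[:n].reshape(-1, 5).mean(axis=1)

# per-site input to the 'decline' PCA: density minus temperature, by pentad
decline_series = five_year_means(norm(mxd)) - five_year_means(norm(temp))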

Nov 25, 2009 at 12:49 AM | Unregistered Commentermark

Fortran is still the normal language to work with in many scientific circles. You join a research team, everyone else uses Fortran, so you do too. I did an applied math Ph.D., and there was a division between those who had taken some Comp Sci courses as undergrads (and often preferred more modern languages) and those without, who picked up Fortran from their elders in the field along the way. I was in the former category, and my models tended to be written in C/C++ with lots of calls to Fortran libraries. After a while, even people like me end up writing more Fortran for the sake of consistency and/or ease of swapping things with other people. Fortran was designed by and for mathematicians, but a long time ago. As such, it incorporates various mathematical conventions that more modern languages tend not to. Plus you can write surprisingly low-level code with it if you really want to, are trying to get the maximum performance out of the hardware, and know what you are doing.

So I have no real problem with these people using Fortran. This is to be expected. As others have said, it is not an ideal language for things like text processing, but that isn't really the point; most of these people are unlikely to have seen anything else. Comp Sci and Applied Math are different worlds. What matters is the mathematical code, and a lot of it seems very sloppy at best.

However:

valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor

OMG

Nov 25, 2009 at 1:31 AM | Unregistered CommenterMichael Jennings

Compare the two files resid-fudge.dat & resid-best.dat, both located in mbh98-osborn.zip\mbh98-osborn\TREE\COMPARE\

All the numbers from 1700 up to 1960 are identical, all calculated to SIX decimal places.

But in resid-best.dat, the years from 1961 onward all read "0.24". 1960 is the year, I think, that they talked about hiding the decline in some of the program files. The identical repeated value, plus the sudden change to two decimal places after 300 years, is a clear indication of data being artificially manipulated.

resid-best.dat
---------------------
1956 0.216662
1957 0.222889
1958 0.229115
1959 0.236529
1960 0.243945
1961 0.24
1962 0.24
1963 0.24
1964 0.24
1965 0.24
1966 0.24
1967 0.24
1968 0.24
1969 0.24
1970 0.24
1971 0.24
1972 0.24
1973 0.24
1974 0.24
1975 0.24
1976 0.24
1977 0.24
1978 0.24
1979 0.24
1980 0.24

resid-fudge.dat:
---------------------
1956 0.216662
1957 0.222889
1958 0.229115
1959 0.236529
1960 0.243945
1961 0.25247
1962 0.260995
1963 0.270529
1964 0.280065
1965 0.290492
1966 0.300919
1967 0.312104
1968 0.323288
1969 0.335084
1970 0.346879
1971 0.359128
1972 0.371377
1973 0.383913
1974 0.396452
1975 0.409108
1976 0.521766
1977 0.534374
1978 0.546982
1979 0.559376
1980 0.771771
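If anyone wants to check this for themselves, a throwaway Python script along these lines will print every year where the two files disagree (it assumes the plain two-column "year value" layout shown above, which is my guess at the format):

def read_resid(path):
    # parse a two-column "year value" file into {year: value}
    out = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2:
                out[int(parts[0])] = float(parts[1])
    return out

best = read_resid("resid-best.dat")
fudge = read_resid("resid-fudge.dat")
for year in sorted(set(best) & set(fudge)):
    if best[year] != fudge[year]:
        print(year, best[year], fudge[year], round(fudge[year] - best[year], 6))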

Nov 25, 2009 at 1:36 AM | Unregistered CommenterRyan

Let's take a look at exactly how the FORTRAN (glad to see someone still appreciates the
original programming language) programs briffa_sep98_e.pro & briffa_sep98_d.pro apply their

"****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********"

My comments are within [[[ ]]]; otherwise the code snippets are as they appear in the file.


;
; PLOTS 'ALL' REGION MXD timeseries from age banded and from hugershoff
; standardised datasets.
; Reads Harry's regional timeseries and outputs the 1600-1992 portion
; with missing values set appropriately. Uses mxd, and just the
; "all band" timeseries
;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
;
yrloc=[1400,findgen(19)*5.+1904]


[[[ this creates 20 consecutive 5-year subsets (possibly averaged) of the tree-ring data by date, starting in year 1904 ]]]


valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,
2.6,2.6,2.6]*0.75 ; fudge factor


[[[ these are the 20 different "fudge factor(s)" - the programmer's words, not mine -
to be applied to the 20 different subsets of data. Here are those fudge factors
with the corresponding years for the 20 consecutive 5-year periods:

Year Fudge Factor
1904 0
1909 0
1914 0
1919 0
1924 0
1929 -0.1
1934 -0.25
1939 -0.3
1944 0
1949 -0.1
1954 0.3
1959 0.8
1964 1.2
1969 1.7
1974 2.5
1979 2.6
1984 2.6
1989 2.6
1994 2.6
1999 2.6

A little further down, the program adjusts the 20 datasets with the corresponding fudge factors: ]]]

;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj

[[[ So, we leave the data alone from 1904-1928, adjust downward for 1929-1943, leave it alone again
for 1944-1948, adjust down for 1949-1953, and then, whoa, start an exponential fudge
upward (guess that would be the "VERY ARTIFICIAL CORRECTION FOR DECLINE" noted by
the programmer). Might this make data which don't show the desired trend - or, God forbid, show a
global temperature "DECLINE" - turn, after the "VERY ARTIFICIAL CORRECTION", into a hockey schtick - I mean stick? And "HIDE THE DECLINE"? You bet it would! ]]]

Nov 25, 2009 at 1:56 AM | Unregistered Commentermark

Also, the numbers in resid-fudge.dat differ from all of the other resid_xxxxxxx.dat files in the same directory from 1976-1979 by 1/10 of a degree, and 3/10 of a degree in 1980.

resid-best.dat (explained in my previous comment)
1976 0.24
1977 0.24
1978 0.24
1979 0.24
1980 0.24

All other resid_xxxxx.dat files except "resid-fudge.dat":
1976 0.421766
1977 0.434374
1978 0.446982
1979 0.459376
1980 0.471771

resid-fudge.dat
1976 0.521766 (0.1 higher)
1977 0.534374 (0.1 higher)
1978 0.546982 (0.1 higher)
1979 0.559376 (0.1 higher)
1980 0.771771 (0.3 higher)

Nov 25, 2009 at 2:09 AM | Unregistered CommenterRyan

This "correction for decline" is described in the Osborn, Briffa, Schweingruber, Jones (2004) paper cited above (Annually resolved patterns of summer temperature over the Northern Hemisphere since AD 1400).

"To overcome these problems, the decline is artificially removed from the calibrated tree-ring density series, for the purpose of making a final calibration. The removal is only temporary, because the final calibration is then applied to the unadjusted data set (i.e., without the decline artificially removed). Though this is rather an ad hoc approach, it does allow us to test the sensitivity of the calibration to time scale, and it also yields a reconstruction whose mean level is much less sensitive to the choice of calibration period."

In my opinion the approach requires more justification than provided in the article, but it isn't the smoking gun some are making it out to be.
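For concreteness, the procedure that paragraph describes - fit the calibration against the decline-corrected series, then apply the fitted coefficients to the uncorrected series - would look roughly like this. A Python sketch of my reading of the paper, not CRU's code, with random stand-in data:

import numpy as np

years = np.arange(1881, 1961)
temp = np.random.randn(years.size)                   # instrumental target (stand-in)
mxd_raw = temp + 0.5 * np.random.randn(years.size)   # uncorrected density series
mxd_corrected = mxd_raw.copy()                       # pretend the decline has been removed

# fit the calibration (ordinary least squares) against the corrected series...
a, b = np.polyfit(mxd_corrected, temp, 1)

# ...then apply it to the raw, unadjusted data, as the paper describes
reconstruction = a * mxd_raw + b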

Nov 25, 2009 at 2:42 AM | Unregistered CommenterMorgan

Morgan,
Thanks for the reference. Any idea why they did not provide the "fudge factors" used in the paper? Why are said fudge factors used to increase, decrease, or leave unchanged different sets of data (the paper in effect says only increase, because of a recent decline - and why that is permissible is questionable)? Why is there a huge, almost exponential increase in the fudge factors after 1958? I find no explanation of this. The programming files also use the words VERY ARTIFICIAL..., but the papers make this sound so routine and don't use that adjective. Why use the word VERY unless you are implying "too much"? Correct me if I'm wrong, but the programs proceed to plot out the fudged data and do not plot the data "without the decline artificially removed". If you are only using the fudged data for calibration purposes, why plot it? Just asking.

Nov 25, 2009 at 3:58 AM | Unregistered Commentermark

Also, if you are using "fudged" data to create a fudged calibration and then apply the fudged calibration to the raw data, don't you still end up with "fudged" data, i.e. GIGO? Just asking.

Nov 25, 2009 at 5:31 AM | Unregistered Commentermark

Mark

Yes, that's my immediate reaction. I can't lay my hands on a copy of the paper, though, so I'm not really sure what the explanation is, if indeed it's a valid one.

Nov 25, 2009 at 6:03 AM | Registered CommenterBishop Hill

Here it is (Google cache HTML version):

http://74.125.155.132/search?q=cache:q9oMbzZJrIEJ:www.cru.uea.ac.uk/~timo/papepages/pwosborn_summertemppatt_submit2gpc.pdf+Annually+resolved+patterns+of+summer+temperature+over+the+Northern+Hemisphere+since+AD+1400+briffa+filetype:pdf&cd=1&hl=en&ct=clnk&gl=us

Nov 25, 2009 at 6:12 AM | Unregistered Commentermark

mark:

That's the version I was working from.

"Any idea why they did not provide the "fudge factors" used in the paper?" Not really, except that there probably was no justification for the actual numbers (beyond "they work") and that the editor didn't require them to do so.

"why said fudge factors are used to increase or decrease or leave the same different sets of data?" Are you referring to the "fudge factors" having both positive and negative values (and zeroes)? Or are you saying that the factors themselves are sometimes added to the raw data, sometimes subtracted from them, and sometimes not used? Assuming you meant the former, it looks like they are trying to get a smoothly increasing time series over the calibration period.

The increase after 1958 is, I assume, because that is when the "divergence problem" kicks in (I'm very skeptical about the issue, but not surprised that Briffa et al. feel justified in correcting it). Why does the correction have the shape it does? Your guess is as good as mine.

Why plot the data without removing the fudge factors? That's a mystery, though I guess you'd want to visually inspect the series to make sure you hadn't cocked up the code. But if that's the case, why not plot the whole thing? Dunno.

GIGO? I suppose so, but how garbage-y? I don't know enough about the "calibration" to hazard a guess. All I'm pointing out is that the "very artificial adjustment" is applied as an intermediate step to allow this "calibration" to proceed, not to the final data (or at least that's what they claim). So it isn't the final reconstruction that is "made up", which is what some comments indicated.

Nov 25, 2009 at 7:53 AM | Unregistered CommenterMorgan

I've made some more comments here - http://www.di2.nu/200911/25.htm

Nov 25, 2009 at 9:54 AM | Unregistered CommenterFrancisT

Regarding searching in Windows:

Open a cmd shell and navigate to the required folder using 'cd foldername'; the Tab key can be used to autocomplete the name (WinXP).

Use 'find /i "trick" *.*' to perform a case-insensitive search. This works better than the Windows Explorer search, which queries the index catalogues (which I disable anyway).

Try these commands at a cmd shell ('>' sends the output of find to a file):
find /i "decline" *.* > search-result.txt
notepad search-result.txt
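Note that cmd's find only looks in the current folder. For a search that recurses like the 'grep -iR' suggested above and works the same on any platform, here is a minimal Python sketch (the folder name is the one used in that grep example):

import os

needle = "artificial"
for root, dirs, files in os.walk("FOIA/documents"):
    for name in files:
        path = os.path.join(root, name)
        try:
            with open(path, errors="ignore") as f:
                for lineno, line in enumerate(f, 1):
                    if needle.lower() in line.lower():
                        print(path + ":" + str(lineno) + ": " + line.rstrip())
        except OSError:
            pass  # skip unreadable files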

Nov 25, 2009 at 12:57 PM | Unregistered CommenterMyrddin Wyn
