
If there is a divergence between proxy and instrument temperature data, what conclusions am I to draw?

If the proxies are not accurate in the current decades (compared to instruments), then how can I be assured that the historical proxies are accurate?

If the instruments are not accurate in current decades (compared to proxies), then how can I be assured that instrumental data is accurate?

It appears that there is a lot of work being done to eliminate this divergence, but it appears to be mathematical in nature. Is anyone working on refining the experimental parameters in the determination of temperature via proxy analysis?

http://www.agu.org/journals/jd/jd0717/2006JD008318/2006JD008318.pdf

Hopefully this link works.....

2007-09-19 07:10:33 · 4 answers · asked by Marc G 4 in Environment Global Warming

Keith-->

I can't fix the link, I think you need access to the AGU papers.

2007-09-19 08:32:23 · update #1

Trevor-->

You don't have access to the AGU journals?

Here is a link that may work. It is for the abstract:

http://www.agu.org/pubs/crossref/2007.../2006JD008318.shtml

I don't know if it will work, since it is via the AGU as well.

2007-09-19 11:11:27 · update #2

4 answers

Similarly, I couldn't access the website (password required or need to purchase the report) so this answer will be somewhat restricted.

In climatology the term 'proxy' has a different meaning to that used in other sciences and the general vernacular. It refers to data or information which by itself is of limited value but from which a variable of interest can be derived. Oxygen isotope and dendrological analyses are a couple of examples.

By definition, proxy data taken on its own is of limited reliability and limited use. I'm wondering whether what you're referring to is 'reconstructed data', that is, data from before the instrumental record.

It's a shame I can't access the link, as this is the type of thing that some sceptics round on and then pronounce 'oh look, the proxy data is wrong', whilst conveniently omitting to add that climatologists know it's wrong, which is why it isn't directly used.

What we do find is that the instrumental and reconstructed temperature records are remarkably consistent. Some methods are more accurate than others, and the further back in time you go, the lower the level of confidence. For example, isotopic analysis of multi-cellular organisms from half a billion years ago can only provide an average global temperature to within an accuracy of 1°C. That might sound fairly accurate, but compared with what we can do for a thousand or a million years ago it's way off the mark.

No single data set can provide accurate reconstructed temps any further back than 800,000 years, but by taking an average of several data sets and repeating the same research, a greater degree of accuracy can be achieved.
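As a rough illustration of why averaging helps (a toy Python sketch; the numbers and record types are invented, not real data sets):

```python
import numpy as np

# Hypothetical example: three independent reconstructions of the same span
# of years (values invented; stand-ins for, say, tree rings, ice cores and
# sediment cores covering the same period).
recon_a = np.array([0.10, 0.00, -0.15, -0.20, 0.05])
recon_b = np.array([0.20, -0.05, -0.10, -0.30, 0.10])
recon_c = np.array([0.05, 0.05, -0.20, -0.25, 0.00])

# If the errors in the individual records are roughly independent, their
# average ("composite") has a smaller random error than any single record,
# shrinking roughly as 1/sqrt(N).
composite = np.mean([recon_a, recon_b, recon_c], axis=0)
print(np.round(composite, 3))
```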

A good way to gauge the accuracy of a method is to reconstruct recent temperatures and then compare the results with the instrumental record for that same period. The methodology can then be tweaked if need be.
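A minimal sketch of that calibration check, with invented numbers standing in for the overlap period between a reconstruction and the instrumental record:

```python
import numpy as np

# Toy annual anomalies over a shared calibration window; in practice these
# would be real reconstructed and instrumental series for the same years.
recon = np.array([0.10, -0.05, 0.20, 0.35, 0.15, 0.40])   # reconstruction
instr = np.array([0.12,  0.00, 0.18, 0.30, 0.20, 0.45])   # instrumental

bias = np.mean(recon - instr)                  # systematic offset
rmse = np.sqrt(np.mean((recon - instr) ** 2))  # typical size of the error
corr = np.corrcoef(recon, instr)[0, 1]         # shared year-to-year variability
print(f"bias={bias:+.2f} C  rmse={rmse:.2f} C  r={corr:.2f}")
```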

Another useful test is to use several approaches to reconstruct temperatures and compare the results: the closer they are, the greater the accuracy. If one set of results deviates significantly from the mean, that calls for an investigation.
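Roughly, that cross-check amounts to something like this (toy values only; real comparisons use far more careful statistics):

```python
import numpy as np

# Invented toy values: the same quantity (say, a decadal mean anomaly)
# estimated by five different reconstruction methods.
estimates = np.array([-0.32, -0.28, -0.35, -0.30, -0.75])

# Compare each method against the mean of the *other* methods, so a single
# outlier does not drag the benchmark toward itself.
for i, est in enumerate(estimates):
    others = np.delete(estimates, i).mean()
    print(f"method {i}: {est:+.2f}  (others' mean {others:+.2f}, "
          f"difference {est - others:+.2f})")
```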

Climatology is a young science and advances are continually being made. Most data sets are routinely revisited: as advances are made, the data is re-examined and homogeneity adjustments are applied. It's not an attempt to 'eliminate divergences' but an overall improvement to the accuracy of the data.

One example of this that you may be aware of: until recently, the hottest year on record for the US was believed to be 1998. After homogeneity adjustments were made to the GISTemp record, the new record holder is 1934. (If you read in the media that this had anything to do with the Y2K bug, ignore it; I think the media decided that was the case because the methodology changed in 2000 and there was a discrepancy between the pre- and post-2000 data.)
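For what a homogeneity adjustment looks like in principle, here's a deliberately simplified sketch; real adjustments are much more involved, and the change point below is just an assumed example:

```python
import numpy as np

# Toy station series with an artificial +0.5 C step after an (assumed)
# instrument or site change at "year" 50.
temps = np.concatenate([np.full(50, 10.0), np.full(50, 10.5)])
breakpoint = 50  # assumed, documented change point

# Estimate the step from the means either side of the break and remove it
# from the later segment, so the series is consistent with the earlier one.
step = temps[breakpoint:].mean() - temps[:breakpoint].mean()
adjusted = temps.copy()
adjusted[breakpoint:] -= step
print(f"estimated step: {step:+.2f} C")
```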

As the science of climatology improves and technological advances are made, there will undoubtedly be further revisions to the data. Such revisions will be small; the changes being applied now are on the order of four decimal places.

- - - - - - - - -

RE: YOUR ADDED DETAILS

Thanks Marc for the added link; I can access the abstract from there. I couldn't access the AGU site before as I'm at home. I'm now hooked up to the office computer and attempting to download from there, but it's not playing ball. I've FTP'd the data sets but not the written article and everything that goes with it.

First impression is one of noise... but why? There appear to be considerable mismatches, with individual readings out by perhaps as much as 2.5°C. Not having any totals or averages makes comparison difficult; I suspect the means will be closer. A quick sum of 30 values from the first set (Tatras) is pretty close to the instrumental mean.

When averaged out, or when 10-, 20- or 30-year means are taken, it will probably be more in line with other reconstructed data and the instrumental record.
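A small sketch of that kind of block averaging (a synthetic noisy series, purely illustrative):

```python
import numpy as np

# Invented annual values standing in for a noisy single-site reconstruction:
# a weak trend plus large year-to-year noise.
rng = np.random.default_rng(0)
annual = 0.002 * np.arange(200) + rng.normal(0, 0.8, 200)

# Non-overlapping 20-year means: block-average the series so year-to-year
# noise largely cancels and the underlying trend is easier to compare
# against other reconstructions and the instrumental record.
block = 20
means_20yr = annual[: (annual.size // block) * block].reshape(-1, block).mean(axis=1)
print(np.round(means_20yr, 2))
```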

There are clear trends and the overall picture is the same as with other methods, but there's a much greater degree of variability in the readings. I'm only looking at individual sites, which doesn't help, but all the same, other reconstructed records from individual sites are generally more consistent.

Can't really comment on any divergence as all I have to work with are lists of numbers (would take too long to copy into a spreadsheet and analyse).

I don't normally work with raw tree-ring data, so maybe what appear to be anomalies are normal. I much prefer the ice core records; I'm directly involved with those and the data is reliable.

If the report ever downloads I'll come back to the question and add more details.

2007-09-19 10:36:28 · answer #1 · answered by Trevor 7 · 2 0

Well, the link doesn't work, sorry.

There could be many reasons for proxy divergence, depending on the proxy. But I suspect that what you're concerned with most is tree-ring temperature proxies. The maximum density of a tree ring has been shown to be a good proxy for overall spring-summer temperature in a given year.
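As an outline of how such a calibration works (invented density and temperature numbers, purely for illustration, not any published data set), one can fit a simple linear relation over an overlap period and then apply it to rings from years without thermometers:

```python
import numpy as np

# Toy data: maximum latewood density indices and instrumental summer
# temperatures over an overlap (calibration) period.
density = np.array([0.92, 0.95, 1.01, 0.98, 1.05, 1.10, 1.03])
temp_c  = np.array([14.1, 14.3, 14.9, 14.6, 15.2, 15.6, 15.0])

# Fit a simple linear calibration: temp ~ a * density + b ...
a, b = np.polyfit(density, temp_c, 1)

# ...then apply it to density values from years with no thermometer record.
old_density = np.array([0.90, 0.97, 1.08])
print(np.round(a * old_density + b, 2))
```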

Tree-ring proxies have broken down in the latter 20th century, and this is believed to be due to the effects of air pollution, principally ozone and its influence on UV radiation. Fortunately, we have actual thermometer records from this period, so the breakdown is not terribly significant to paleoclimate science.

http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6VF0-49G5SBP-1&_user=10&_coverDate=01%2F31%2F2004&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=46100852a2c649210b281a7bbb93ec3b

2007-09-19 08:16:30 · answer #2 · answered by Keith P 7 · 2 0

It's hard to comment without seeing your data. If you googled [your reference] plus "abstract", you could probably at least come up with an abstract for us.

But the stuff I've seen shows excellent consistency between 20th century data and proxy data. Here are several examples on one graph.

http://www.globalwarmingart.com/wiki/Image:1000_Year_Temperature_Comparison_png

I'm sure some research on your part would yield many other such examples.

EDIT - OK, I looked at the abstract. It said the proxy "slightly" underpredicted temperatures. Could you share the average amount of the underprediction?
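For reference, the "average amount of underprediction" would just be the mean difference over the overlap period, along these lines (toy numbers only, not values from the paper):

```python
import numpy as np

# Paired reconstructed and instrumental values for the same years,
# invented purely to show how an average underprediction is quantified.
recon = np.array([13.8, 14.0, 14.2, 14.1])
instr = np.array([14.0, 14.3, 14.4, 14.2])

mean_bias = np.mean(recon - instr)   # negative = proxy runs cool on average
print(f"mean bias: {mean_bias:+.2f} C")
```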

2007-09-19 09:40:28 · answer #3 · answered by Bob 7 · 2 0

