
Problems with the Climate Models September 15, 2008

Posted by honestclimate in Climate Models.

Problems with the Climate Models

By Professor Michael R. Fox

From Hawaii Reporter, September 12, 2008

Recalling that people such as Robert F. Kennedy have called climate skeptics “traitors”, that David Suzuki has called for their jailing, that the Grist website has called for Nuremberg trials for them, and that NASA’s Dr. Jim Hansen has called for them to be tried for treason, along with the habitual insults from Al Gore, it has been difficult for anyone to respectfully dissent. It has been difficult to stick to the rules of hard science, which demand evidence and replication; both require questioning, and that questioning is often met with insults and threats.

The world owes a lot to the many climate scientists who are closely studying and reviewing the claims of the global warming lobby. They are also attempting to replicate some of these findings without the traditional support of the originating authors. Ordinarily, in the world of hard-nosed science, such scrutiny and replication have been welcomed. No longer. The well-known name-calling, dismissiveness, and ad hominem attacks are regrettably now the standard level of discourse. Those who take this low road and abandon science include many laboratory directors, media editors, and Ph.D.s.

These are difficult times for traditional climate scientists who do practice good science: serious peer review, welcoming of scrutiny, replication, and the sharing of data. Thanks to the global warm-mongers and their indentured Ph.D.s, the integrity of the entire world of science is being diminished, and with it a loss of trust and respect.

Among the giants challenging the global warming dogma has been Christopher Monckton. He has been a strong international leader, spokesman, and expert in unraveling the complexities of the man-made warming hypothesis.

The greatest drivers behind the hypothesis have not been the actual evidence, but computer models. Relative to the largely unknown complexities of the climate, these models are still primitive and incapable of replicating the climate data measured from observations. If a hypothesis cannot explain actual evidence and climate observations, it is wrong and needs to be modified or abandoned.

In a recent exchange with an expert modeler and believer in global warming, Monckton responded in incredible detail, identifying many of the problems found in the computer models themselves. Monckton is impressively expert in the minutiae of computer modeling, a skill which applies directly to the analysis of the computer climate models. He has performed a detailed analysis of the IPCC’s hypothesis of global warming and identified a long list of failings. These deserve much wider distribution, along with an understanding of their serious implications. They, together with literature references, can be found here: http://tinyurl.com/6edjzo

Monckton is not alone in his concerns about computer modeling. Tens of thousands of scientists and engineers who have taken basic mathematics know the problems and complexities of modeling even simple situations. This author has met a fellow scientist (a bit nerdy, admittedly) who carried in his wallet a long, multi-variable, multi-term equation describing the outline of his wife’s face. The modeling problem is delightfully defined by atmospheric physicist Dr. James Peden, who recently said, “Climate modeling is not science, it is computerized Tinkertoys, with which one can construct any outcome he chooses.” And for my nerdy modeler above, it’s easy to change his wallet equation if he gets a new wife!
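To see Peden’s point in miniature, here is a small illustrative sketch. It is not taken from Peden or Monckton, and every number in it is arbitrary; it simply shows how a “model” with as many adjustable parameters as data points can be tuned to fit a record that contains nothing but noise, while saying nothing useful about the future.

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
years = np.arange(16.0)                    # a short 16-"year" record
record = rng.normal(0.0, 0.2, size=16)     # pure random noise: no trend, no signal

# A "model" with as many tunable coefficients as data points
model = Polynomial.fit(years, record, deg=15)

rms = np.sqrt(np.mean((model(years) - record) ** 2))
print(f"in-sample RMS error: {rms:.2e}")   # essentially zero: a 'perfect' hindcast

# Yet the same tuned model is useless outside the window it was fitted to
print(model(np.arange(16.0, 21.0)))        # rapidly diverging 'projections'
```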

Monckton’s analyses are summarized in a number of points below, which are devastating to the hypothesis and computer modeling. These have profound implications for policy makers and the energy and economic future of our country. We’d better learn these:

Point 1: There are… serial, serious failures of the computer models of climate

…the computer models upon which the UN’s climate panel unwisely founds its entire case have failed and failed and failed again to predict major events in the real climate.

a. The models have not projected the current multidecadal stasis in “global warming”:

b. no rise in temperatures since 1998; falling temperatures since late 2001; temperatures not expected to set a new record until 2015 (Keenlyside et al., 2008).

c. nor (until trained ex post facto) did they predict the fall in surface temperature (TS) from 1940-1975;

d. nor 50 years’ cooling in Antarctica (Doran et al., 2002) and the Arctic (Soon, 2005);

e. nor the absence of ocean warming since 2003 (Lyman et al., 2006; Gouretski & Koltermann, 2007);

f. nor the behavior of the great ocean oscillations (Lindzen, 2007),

g. nor the magnitude nor duration of multi-century events such as the Mediaeval Warm Period or the Little Ice Age;

h. nor the decline since 2000 in atmospheric methane concentration (IPCC, 2007);

i. nor the active 2004 hurricane season;

j. nor the inactive subsequent seasons;

k. nor the UK flooding of 2007 (the Met Office had forecast a summer of prolonged droughts only six weeks previously);

l. nor the solar Grand Maximum of the past 70 years, during which the Sun was more active, for longer, than at almost any similar period in the past 11,400 years (Hathaway, 2004; Solanki et al., 2005);

m. nor the consequent surface “global warming” on Mars, Jupiter, Neptune’s largest moon, and even distant Pluto;

n. nor the eerily-continuing 2006 solar minimum;

o. nor the consequent, precipitate decline of ~0.8 °C in surface temperature from January 2007 to May 2008 that has canceled out almost all of the observed warming of the 20th century.

As Monckton states, the computer models are demonstrable failures.

Point 2: The IPCC’s method of evaluating climate sensitivity is inadequate and error-laden

Monckton showed that the IPCC’s method of evaluating climate sensitivity can be reproduced by nothing more complicated than a few equations which, if the IPCC’s values for certain key parameters are input to them, generate the IPCC’s central estimate of climate sensitivity to a high precision. Nowhere else has this method been so clearly or concisely expounded before.

And, once the IPCC’s method is clearly seen for what it is, it is at once apparent that it suffers from a series of major defects that render it useless for its purpose. The laboratory experiments that form the basis for estimates of forcings do not translate easily to the real atmosphere, so the IPCC’s claimed “Levels of Scientific Understanding” for the forcings are exaggerated. Its estimates of the feedbacks that account for two-thirds of total forcing are subject to enormous uncertainties not fairly reflected in the tight error bars it assigns to them. The feedback sum is unreasonably close to the point of instability in the Bode feedback equation (important in the study of circuit [and climate] feedbacks), which has in any event been used incorrectly for amplification in a chaotic system when it was designed only for systems whose initial state was linear. The IPCC’s value for the no-feedbacks climate sensitivity parameter is the highest in the mainstream literature and is inconsistent with the value derivable from the 2001 report. The values of this and other parameters are not explicitly stated. And so on.
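For readers who want to see what “nothing more complicated than a few equations” looks like, here is a minimal sketch of the three-parameter feedback framework summarized above: a forcing, a no-feedbacks sensitivity parameter, and a Bode-style feedback multiplier, whose product is the climate sensitivity. The structure follows the outline given here; the numbers are illustrative assumptions chosen to land near the IPCC’s central estimate, not Monckton’s exact figures, which are in the paper linked earlier.

```python
# A minimal sketch, assuming the three-parameter feedback framework described
# above. All numerical values are illustrative assumptions, not Monckton's or
# the IPCC's exact figures.

def climate_sensitivity(delta_f2x, kappa, feedback_sum):
    """Equilibrium warming for a CO2 doubling.

    delta_f2x    -- radiative forcing from doubled CO2, W/m^2 (assumed value)
    kappa        -- no-feedbacks sensitivity parameter, K per W/m^2 (assumed)
    feedback_sum -- sum of temperature feedbacks, W/m^2 per K (assumed)
    """
    bode_multiplier = 1.0 / (1.0 - kappa * feedback_sum)   # Bode feedback equation
    return delta_f2x * kappa * bode_multiplier

print(f"{climate_sensitivity(3.7, 0.31, 2.1):.2f} C")   # about 3.3 C
```

Note how the multiplier blows up as kappa * feedback_sum approaches 1; that is the “point of instability” in the Bode equation referred to above.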

Point 3: The IPCC’s value for climate sensitivity depends upon only four scientific papers

Climate sensitivity is the central – properly speaking, the only – question in the debate about the extent to which “global warming” will happen. Monckton’s presentation of the IPCC’s method of calculating how much the world will warm in response to a doubling of CO2 concentration shows that the IPCC’s values for the three key parameters whose product is climate sensitivity are taken not from 2,500 papers in the literature but from just four papers. Had a wider, more representative selection of papers been relied upon, a far lower climate sensitivity would have resulted.

Point 4: Uncertainty in evaluating climate sensitivity is far greater than the IPCC admits

The IPCC baselessly states that it is 90% sure we (humans) caused most of the observed warming of the past half-century (or, more particularly, the warming in the 23 years between 1975 and 1998: the remaining 27 years were periods of cooling). However, the uncertainties in the evaluation of climate sensitivity are so great that any conclusion of this kind is meaningless. None of the three key parameters whose product is climate sensitivity can be directly measured; attempts to infer their values by observation are thwarted by the inadequacies and uncertainties of the observations depended upon; and, in short, the IPCC’s conclusions as to climate sensitivity are little better than guesswork.
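To illustrate the scale of that uncertainty, the same simple sketch from above can be swept over a range of assumed feedback sums. The range below is arbitrary and chosen purely for illustration; it shows how a modest spread in one unmeasurable parameter yields sensitivities from well under 1 °C to about 4 °C.

```python
# Continuing the illustrative sketch: none of these parameters can be measured
# directly, so the feedback sum is swept over an arbitrary assumed range.
delta_f2x, kappa = 3.7, 0.31      # assumed forcing (W/m^2) and no-feedbacks parameter (K per W/m^2)

for feedback_sum in (-3.0, -1.0, 0.0, 1.0, 2.0, 2.3):   # W/m^2 per K, assumed range
    sensitivity = delta_f2x * kappa / (1.0 - kappa * feedback_sum)
    print(f"feedback sum {feedback_sum:+.1f}  ->  sensitivity {sensitivity:.2f} C")
```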

Point 5: The published literature can be used to demonstrate lower climate sensitivity

The second part of Monckton’s paper examines the literature on climate sensitivity. A surprisingly small proportion of all papers on climate change consider this central question. The vast majority concentrate on assuming that the IPCC’s climate-sensitivity estimate is right and then using it to predict consequences (though, as Schulte, 2008, has shown, none find that the consequences are likely to be catastrophic). Monckton demonstrates, using papers from the literature, that it is at least as plausible to find a climate sensitivity of less than 0.6 °C as it is to find the IPCC’s 3.3 °C (a factor of more than five; such a large uncertainty does not inspire confidence).

Point 6: Even if climate sensitivity is high, adaptation is more cost-effective than mitigation

Monckton concluded as follows: “Even if temperature had risen above natural variability, the recent solar Grand Maximum may have been chiefly responsible. Even if the sun were not chiefly to blame for the past half-century’s warming, the IPCC has not demonstrated that, since CO2 occupies only one ten-thousandth part more of the atmosphere than it did in 1750, it has contributed more than a small fraction of the warming.”
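As a quick check of the “one ten-thousandth part” figure: using commonly cited round numbers of roughly 280 ppm of CO2 in 1750 and roughly 385 ppm around 2008 (approximate values, not taken from Monckton’s paper), the increase is about 100 parts per million of the atmosphere, i.e. roughly one part in ten thousand.

```python
# A quick arithmetic check of the "one ten-thousandth part" figure, using
# commonly cited round numbers (assumed, not taken from Monckton's paper).
pre_industrial_ppm = 280    # approximate CO2 concentration in 1750
recent_ppm = 385            # approximate CO2 concentration around 2008
increase = (recent_ppm - pre_industrial_ppm) / 1_000_000   # fraction of the atmosphere
print(f"{increase:.6f}")    # about 0.0001, i.e. roughly one part in ten thousand
```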

Monckton’s analysis here is a major contribution to understanding a difficult subject. He has broken through the dense modeling processes, not to mention the ad hominem attacks, in such a way that many more people can understand the weaknesses of the hypothesis.

It is time to break the relationship between energy policy and computer forecasting. The models are not the sources of reliable climate information so badly needed to formulate rational energy policy without the threat of economic suicide. The economic and energy future of our nation should not rest so completely on such primitive modeling.

It is well past time for policy makers, educators, and the media to demand evidence instead of scare stories. Glossy documentaries won’t do.

As Dennis Avery, co-author of the book “Unstoppable Global Warming”, said recently: “We look forward to a full-scale exploration of the science. We have heard quite enough from the computers.”

Michael R. Fox, Ph.D., a science and energy reporter for Hawaii Reporter and a science analyst for the Grassroot Institute of Hawaii, is retired and now lives in Eastern Washington. He has nearly 40 years’ experience in the energy field. He has also taught chemistry and energy at the university level. His interest in the communication of science has led to several communications awards, hundreds of speeches, and many appearances on television and talk shows.

http://www.hawaiireporter.com/story.aspx?bcb0b0a8-86dc-4f0d-acce-dec9605c9b7a


Comments»

1. Peter Taylor - September 25, 2008

I am also wrestling with this question of modelling, as an environmental scientist with a long history of critical review of modelling and prediction in other areas of pollution control, and I am in discussion with several of the key modellers who support the global warming consensus. It is not cut-and-dried, for sure. They point out that the models expect variability, and the recent fall is not that much greater than that which followed the 1998 and 2002 El Ninos. The test will be whether there is a rebound and whether the current solar minimum is extended and can be seen to have an influence (noting that the modellers don’t regard these solar factors as significant drivers). My main criticism is that the models cannot replicate the major oceanic oscillations, and hence it would not be possible to allocate a percentage influence of natural/man-made for the last 30 years, when three such cycles have been positive. My own very simplistic calculations give greenhouse gases 20% of the driving force as a maximum, with industrial CO2 thus contributing about 11% of the signal (cutting that, even by 80%, will have no effect). My reasoning can be found at http://www.ethos-uk.com/downloads/climate
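For what it is worth, the arithmetic behind the commenter’s figures seems to run along these lines. The 55% split of the greenhouse component attributed to industrial CO2 is an assumption introduced here only to make the 20%-to-11% step concrete; it is not stated in the comment.

```python
# Rough arithmetic consistent with the figures in the comment above. The 55%
# split attributed to industrial CO2 is an assumption introduced here purely
# to make the 20% -> 11% step concrete; it is not stated in the comment.
ghg_share_of_forcing = 0.20          # commenter's maximum share for all greenhouse gases
co2_share_of_ghg = 0.55              # assumed fraction of that due to industrial CO2
co2_share_of_signal = ghg_share_of_forcing * co2_share_of_ghg
print(f"industrial CO2 share of the signal: {co2_share_of_signal:.0%}")           # about 11%

emissions_cut = 0.80                 # the 80% cut mentioned in the comment
print(f"share of the signal removed by the cut: {co2_share_of_signal * emissions_cut:.0%}")  # about 9%
```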

2. Steve M. - October 8, 2008

Peter,
1) If a modeler is pro-AGW, I would expect they would produce a pro-AGW prediction. I don’t trust too much of “hey I went looking for this and found it.”

2) 30 year cycles match the PDO cycles. 30 years of warming due to a warm PDO, and now we’ve switched to a cool PDO…steady to declining temperatures.

I think that with the Sun appearing to go into a minimum of some kind, coinciding with a cool PDO, we could see some serious cooling, but that remains to be seen.

3. RodD - December 13, 2008

The CO2-rruption of the IPCC can be seen in one defining attribute: their unwillingness to release the raw data they put into the models. One would think that if the IPCC feels they are credible, they would want to open their files. Instead, their top-secret attitude decisively destroys their credibility. Millions starve due to their mindless regulations, including carbon taxes, biofuels from corn, and forcing countries to build expensive, unproven alternative energy projects, making “sustainable planning” dirty words. Reducing CO2 levels by 80% would dismantle civilization as we know it and bring a new dark age, with likely wars which would totally destroy the ecology of the universe. Any survivors of this calamity would be a random few.

4. Guido - January 30, 2009

By means of an econometric model, I have demonstrated that IPCC-like conclusions are wrong. Natural and not human-made causes are proven to be the major culprit of global warming and, in addition, the Earth is headed for cooling in the near future. This is why peer review at Climatic Change (Springer) rejected my article. See:
http://ideas.repec.org/p/pra/mprapa/7108.html

