Models

A lot of climate science is based on "modelling". Weather is very complicated and can only be predicted by feeding a lot of data into very fast, very powerful computers. Meteorologists are much better at this than they were even 30 years ago, but they still cannot predict the weather more than a few days ahead, and even then imperfectly. That is partly because the number of factors needed to do significantly better is near the limit of what even very large, very fast computers can work out (even if the enormous amount of data needed could be entered in time), and partly because the atmosphere is chaotic: tiny errors in the starting data grow rapidly as a forecast runs forward.
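If you want to see why, here is a minimal sketch (my own, in Python; no forecaster actually runs anything this crude) of the Lorenz system, the famous 1963 toy model of atmospheric convection. Two runs that start one part in a million apart soon bear no resemblance to one another:

```python
# Illustration only: the Lorenz (1963) system, the classic toy model of
# atmospheric convection, integrated with a simple Euler step. The point
# is the chaos, not the numerics.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one small time step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)        # one starting measurement...
b = (1.000001, 1.0, 1.0)   # ...and the same, one part in a million out

for step in range(1, 3001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 500 == 0:
        print(f"t = {step * 0.01:4.1f}   difference in x: {abs(a[0] - b[0]):.6f}")

# The gap starts at 0.000001 and grows until the two "forecasts" have
# nothing in common - with a perfect model and near-perfect data.
```

If three equations behave like that, a model with millions of variables behaves no better.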

Climate is even more complicated. I have heard estimates that over five million different factors - and those are only the ones we know about - affect climate. So the "models" used by the climate change industry to generate their (usually alarming) predictions are unlikely to be any better at predicting climate than the meteorological computers are at predicting weather.
I could be wrong (and have you ever heard an advocate of man-made global warming say that?), but it seems to me very unlikely that, if the Met Office cannot produce accurate weather forecasts even a few months ahead, the Climate Modellers can be as certain as some people claim about predictions 50 or 100 years into the future.

Modelling Uncertainties

The output of a computer model depends on two basic factors: the program itself, and the data fed into it. Take the program first. The size and complexity of a program able to handle over 5,000,000 inter-related factors is (as far as I know) well beyond current knowledge and beyond any computer system yet available. So the models have to be simplified, and that makes them unreliable. Look at some of the references here and here and follow them onwards if you want more detail. I do not pretend to be an expert in these areas, but it is alleged that some of the models will produce the same outputs no matter what data are fed into them. I certainly find it most improbable that computer models built on inadequate sub-sets of the theory can be regarded as reliable predictors of climate years into the future.
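Here is a toy of my own (not taken from any real climate model) showing what dropping a term from a model can do. It runs the Lorenz system from the sketch above twice: once in full, and once with a single coupling term left out, as a modeller short of computing power might be tempted to do. The "simplified" copy does not just drift; it runs away to infinity, because the term dropped is exactly the one that keeps the real system in check:

```python
# Toy illustration only: the full Lorenz system versus a "simplified" copy
# with the x*z coupling dropped from the y equation. That term is the
# nonlinear feedback that keeps the true system bounded.

def step(state, simplified=False, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = (x * rho - y) if simplified else (x * (rho - z) - y)
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

full = simple = (1.0, 1.0, 1.0)
for n in range(1, 501):
    full = step(full)
    simple = step(simple, simplified=True)
    if n % 100 == 0:
        print(f"t = {n * 0.01:3.1f}   full x = {full[0]:8.3f}   simplified x = {simple[0]:12.3e}")
```

Real modellers simplify far more carefully than that, of course, but the moral stands: what you leave out can matter more than what you keep in.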

Most people who have used computers are familiar with GIGO - garbage in, garbage out. It is not only the sets of factors in the programs that have to be simplified to make a model programmable and useable; the data also have to be simplified so that the programs can handle them. You may well wonder whether this is likely to degrade the outputs. I know what I think.
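To make the point concrete, here is another small toy of mine (in Python again, nobody's climate code). The logistic map is about the simplest equation with weather-like sensitivity. Feed it the same starting value twice, once exact and once rounded to four decimal places, and the two runs soon part company:

```python
# Toy illustration of GIGO-by-rounding: the logistic map x -> r*x*(1 - x),
# iterated once with the "full" input and once with it rounded.
r = 3.9                     # a value of r in the map's chaotic regime

exact = 0.123456789         # the measurement as taken
rounded = round(exact, 4)   # the same measurement, "simplified" to 0.1235

for n in range(1, 41):
    exact = r * exact * (1.0 - exact)
    rounded = r * rounded * (1.0 - rounded)
    if n % 10 == 0:
        print(f"step {n:2d}:  exact input -> {exact:.6f}   rounded input -> {rounded:.6f}")

# By step 30 the two runs disagree completely: a change of 0.00004 in the
# input has destroyed any agreement in the output.
```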

Another Uncertainty

Non-technical readers can skip this bit. There is a very fundamental piece of science, well established since the 1920s, which says that certain things cannot be known exactly, even in principle. Known as the Heisenberg Uncertainty Principle, its effects are not noticeable in ordinary real-world observations at roughly human scales, but they do need to be considered when modelling anything that may be affected by the behaviour of atoms and molecules and also, paradoxically enough, by cosmic events.

I, frankly, cannot do the maths, but for those of my readers who can, here are some links.
Uncertainty Principle - From Wikipedia
A very brief account from the University of Oregon
From Hyperphysics Concepts - heavy stuff!
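For those who only want the headline, the position-and-momentum form of the principle is short enough to quote (my transcription of the standard formula given in the sources above):

\[
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
\]

Here \(\Delta x\) and \(\Delta p\) are the uncertainties in a particle's position and momentum, and \(\hbar\) is the reduced Planck constant, about \(1.05 \times 10^{-34}\) joule-seconds. It is the tininess of \(\hbar\) that makes the effect invisible at everyday scales.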
My understanding is that it means there are built-in uncertainties in the outcomes of any model, and that those uncertainties compound as the models become more complicated.
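To show what compounding uncertainty looks like in general (this is ordinary error propagation, my own toy, nothing specific to Heisenberg or to any actual climate model), here is a chain of calculation steps, each accurate on its own to about 1%, repeated a few thousand times with random errors:

```python
# Toy illustration of compounding uncertainty: a chain of n calculation
# steps, each individually good to about 1%, repeated many times to see
# how wide the spread of final answers becomes.
import random

def spread_after(n_steps, trials=2000, step_error=0.01):
    """Relative spread (standard deviation / mean) after n_steps."""
    results = []
    for _ in range(trials):
        value = 1.0
        for _ in range(n_steps):
            value *= 1.0 + random.gauss(0.0, step_error)  # each step ~1% off
        results.append(value)
    mean = sum(results) / trials
    variance = sum((r - mean) ** 2 for r in results) / trials
    return (variance ** 0.5) / mean

for n in (1, 10, 100, 1000):
    print(f"{n:4d} steps, each 1% uncertain -> final answer uncertain "
          f"by about {spread_after(n) * 100:.1f}%")
```

Each step is respectably accurate on its own; chained together, the final answer is anything but.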
I am not making any claims or assertions about how much unreliability the Uncertainty Principle introduces into alarmist Climate Change models - maybe none - but my mission is to tell as much of the truth as I can lay my hands on. It is up to my readers (if there are any) to decide how relevant it is.

Updated 14 Apr 07