Saturday, December 27, 2008

Global warmism and public deception

From an article by climatologist Dr. Tim Ball in the Canada Free Press:
E. R. Beadle said, “Half the work done in the world is to make things appear what they are not.” The Intergovernmental Panel on Climate Change (IPCC) does this with purpose and great effect. They built the difference between appearance and reality into their process. Unlike the procedure used elsewhere, they produce and release a summary report independently and before the actual technical report is completed. This way the summary gets maximum media attention and becomes the public understanding of what the scientists said. Climate science is made to appear what it is not. Indeed, it is not even what is in their Scientific Report.
...
This and similar statements are based on the unproven hypothesis that human-produced CO2 is causing warming and/or climate change. The evidence rests solely on the output of 18 computer climate models selected by the IPCC. There are a multitude of problems, including the fact that the models produce different results every time they are run, so an average of all the runs is used for each model. The IPCC then takes those per-model averages and averages them again for the results in its Reports.
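The two-stage averaging described above can be sketched in a few lines of Python. Everything here is illustrative: the model sensitivities, the run count, and the size of the run-to-run variability are made-up numbers for the sketch, not figures taken from the IPCC reports.

```python
import random

random.seed(0)

def run_model(sensitivity, n_runs):
    """Hypothetical stand-in for one climate model: each run adds random
    internal variability around the model's own warming estimate."""
    return [sensitivity + random.gauss(0, 0.3) for _ in range(n_runs)]

# Made-up per-model warming estimates (deg C), for illustration only
model_sensitivities = [2.1, 2.8, 3.4, 4.0]

# Stage 1: average the runs within each model
model_means = [sum(runs) / len(runs)
               for runs in (run_model(s, 10) for s in model_sensitivities)]

# Stage 2: average the per-model means into a single ensemble figure
ensemble_mean = sum(model_means) / len(model_means)
print(round(ensemble_mean, 2))
```

The point of the sketch is simply that the published number is an average of averages, two steps removed from any individual model run.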

Tim Palmer, a leading climate modeler at the European Centre for Medium-Range Weather Forecasts, said, “I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.” This comment is partly explained by the scale of the General Circulation Models (GCMs). The models are mathematical constructs that divide the world into rectangles, and the size of those rectangles is critical to what the models can do, as the IPCC AR4 acknowledges: “Computational constraints restrict the resolution that is possible in the discretized equations, and some representation of the large-scale impacts of unresolved processes is required (the parametrization problem).” (AR4, Chapter 8, p. 596.)
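To get a feel for what the rectangle size implies, here is a back-of-the-envelope calculation in Python. The 2.5-degree resolution is a hypothetical figure chosen for illustration, not a value quoted in AR4; the point is that any process smaller than one rectangle (individual clouds, convection, local terrain) cannot be resolved directly and must be parametrized.

```python
import math

EARTH_RADIUS_KM = 6371.0

def cell_size_km(lat_deg, resolution_deg):
    """Approximate east-west and north-south extent of one grid
    rectangle at a given latitude, for a square lat/lon resolution."""
    ns = math.pi * EARTH_RADIUS_KM * resolution_deg / 180.0
    ew = ns * math.cos(math.radians(lat_deg))
    return ew, ns

# A hypothetical 2.5-degree grid at 45 degrees latitude
ew, ns = cell_size_km(45.0, 2.5)
print(f"{ew:.0f} km x {ns:.0f} km")
```

At mid-latitudes a 2.5-degree rectangle is on the order of 200 km by 280 km, so everything happening inside an area that large is represented by a single set of numbers.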

The IPCC uses surface weather data, which for most of the world is too sparse in both space and time to build an accurate model. The limitations of the surface data are compounded by an almost complete lack of information above the surface. An illustration of the surface problem is the IPCC's own comment on the difficulties of modeling Arctic climates.
...
The very large area labeled “No Data” covers most of the Arctic Basin, an area of approximately 14,250,000 km2 (5,500,000 square miles). Remember, certainty about Arctic ice conditions is core to Gore’s alarmism.

In the Southern Hemisphere the IPCC identifies this problem over a vast area of the Earth’s surface. “Systematic biases have been found in most models’ simulation of the Southern Ocean. Since the Southern Ocean is important for ocean heat uptake, this results in some uncertainty in transient climate response.” (AR4. Chapter 8. p. 591.)

Atmosphere and oceans are fluids governed by non-linear rather than linear equations. Such equations exhibit unpredictability, also known as chaos, which explains why the models produce different results every time they are run. These problems, well known outside of climate science, were specifically acknowledged in the IPCC Third Assessment Report (TAR): “In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.” (TAR, p. 774.)
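A minimal illustration of this sensitivity, using the logistic map — a standard textbook example of a chaotic non-linear system, not a climate model: two starting values differing only in the sixth decimal place stay close for a few steps, then produce completely different trajectories.

```python
def logistic_trajectory(x0, r=4.0, steps=40):
    """Iterate the logistic map x -> r*x*(1-x), a classic toy
    example of chaos in a simple non-linear equation."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)
b = logistic_trajectory(0.400001)  # perturbed in the sixth decimal place

# Early on the trajectories agree; after a few dozen steps the tiny
# initial difference has grown until they bear no resemblance.
print(abs(a[1] - b[1]), abs(a[-1] - b[-1]))
```

The same qualitative behaviour — exponential growth of tiny differences in the starting state — is why repeated runs of a non-linear climate model diverge from one another.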

Article here.
