
Recent “Big data” project raising serious doubt over veracity of climate change models

What would happen if you fed hundreds of years of climate data into a neural network? What if you then asked it to predict how the climate would change beginning in the 1800s, before the Industrial Revolution arrived? Someone has done just that. The results are astonishing…

Climate change scientists build climate models and use them to try to predict how earth's climate will change over time. The problem is, the earth is a very complex system.

So the models have to be simplified. The scientists make assumptions in order to reduce the global system to something we can manage. It's extremely precarious work. We don't even fully understand how all the earth's sub-systems connect to form its overall weather system, so undoubtedly we're missing some things.

They may be reasonably accurate within certain bounds, but we shouldn’t be prepared to die on a hill defending their infallibility.

Using these models, and applying some fudge factors, climate change “scientists” have predicted disastrous future global warming. They point to data showing that the earth has been heating up, and that this is only the beginning. They claim we must act now to stop further warming because the models show the planet will melt, or something.

But they blame this warming on the Industrial Revolution and human action, which unleashed unprecedented compound economic growth of three percent per annum for over 200 years. That slight growth, imperceptible year to year, changed the world forever in a very short time frame.

RIVAL METHODS OF DATA ANALYSIS

This future warming is supposed to be bad for us. They'd be perfectly happy if we just shut down all our power plants, so that the 2 or 3 percent of us who survived the first 60 days of the ensuing economic meltdown could forage for food for the rest of our lives.

But there are other ways to analyze the data. Statistical analysis is one. So are neural networks, computer algorithms loosely modeled on the human brain. They can crunch huge datasets and sift almost imperceptible patterns out of them, in what is called "big data."

These algorithms can analyze, in seconds, volumes of data that human minds could never fully grasp or adequately sort. One group of scientists decided to feed one of these programs hundreds of years of historical climate data.
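To make the idea concrete, here is a toy sketch of a small neural network pulling a faint pattern out of noisy data. The data and the network settings are invented purely for illustration; this is not anyone's climate model.

# Toy illustration of "pattern sifting": a small neural network recovers a
# weak, nonlinear relationship buried under noise. Synthetic data throughout.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(5000, 2))            # 5,000 noisy observations
signal = np.sin(X[:, 0]) * np.cos(X[:, 1])        # faint hidden pattern
y = signal + 0.5 * rng.standard_normal(5000)      # drowned in noise

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(X[:4000], y[:4000])                       # learn from most of the data

# On held-out points, check how well the network tracks the hidden signal.
pred = net.predict(X[4000:])
print(f"correlation with true signal: {np.corrcoef(pred, signal[4000:])[0, 1]:.2f}")

On the held-out points the network's output tracks the hidden signal rather than the noise, which is the whole selling point of these methods.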

So, our new technical paper in GeoResJ (vol. 14, pages 36-46) will likely be ignored.  Because after applying the latest big data technique to six 2,000 year-long proxy-temperature series we cannot confirm that recent warming is anything but natural – what might have occurred anyway, even if there was no industrial revolution.

. . . .

During the past year, we’ve extended this work to estimating what global temperatures would have been during the twentieth century in the absence of human-emission of carbon dioxide.

We began by deconstructing the six-proxy series from different geographic regions – series already published in the mainstream climate science literature.  One of these, the Northern Hemisphere composite series begins in 50 AD, ends in the year 2000, and is derived from studies of pollen, lake sediments, stalagmites and boreholes.

SAME RESULTS, DIFFERENT CAUSES

Their computer model comes up with the same results as the climate change models did: one degree of warming through the year 2000. The main difference is this: the climate change models say the 1-degree increase in average global temperature is mankind's fault. But this neural network shows that the planet would have heated up anyway, even if we hadn't increased our CO2 production:

In our new paper in GeoResJ, we make the assumption that an artificial neural network – remember our big data/machine learning technique – trained on proxy temperatures up until 1830, would be able to forecast the combined effect of natural climate cycles through the twentieth century.

Using the proxy record from the Northern Hemisphere composite, decomposing this through signal analysis and then using the resulting component sine waves as input into an ANN, John Abbot and I generated forecasts for the period from 1830 to 2000.

Our results show up to 1°C of warming. The average divergence between the proxy temperature record and our ANN projection is just 0.09 degree Celsius. This suggests that even if there had been no industrial revolution and burning of fossil fuels, there would have still been warming through the twentieth century – to at least 1980, and of almost 1°C. [Emphasis added.]
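For readers who want to see what that kind of pipeline looks like in practice, here is a rough sketch in Python. The "proxy" record below is a synthetic stand-in, and the choice of six Fourier components, the network size, and every other setting are my own illustrative guesses, not the setup published in GeoResJ.

# Sketch of the general approach described above: decompose a pre-1830
# temperature series into dominant sine components, feed those components to
# a small neural network, and extrapolate past 1830. Synthetic data only;
# the settings are illustrative guesses, not the authors' published method.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
years = np.arange(50, 2001)                       # 50 AD through 2000 AD
# Synthetic "proxy" record: a few slow cycles plus noise (hypothetical).
proxy = (0.4 * np.sin(2 * np.pi * years / 1000)
         + 0.3 * np.sin(2 * np.pi * years / 210)
         + 0.2 * np.sin(2 * np.pi * years / 60)
         + 0.05 * rng.standard_normal(years.size))

train = years <= 1830                             # fit only on pre-1830 data
t = years - years[0]

# Signal-analysis step: find the strongest Fourier frequencies in the
# training segment and evaluate those sine/cosine components over all years.
spectrum = np.fft.rfft(proxy[train] - proxy[train].mean())
freqs = np.fft.rfftfreq(train.sum(), d=1.0)
top = np.argsort(np.abs(spectrum))[-6:]           # six strongest cycles (arbitrary choice)
features = np.column_stack(
    [np.sin(2 * np.pi * freqs[k] * t) for k in top] +
    [np.cos(2 * np.pi * freqs[k] * t) for k in top])

# ANN step: learn the proxy temperature as a function of the cycle components.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(features[train], proxy[train])

# "Forecast" 1831-2000 from the extrapolated cycles alone, then measure how
# far the forecast diverges from the proxy record over that period.
forecast = net.predict(features[~train])
divergence = np.mean(np.abs(forecast - proxy[~train]))
print(f"mean divergence over 1831-2000: {divergence:.2f} degrees C")

The point of the exercise is that the network only ever sees the natural cycles extracted from the pre-1830 record, so whatever it "forecasts" for the twentieth century is, by construction, the contribution of those cycles alone.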

The climate change scientists, using their climate models, pin the blame on us. This helps them secure more government funding. The neural network model, on the other hand, examines historical climate patterns and says there's nothing we could have done to prevent the present warm period. The Middle Ages were warmer until the Little Ice Age arrived. That led to the invention of the chimney, among other things.

CONCLUSION

Claiming that the present warming is just part of a normal planetary cycle doesn't motivate governments to fund research into this problem. So the climate-change liberals and un-scientists heap odium on alternative models that contradict their own conclusions.

This will stop when the welfare state goes bankrupt. Then, when research spending becomes more restrained and focused, we’ll be able to clear away some of this noise and come closer to scientific truth.
