February 6, 2020 JV

Heat Island

In the 2010 thriller “Shutter Island,” Leonardo DiCaprio’s character, Teddy Daniels, struggled to uncover the island’s hidden secret.  The film was a Scorsese classic, obscuring the truth until the very end.  Could DiCaprio now turn his skills to discovering the truth about the Urban Heat Island (UHI) effect?  Or has it been lost forever within the Climatic Research Unit’s Global Climate Dataset?  More on this later.

James Hansen et al. concluded in a 1999 report that “the effect of this [UHI] adjustment on global temperature change was found to be small, less than 0.1° C for the past century.”  However, this is not consistent with empirical evidence.

Let’s compare Hansen’s conclusion with some real-world examples. Firstly, consider Tokyo and Hachijo, an island 290 km south of the city.  Tokyo’s temperature has risen by about 2° C in the last 100 years, while that of Hachijo has risen by less than 0.5° C.
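A simple way to gauge the urban contribution is to difference the urban trend against a nearby rural reference.  A minimal sketch in Python, using the approximate century trends quoted above (illustrative round numbers, not official data):

    # Estimate the UHI component as the urban trend minus a nearby rural trend.
    # Approximate century trends quoted above; illustrative, not official data.
    tokyo_trend = 2.0      # deg C per century (urban)
    hachijo_trend = 0.5    # deg C per century (rural island reference, ~290 km away)

    uhi_estimate = tokyo_trend - hachijo_trend
    print(f"Estimated UHI contribution at Tokyo: ~{uhi_estimate:.1f} deg C per century")
    # ~1.5 deg C per century, far larger than the <0.1° C adjustment Hansen describes.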

Next, consider a prominent US-based example.  Long-term visitors to Las Vegas, Nevada, would realise just how large a city it has become.  Walking the Strip at night reinforces the fact that things have gotten hot, but why?  A thermal image offers a unique insight.  Note from the graph below that Las Vegas’ average annual maximum temperature did not rise appreciably between 1937 and 2012, yet the minimum temperature rose by about 6° C over the same period.

Las Vegas’ official 3° C rise in average daily temperature is an arithmetic construct, manufactured by the continually rising minima.  This night-time warming is symptomatic of the UHI effect, although GISTEMP assumes the impact to be negligible.  If the maximum temperature had also risen by a similar amount, it could be argued that the rise in the minima was due to the radiative greenhouse effect (RGHE).  However, that is simply not the case.  Unfortunately, the Las Vegas temperature record is typical of many urban stations carrying a significant UHI bias, and the global record remains tainted by them.
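The arithmetic behind that 3° C figure is easy to reproduce.  The daily mean is conventionally reported as (Tmax + Tmin) / 2, so warming confined to the minima lifts the mean by half that amount.  A minimal sketch in Python, using hypothetical round numbers consistent with the graph described above (not official station data):

    # Daily mean temperature is conventionally reported as (Tmax + Tmin) / 2,
    # so warming confined to the minima still raises the "average" temperature.
    # Hypothetical round numbers consistent with the graph above, not official data.
    tmax_1937, tmin_1937 = 40.0, 20.0   # deg C
    tmax_2012, tmin_2012 = 40.0, 26.0   # Tmax flat, Tmin up about 6 deg C

    mean_1937 = (tmax_1937 + tmin_1937) / 2
    mean_2012 = (tmax_2012 + tmin_2012) / 2
    print(f"Rise in daily mean: {mean_2012 - mean_1937:.1f} deg C")   # 3.0, all from the minima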

B-b-b-Benny and the Jets

One final example of how the UHI effect is corrupting the global average temperature record comes from the Syracuse AP station, located at the Syracuse airport in New York State.  In an article titled “New York’s Temperature Record Massively Altered By NOAA”, Paul Homewood explains why the New York winter of 2013-14 ranked as only the 30th coldest in history, despite news stories such as this one from the BBC:

“The winter of 2013-14 finished as one of the coldest winters in recent memory for New York State.  Snowfall across Western and North Central New York was above normal for many areas, and in some locations well above normal. … Temperatures this winter finished below normal every month, and the January through March time-frame finished at least 4 degrees below normal for the two primary climate stations of Western New York (Buffalo and Rochester).”

Despite these reports, NOAA data show Syracuse AP reading about 4 degrees warmer, relative to the six other stations in the district, than it did in the 1940s.  It can be argued that the relatively higher Syracuse AP temperatures are due to the UHI effect, as the weather station sits near a runway at an airport whose traffic volume has grown steadily over the decades.  The airport served as a flight training centre during the 1940s and is now an international airport handling 2 million passengers per year.

The data corruption at Syracuse is also evident in several other New York State districts, causing the average minima to appear higher than in previous years despite the severe cold.  As will be confirmed later in this series of articles, airports are now the dominant location for US land surface temperature (LST) weather stations.

Official records minimise the effect of UHI, assuming the additional heat is primarily a result of the RGHE.  Rather than eliminating poor-quality sites, NASA’s GISTEMP (the Goddard Institute for Space Studies official temperature dataset) has opted, with the help of algorithms, to “homogenise” the raw data, adding a variety of adjustment factors to historical figures.

The intent of “homogenisation” is to fill in missing data and “improve” poor-quality records by comparing readings from nearby weather stations to identify and “correct” gaps or other measurement errors.  This would be genuinely useful if the surrounding weather stations were accurate.  Instead, homogenisation mixes the clean records with the dirty ones, muddying the whole pond.
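To see how this can go wrong, consider a toy version of neighbour-based infilling (a simplified sketch, not NOAA’s or GISTEMP’s actual algorithm): a gap in a rural record is filled from the average of its neighbours, and a UHI-inflated neighbour drags the “repaired” rural trend upwards.

    import numpy as np

    # Toy sketch of neighbour-based infilling -- NOT NOAA's or GISTEMP's actual method.
    # A gap in a rural station's record is filled from the mean of its neighbours;
    # if one neighbour carries a UHI-inflated trend, that bias leaks into the "repair".
    rng = np.random.default_rng(42)
    years = np.arange(1950, 2021)

    rural = 0.005 * (years - 1950) + rng.normal(0, 0.1, years.size)   # ~0.5 deg C per century
    urban = 0.030 * (years - 1950) + rng.normal(0, 0.1, years.size)   # UHI-inflated trend

    target = rural.copy()
    target[-30:] = np.nan                        # pretend the rural record stops in 1990

    neighbour_mean = (rural + urban) / 2.0       # naive neighbour composite
    filled = np.where(np.isnan(target), neighbour_mean, target)

    def trend(series):
        return np.polyfit(years, series, 1)[0] * 100   # deg C per century

    print(f"True rural trend      : {trend(rural):+.2f} deg C per century")
    print(f"Infilled 'rural' trend: {trend(filled):+.2f} deg C per century")

With these assumed numbers, the infilled series shows a markedly steeper trend than the true rural record; the same mechanism can smear an airport station’s bias across an entire district.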

TO BE CONTINUED
