“under the Intermediate Scenario”
Tony Heller | September 24, 2024
NOAA has launched a new sea level website which is based on unsupportable claims and appeals to authority.
Met Office Must Answer Growing Doubts About Rising U.K. and Global Temperature Claims
BY CHRIS MORRISON | THE DAILY SCEPTIC | AUGUST 9, 2022
Further legitimate doubts are being raised about the scale of global warming claimed by the U.K. Met Office, following publication of a damning report into U.S. weather stations. The report found that 96% of the weather stations used by the U.S. weather service NOAA were “corrupted” by the localised effects of urbanisation. The U.S. has one of the largest temperature measuring systems in the world, and information from the stations forms an important part of the Met Office HadCRUT5 database.
Since 2013, the Met Office has boosted recent global warming by 30%, depressed past measurements and abolished the temperature pause from 1998 to 2012 – this pause is still discernible in the accurate satellite and meteorological balloon record. Using the HadCRUT5 database means the Met Office can claim continuing warming and further heat records. Anthony Watts, the author of the report, titled Corrupted Climate Stations, noted that data from the stations that have not been corrupted by faulty placement, “show a rate of warming in the United States reduced by almost half compared to all stations”. With a 96% warm-bias in U.S. temperature measurements, “it is impossible to use any statistical methods to derive an accurate climate trend for the U.S.”, added Watts. The same can, of course, be argued to apply to all global sets that use the corrupted U.S. data.
The corruption is caused by close proximity to asphalt, machinery and other heat-producing, heat-trapping, or heat-accentuating objects. “Placing temperature stations in such locations violates NOAA’s own published standards, and strongly undermines the legitimacy and magnitude of the official consensus on long-term climate warming trends in the United States,” it says.
Of course the Met Office’s own U.K. temperature measuring is subject to considerable urban heat distortions. During the recent brief heatwave (“feels like an apocalypse,” Piers Morgan), three of the four highest temperatures were recorded at airports including Heathrow, one of the least suitable sites it is possible to imagine. Interestingly, the average temperature for the U.K. last month was 16.6°C, the same as the year before and nearly identical to the 16.5°C of 1976. Given that 11 million more people live in the U.K. and urbanisation has rapidly expanded since then, last month was almost certainly cooler than the same glorious period in 1976. In addition, these averages were not far off the temperature of 16°C recorded in 1911.
Frequent upwards adjustments to HadCRUT, and an increasing disconnect with satellite and balloon records, raise legitimate questions about whether the state-funded Met Office is actually recording increasing urban heat rather than much warming of the global atmosphere. And a further question can be posed: is it just a coincidence that the data are beneficial to those arguing the climate is breaking down, and that a command-and-control Net Zero solution must be imposed in less than 30 years?
As we reported recently in the Daily Sceptic, Watts also publicised a rarely referenced dataset that NOAA started in 2015, designed to remove all urban heat distortions. Called the U.S. Climate Reference Network (USCRN), it collected data from 114 U.S. stations and was aiming for “superior accuracy and continuity in places that land use will not likely impact during the next five decades”. Over the last 17 years it found very little evidence to indicate a warming trend. In fact it showed that May 2022 was cooler than May 2005. Watts comments that the data the network produces are never mentioned in monthly or yearly climate reports published by NOAA for public consumption.
Much of the Watts report supplies details of the field trips made to NOAA stations. He supplies copious notes and photos of what was found.
One photo, taken at Fort Pierce in Florida, shows a digital measuring device (MMTS) sited next to a large building and five air conditioning units pumping out hot air. Watts, a meteorologist by profession, notes that digital devices are often placed next to buildings since installing a cable to a reading device is more difficult when roads and paths have to be crossed.
Several examples of stations where the siting could be described as “absurd” were noted in the survey. Watts gives further details:
These include a GHCN station at Lava Hot Springs, Idaho – a tourist site at which the MMTS sensor was placed into a natural hole in the ground where hot water for bathing and swimming emanates from the ground: … a station in Virginia City, Nevada – at which the MMTS was not only missing its protective cap, but also placed near asphalt, generators, and air conditioning unit exhausts. Perhaps the most absurd was a USHCN station in Colfax, California, which was recently moved due to a modernisation upgrade at the California fire station where it is located. The new station has been placed directly above a 20-foot rock wall that absorbs a massive amount of solar energy during the day, and releases it as LWIR [Long Wave Infrared] at night, with heated air rising to the sensor.
In conclusion, the report found a “slight warming trend” when examining temperatures from “unperturbed” stations, and this was similar to the satellite record compiled by the University of Alabama in Huntsville (UAH). “This warming trend, however, is approximately half the claimed rate of increase promoted by many in the climate science community,” it noted. The UAH monthly record is frequently published by the Daily Sceptic as providing the best guide to global temperature. Not only does it show clearly that temperatures paused from 1998 to 2012, but a current pause is underway, and this has lasted nearly eight years. The inconvenient data are not to everyone’s taste. Earlier this year, Google AdSense ‘demonetised’ the page providing the monthly results on the grounds of “unreliable and harmful claims”.
“The rate of warming as measured by unperturbed surface stations, USCRN and UAH does not represent a climate crisis,” says Watts. Meanwhile it is almost certain that as temperatures rise in the U.K. this week, the Met Office will be reporting from Heathrow. But its addiction to such data, shown to be “corrupted” by the Watts report, is leading to serious doubts about its ability to provide an accurate indication of U.K. and global temperatures.
Chris Morrison is the Daily Sceptic’s Environment Editor.
Top climate scientists slam global warming “so-called evidence” as “misrepresentation, exaggeration & outright lying”
By Chris Morrison | The Daily Sceptic | July 11, 2022
Two top-level American atmospheric scientists have dismissed the peer review system of current climate science literature as “a joke”. According to Emeritus Professors William Happer and Richard Lindzen, “it is pal review, not peer review”. The two men have had long distinguished careers in physics and atmospheric science. “Climate science is awash with manipulated data, which provides no reliable scientific evidence,” they state.
No reliable scientific evidence can be provided either by the Intergovernmental Panel on Climate Change (IPCC), they say, which is “government-controlled and only issues government dictated findings”. The two academics draw attention to an IPCC rule that states all summaries for policymakers are approved by governments. In their opinion, these summaries are “merely government opinions”. They refer to the recent comments on climate models by the atmospheric science professor John Christy from the University of Alabama, who says that, in his view, recent climate model predictions “fail miserably to predict reality”, making them “inappropriate” to use in predicting future climate changes.
The ‘miserable failure’ is graphically displayed below. Since the chart’s observational cut-off, global temperatures have again paused.
Particular scorn is poured on global surface temperature datasets. Happer and Lindzen draw attention to a 2017 paper by Dr. James Wallace and others that elaborated on how over the last several decades, “NASA and NOAA have been fabricating temperature data to argue that rising CO2 levels have led to the hottest year on record”. The false and manipulated data are said to be an “egregious violation of scientific method”. The Wallace authors also looked at the Met Office HadCRUT database and found all three surface datasets made large historical adjustments and removed cyclical temperature patterns. This was “totally inconsistent” with other temperature data, including satellites and meteorological balloons, they said. Readers will recall that the Daily Sceptic has reported extensively on these issues of late and has attracted a number of somewhat footling ‘fact checks’.
Happer and Lindzen summarise: “Misrepresentation, exaggeration, cherry picking or outright lying pretty much covers all the so-called evidence marshalled in support of the theory of imminent catastrophic global warming caused by fossil fuels and CO2.”
Professors Happer and Lindzen’s comments are included in a submission to the U.S. Securities and Exchange Commission, which is seeking to impose massive and onerous ‘climate change’ reporting requirements on public companies. But they form part of a wider scientific revolt by many scientists alarmed at the corruption of science to promote the command-and-control Net Zero agenda. Needless to say, these debates are largely ignored by mainstream media. Opponents of Net Zero politicised science are denounced as ‘cranks’ and ‘deniers’, labels at odds with their distinguished scientific achievements. Between them, Happer from Princeton and Lindzen from MIT have around 100 years of involvement in atmospheric science. Richard Lindzen was an early lead author for the IPCC, while William Happer was responsible for a groundbreaking invention that corrected the degrading effects of atmospheric turbulence on imaging resolution.
In their submission, Happer and Lindzen supply a basic lesson in science: “Reliable scientific theories come from validating theoretical predictions with observations, not consensus, peer review, government opinions or manipulated data”.
In the U.K., it will be interesting to see if Net Zero will feature as a major issue in the battle to find a new Prime Minister. At the moment, candidates seem to be giving it a wide berth – something that can happen with virtuous green policies when actual votes are at stake. Happer and Lindzen state firmly that “science demonstrates there is no climate-related risk caused by fossil fuels and CO2, and therefore no reliable scientific evidence supporting the proposed rule”. The rule in this case refers to the SEC climate requirement, but it could equally apply to Net Zero. Many people now accept that a rigid Net Zero policy will lead to massive falls in living standards that will disproportionately affect the poorer in society, both in the U.K. and particularly in the developing world. Contrary to the incessant attack on fossil fuels, write Happer and Lindzen, “affordable, abundant fossil fuels have given ordinary people the sort of freedom, prosperity and health that were reserved for kings in ages past”.
Such prosperity, of course, has left the building in the case of Sri Lanka, where the prospect of famine and civil breakdown faces 22 million people following (among other things) the decision of the Government to ban fertiliser in the interests of climate change and saving the planet. Such a collapse, with the President hastily fleeing the country, is likely to face any modern Net Zero society that seeks to tamper with reliable and affordable energy supply, restrict diet and try to grow enough food using ‘organic’ methods. Happer and Lindzen state that reducing CO2 and the use of fossil fuels would have “disastrous consequences” for the poor, people worldwide and future generations.
Both Happer and Lindzen have long held out against the current demonisation of atmospheric CO2, pointing out that the current 415 parts per million (ppm) is near a record low and not dangerously high. They note that 600 million years of CO2 and temperature data “contradict the theory that high levels of CO2 will cause catastrophic global warming”. Omitting unfavourable data is an egregious violation of scientific method. Facts omitted by those who argue there is a climate emergency include that CO2 levels were over 1,000 ppm for hundreds of millions of years and have been as high as over 7,000 ppm; CO2 has been declining for 180 million years from about 2,800 ppm to today’s low; and today’s low is not far above the minimum level when plants die of CO2 starvation, leading to all other life forms perishing for lack of food.
Finally, the authors note that the logarithmic influence of CO2 means its contribution to global warming is “heavily saturated”. The scientists calculate that a doubling of current CO2 levels would only reduce the heat escaping to space by about 1.1%. This suggests warming of around 1°C or less. The saturation hypothesis explains, they say, the disconnect between CO2 and temperature observed over 600 million years.
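To see how the rough numbers fit together, here is a minimal back-of-the-envelope sketch using the commonly cited simplified logarithmic forcing expression and a no-feedback (Planck) response. It is not Happer and Lindzen's own line-by-line radiative calculation; the coefficients are textbook approximations rather than figures from their submission.

```python
import math

# Back-of-the-envelope illustration of the logarithmic CO2 relationship.
# Uses the commonly cited simplified expression dF = 5.35 * ln(C/C0) W/m^2
# and a no-feedback (Planck) response of roughly 0.3 K per W/m^2. This is
# NOT the authors' line-by-line radiative calculation; all numbers are
# textbook approximations.

C0 = 415.0                # current CO2 concentration in ppm, as cited above
C = 2 * C0                # a doubling of current levels

delta_F = 5.35 * math.log(C / C0)    # radiative forcing, ~3.7 W/m^2
planck_response = 0.3                # K per W/m^2, no-feedback estimate
delta_T = planck_response * delta_F  # ~1.1 K

print(f"Forcing from a doubling of CO2: {delta_F:.2f} W/m^2")
print(f"No-feedback warming estimate:   {delta_T:.1f} K")
```

The result, roughly 1°C for a doubling before feedbacks, is consistent with the ballpark figure quoted above, though the percentage saturation figure in the text comes from the authors' own, more detailed calculation.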
The Greatest Scientific Fraud Of All Time — Part XXVIII
By Francis Menton | Manhattan Contrarian | August 26, 2021
What I refer to as the “Greatest Scientific Fraud Of All Time” is the systematic alteration of historical world temperatures to make it appear, falsely, that the most recent months and years are the “warmest ever.” The basic technique of the fraud is the artificial lowering of previously-reported data as to world temperatures in earlier years, in order to erase earlier warmth and amplify the apparent warming trend. This is the 28th post in this series. The previous post in the series appeared on October 5, 2020. To view all 27 prior posts, you can go to this composite link.
The deliverable products of the temperature fraudsters are purported charts of world temperatures derived from a thermometer-based surface record (called GHCN, or Global Historical Climate Network), generally going back to about 1880. The charts are engineered to appear in an iconic “hockey stick” shape, with relatively flat earlier years followed by a sharply rising “blade” in the most recent years.
Every few years the government (this is a joint effort of NASA and NOAA) comes out with a new version of these data. The latest version is called GHCN version 4, which began in 2018. Here is a chart from the Columbia University website (the NASA branch involved in this project, known as the Goddard Institute for Space Studies, is located on the Columbia campus in uptown Manhattan) showing a side-by-side comparison of the version 3 and version 4 GHCN data. Both show the famous hockey stick shape, although version 4 increases the recent uptick somewhat.
My October 5, 2020 post mainly summarized a piece by Tony Heller that had appeared on October 1 of that year. Heller’s piece focused specifically on alterations to the temperature record of the U.S., as opposed to the entire world. Heller provided links to earlier and later NASA/GISS data reports, clearly showing that temperatures originally reported for earlier years had subsequently been lowered to enhance the warming trend and to make the most recent years appear to be the “warmest” — in spite of the fact that if temperatures previously reported had been correct, then earlier years including 1953, 1934, and 1921 had actually been warmer than the most recent years.
Heller also noted, as I have many times, that NASA and NOAA make no secret of the fact that they are systematically altering and lowering earlier-year temperatures:
Reality is that the data alterations are no secret, and that NOAA and NASA acknowledge that they do it.
The problem is not that the alterations are a secret, but that they are opaque. You would think that it would be impossible for earlier-year temperatures to change at all, let alone that they would systematically change in a way that just happens to enhance the desired narrative of the promoters of the global warming scare. The justifications for the alterations appear to be just so much bafflegab, completely lacking in specific rationales for each change that you would think would be required — particularly given that these temperature charts are being used as a basis for a multi-trillion dollar fundamental transformation of the world energy economy.
Anyway, into this mix now comes a young Japanese woman named Kirye, who has taken up the Heller tradition of compiling and publishing instances of government alteration of the data that underlie the NASA/NOAA temperature charts. Kirye posts periodically on Heller’s website, known as RealClimateScience, and also at the NoTricksZone site. A couple of days ago (August 24) Kirye had a post at NoTricksZone titled “Adjusting To Warm, NASA Data Alterations Change Cooling To Warming In Ireland, Greece.” Adding to Heller’s work, this post goes outside the U.S. to look at two European countries that ought to have good and reliable temperature data. The post specifically focuses on the period 1988 to present, which is the period of the supposed sharp uptick in temperatures represented by the “blade” of the hockey stick in the NASA/NOAA charts above.
What Kirye finds is that in both Ireland and Greece, NASA and NOAA have altered the data to turn a cooling trend into a warming trend for the 1988-2020 period. Here is her comparison of the “unadjusted” data for Ireland with the “GHCN version 4” data currently being reported:
Kirye gives a link for these graphs to the NASA/GISS website. That is where she got the information. The NASA/GISS site has a map of the world with a little dot for each station, and if you click on any station you can get a plot courtesy of NASA that shows both the “unadjusted” and “version 4” temperature series for that station. Kirye has taken both versions straight from NASA itself. It’s just that only when you combine and present the data the way Kirye does do you realize that the bureaucrats have systematically altered the temperature trend for an entire country from down to up. Suddenly you clearly see that the entire apparent upward trend consists of unspecified “adjustments.” The same applies for both Ireland and Greece.
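For readers who want to check this sort of claim themselves, the arithmetic is simple once the two series are in hand. The sketch below uses made-up annual values standing in for a station's "unadjusted" and "GHCN v4" data; it fits an ordinary least-squares trend to each version and reports the net effect of the adjustments. It illustrates the method only, and is neither NASA's code nor Kirye's exact workflow.

```python
import numpy as np

# Illustration only: toy annual means standing in for a station's "unadjusted"
# and "GHCN v4 adjusted" series as downloaded from the GISS station pages.
# The point is the method: fit a least-squares trend to each version and compare.
rng = np.random.default_rng(1)
years = np.arange(1988, 2021)
unadjusted = 10.2 - 0.005 * (years - 1988) + rng.normal(0, 0.3, years.size)
adjusted = 10.0 + 0.015 * (years - 1988) + rng.normal(0, 0.3, years.size)

def trend_per_decade(series, t):
    """Ordinary least-squares slope, expressed in degrees per decade."""
    return np.polyfit(t, series, 1)[0] * 10

print(f"Unadjusted trend:          {trend_per_decade(unadjusted, years):+.2f} C/decade")
print(f"Adjusted (v4) trend:       {trend_per_decade(adjusted, years):+.2f} C/decade")
print(f"Net effect of adjustments: {trend_per_decade(adjusted - unadjusted, years):+.2f} C/decade")
```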
Can they even attempt to justify what they have done? At the same NASA/GISS page linked by Kirye, I find a further link saying “For details see FAQ.” Maybe I can find the answer here? So I followed that link, and another, and came to the end of my road at this document titled “FAQs on the Update to Global Historical Climatology Network–Monthly Version 3.2.0.” This document specifically relates to the version of GHCN just preceding version 4, but I have no reason to think that the basic methodology has changed. Here is an extremely revealing “FAQ” with the relevant part of its answer:
Why is the century-scale global land surface trend higher in version 3.2.0?
The PHA software is used to detect and account for historical changes in station records that are caused by station moves, new observation technologies and other changes in observation practice. These changes often cause a shift in temperature readings that do not reflect real climate changes. When a shift is detected, the PHA software adjusts temperatures in the historic record upwards or downwards to conform to newer measurement conditions. In this way, the algorithm seeks to adjust all earlier measurement eras in a station’s history to conform to the latest location and instrumentation. The correction of the coding errors greatly improved the ability of the PHA to find these kinds of historic changes. As a result, approximately twice as many change points (inhomogeneities) were detected in v3.2.0 than in v3.1.0. . . .
Study that a little bit and think about what they are saying. There can be “station moves” or “new observation technologies” that can cause a “shift in temperature readings.” Fair enough. So has anybody contacted any of the Irish stations to find out if they have had a “station move” or “new observation technology” or anything like that since 1988? Absolutely not! Instead, they have a computer algorithm detect these things — or maybe invent them. The algorithm supposedly looks for “shifts.” So suppose readings at a particular station have somehow shifted to lower temperatures. Could it be that temperatures are reading lower because it got cooler? Obviously that does not fit the narrative. Time to declare a “shift.” Now, instead of reporting the cooling trend that is coming from the thermometers, you can adjust the earlier temperatures downward to reflect “new observation technology” or some such never-specified thing.
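To make the kind of procedure the FAQ describes a little more concrete, here is a deliberately simplified sketch of automated shift detection on a toy difference series: it just looks for the split point with the largest jump in the mean. NOAA's actual Pairwise Homogenization Algorithm is far more elaborate, so treat this as an illustration of the general idea rather than the agency's method.

```python
import numpy as np

# Deliberately simplified sketch of automated shift ("change point") detection,
# of the general kind the FAQ describes. The real PHA software is far more
# elaborate; this only scans a station-minus-neighbours difference series for
# the split point with the largest jump in the mean.
def largest_shift(diff_series, min_segment=5):
    """Return (index, size) of the split with the largest jump in mean."""
    best_idx, best_jump = None, 0.0
    for i in range(min_segment, len(diff_series) - min_segment):
        jump = abs(diff_series[i:].mean() - diff_series[:i].mean())
        if jump > best_jump:
            best_idx, best_jump = i, jump
    return best_idx, best_jump

# Toy difference series with an artificial -0.5 degree step inserted.
rng = np.random.default_rng(0)
diff = rng.normal(0, 0.2, 40)
diff[26:] -= 0.5

idx, size = largest_shift(diff)
print(f"Largest apparent shift at index {idx}, size {size:.2f} deg")
# Whether such a shift reflects a station move or a genuine change in climate
# is exactly the question raised in the text; the algorithm alone cannot say.
```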
Note on Kirye’s dynamic graph that every single one of the stations in Ireland has had its trend adjusted from down to up by these computer algorithms. Did they all have station moves and/or “new observation technologies”? NASA doesn’t even pretend to have checked.
Take a look also at the “unadjusted” Irish plots on Kirye’s graph. Can you spot the supposed “shifts” that support having some computer come in and re-write the earlier temperatures to make the overall trend change from down to up?
At the end of the linked NASA document is a further link where you can supposedly get the computer code used for making what they call the “homogeneity corrections.” However, when I try that I don’t get anything I can open.
Anyway, this is what passes for “science” in the field of climatology.
Increasing Hurricane Intensity Study Fatally Flawed
By Paul Homewood | Not A Lot Of People Know That | January 30, 2021
Last May this story was being widely covered:
Stronger, deadlier and more frequent — that’s the trend scientists at the National Oceanic and Atmospheric Administration (NOAA) have seen in the past few decades, and they expect that trend to continue in the years to come, according to a new study.
Researchers at the University of Wisconsin and NOAA analyzed satellite data of tropical cyclones over the last 40 years and found category 3, 4 and 5 hurricanes were becoming increasingly common, CNN reported. Decade after decade, the likelihood of major global storms has increased, according to CNN.
“The change is about 8% per decade. In other words, during its lifetime, a hurricane is 8% more likely to be a major hurricane in this decade compared to the last decade,” James Kossin, author of the study, told CNN.
https://www.miamiherald.com/news/nation-world/national/article242827051.html
The statistician William Briggs published a rebuttal on his website this week, written by Greg Kent, which attacked the statistical basis of the Kossin study. You can read it here.
The study looks at the period 1979 to 2017, and compares 1979-1997 with 1998-2017.
Kent makes one crucial observation, without realising its true significance:
The pervasive erroneous calculations in the original paper and the invalid claim of statistical significance are not the only issues with Kossin et al. There is also reason to question whether the 10% increase in the proportion of major hurricane force winds was a global or largely regional phenomenon. Kossin et al presented results for each of the hurricane basins around the world. The data shows that the global results are driven largely by a single basin, the North Atlantic. The proportion of major wind speeds increased by 72% in the North Atlantic, far more than in any other hurricane basin. Western Pacific, which accounts for over 40% of the major hurricane force winds over the last 4 decades, showed a smaller proportion of intense storms in the later period (indicating a negative change). The other basins either showed no change at all between periods or the change was so small as to fail tests of statistical significance at traditional levels of confidence.
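Kent's point about statistical significance can be illustrated with a standard two-proportion z-test of the kind used to compare the share of major-hurricane observations in two periods. The counts in the sketch below are hypothetical placeholders, not the actual Kossin et al. basin data; the point is only how such a significance check works.

```python
import math

# Illustration of the kind of significance check Kent discusses: a two-proportion
# z-test on the share of "major hurricane" observations in two periods.
# The counts below are hypothetical placeholders, NOT Kossin et al.'s basin data.
def two_proportion_z(major_1, total_1, major_2, total_2):
    p1, p2 = major_1 / total_1, major_2 / total_2
    pooled = (major_1 + major_2) / (total_1 + total_2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_1 + 1 / total_2))
    return (p2 - p1) / se

# Hypothetical counts: 300 of 1,000 observations "major" in 1979-1997,
# 330 of 1,000 in 1998-2017.
z = two_proportion_z(300, 1000, 330, 1000)
print(f"z = {z:.2f}")  # |z| > 1.96 would be significant at the conventional 5% level
```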
There is actually a very good reason why there have been more intense hurricanes since 1998 than before – the Atlantic Multidecadal Oscillation or AMO. Here’s what NOAA have to say about the AMO:
https://www.aoml.noaa.gov/phod/faq/amo_faq.php
The AMO was in cold phase between 1979 and 1995, and has been warm ever since. So the increase in hurricane intensity has nothing whatsoever to do with “climate change”, and instead is a consequence of natural ocean cycles.
In any other field of science, peer review would have spotted this fatal flaw in Kossin’s paper, which would never have been published.
The Greatest Scientific Fraud Of All Time — Part XXVII
By Francis Menton | Manhattan Contrarian | October 5, 2020
It has been more than a year since I last added a post to this series. The previous post in the series, Part XXVI, appeared on August 20, 2019. For all of the prior twenty-six posts, go to this composite link.
There are two reasons for a new post at this time. The first is that there is some new work out from a guy named Tony Heller. The new work can be found at Heller’s website here, with a date of October 1. Heller also indicates that he intends to continue to add to and supplement this work. Heller is an independent researcher who particularly focuses on the subject of this series: alterations to past officially-reported government climate data to create an impression of warming that did not exist in the data as originally reported. Heller is quite skilled at going through reams of government climate data, and turning those data into useful graphs to demonstrate his points. However, in the past I have sometimes been frustrated with Heller’s work for not including sufficient links to enable a reader to verify that his assertions about data alteration are correct. Thankfully, in the current piece, Heller has corrected that issue, and provides the links so that you can see for yourself that the government has changed the data it previously reported in order to artificially enhance the apparent warming trend.
The second reason for a new post at this time is that President Trump has — finally! — hired two climate skeptics into positions of authority over the bureaucracy that compiles, and later alters, the climate data. On September 12, Trump named David Legates to the position of Deputy Assistant Secretary of Commerce for Observation and Prediction. And on September 21, Trump named Ryan Maue as Chief Scientist at the National Oceanic and Atmospheric Administration (NOAA). NOAA is the main bureaucracy where the principal climate data are compiled, and is a part of the Department of Commerce. (Another agency, NASA, is also involved in these efforts.) Both Legates and Maue have been known as people who refuse to accept much of official climate orthodoxy. It is completely bizarre that these appointments would only occur less than two months before the election that could turn Trump out of office, but there you go.
Heller’s October 1 piece, titled “Alterations To The US Temperature Record,” is one of the most thorough and careful that he has done on this subject. Note that the piece only deals with the temperature records of the US, not the entire world. The temperature records of the US and of the rest of the world present very different issues for researchers trying to assess the accuracy of government-reported warming trends. For the rest of the world, no contemporaneously-generated data exist for most of the surface area and for most of the time period between the late nineteenth century and now. Before the recent years, there just were no (or very few) measurement stations or instruments for vast regions like the oceans, the Southern Hemisphere, Africa, Siberia, and so forth. Therefore, for those and other areas, much of what passes for historical temperature data, particularly from about 1880 to 1960, has actually been created or interpolated after the fact by computer algorithms, which then just so happen to show the trend that the programmers and their bosses would like to see. But for the US, the situation is different. For the entire period back to the late nineteenth century, there has existed a dense network of ground thermometers to record temperatures throughout this country. Therefore, if prior reported data showed cooling trends, and now you want to report a warming trend, that necessarily requires changing prior reported data. Heller:
The US temperature is very important, because the vast majority of stations which NOAA has long-term daily temperature data for are located in the US.
So, have prior officially-reported US temperatures been altered to create and enhance warming trends? The answer is absolutely, clearly, yes. If you haven’t followed this series prior to now, you may be surprised to learn that fact. Remarkably, as Heller points out, NOAA, and its co-bureaucracy NASA, do not deny that they have altered the data, and don’t even make serious efforts to hide the fact. Heller:
Reality is that the data alterations are no secret, and that NOAA and NASA acknowledge that they do it.
It’s not that the alterations are secret, but rather that the bureaucrats make it as difficult as possible to track the alterations, to learn the basis for the alterations, and to figure out what has changed and by how much. Periodically, new versions of data sets are issued, with no detailed documentation of what has changed or on what basis. When NOAA and NASA come out with their latest breathless press release about the “hottest year ever,” and so forth, there is no mention of prior officially-reported data that would contradict the claim. Often earlier data have simply been written over as new, altered data are substituted, making it impossible to track the changes unless you happen to be fortunate enough to have captured a screenshot of the old data before it got modified.
Nevertheless, there are notable examples where the prior data continue to be accessible, and Heller has done some yeoman’s work to compile a number of damning instances. I urge you to read his whole piece, but I’ll give you here what is undoubtedly the most notable and shocking example. In 1999, then NASA/GISS head James Hansen, a noted climate alarmist, came out with a big research paper titled “GISS analysis of surface temperature change.” Heller links to this paper in his piece, and you can see from the url that it is an official NASA document. The paper was part of the then-growing climate alarm movement, and contained a collection of claims designed to scare you out of your wits about impending climate change apocalypse. Examples from the abstract:
The rate of temperature change was higher in the past 25 years than at any previous time in the period of instrumental data. The warmth of 1998 was too large and pervasive to be fully accounted for by the recent El Nino. . . .
And so on and on. But Hansen made the mistake of including in the paper a graph of the official NASA temperature data for the US from 1880 to 1999, as it existed at that time. You can find that graph as Exhibit 6 to the 1999 paper. Here it is:
What jumps off the page — and what Heller drives home with his red circles — is that 1934 is the warmest year, approximately 0.6 deg C (or one full degree F) warmer than 1998, which in fact is only the fifth warmest year on this chart, also trailing 1921, 1931, and 1953.
But today NASA has a new chart up on its website, with data through 2019, supposedly generated out of the same data base, but just a new and improved “version” of same. You can go to that link to see NASA’s full chart through 2019, and to verify that this is in fact an official NASA chart. But Heller takes the step of truncating this 2019 chart at 2000 to emphasize the comparison to NASA’s prior chart that went to 1999. Here is the 2019 NASA chart truncated to 2000:
Now 1998 is notably warmer than 1934, and for that matter, also warmer than 1921, 1931 and 1953. The earlier years in the chart have all gotten cooler and the later years all warmer. A declining trend in temperatures from the 1930s to the 1990s has been turned into a warming trend.
How did that happen? What is the basis for the alterations? You will never get that answer out of NOAA or NASA.
Go through Heller’s post to see other examples of earlier and later NASA and NOAA temperature charts, for instance for the state of Texas, or for average daily high temperatures for the full US. Somehow, in each case, cooling trends have been turned into strong warming trends, particularly from the 1930s to 1990s.
And finally, Heller’s pièce de résistance: He calculates the quantitative alterations in the data for each year, and demonstrates that the effect of the alterations is to make the temperature graph match near-perfectly to the changing level of CO2 in the atmosphere. The data have been altered to fit the hypothesis. Heller:
The implication of this is that the huge adjustments being made to the US temperature record are being made to match global warming theory, which is the exact opposite of how science should be done. The unadjusted data shows essentially no correlation between CO2 and temperature.
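In outline, the comparison Heller describes is easy to reproduce: subtract the originally reported value from the currently reported value for each year, then correlate the resulting adjustment series with atmospheric CO2. The sketch below uses made-up arrays purely to show the mechanics; it does not reproduce Heller's figures or NOAA's data.

```python
import numpy as np

# Outline of the comparison Heller describes: per-year adjustment (adjusted
# minus raw) correlated against atmospheric CO2. All arrays are made-up
# placeholders, not NOAA data or Heller's results.
rng = np.random.default_rng(2)
years = np.arange(1900, 2020)
raw = rng.normal(0.0, 0.2, years.size)                      # placeholder raw anomalies
adjustment = 0.002 * (years - 1900) + rng.normal(0, 0.05, years.size)
adjusted = raw + adjustment
co2 = 295 + 0.0081 * (years - 1900) ** 2                    # rough placeholder CO2 curve (ppm)

r = np.corrcoef(adjusted - raw, co2)[0, 1]
print(f"Correlation between net adjustments and CO2: r = {r:.2f}")
```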
So, Messrs. Legates and Maue, you now have at least a few months to blow the lid off this scandal. Really, that is all the time it should take. The American people deserve to have an honest accounting of what is going on. Now is our chance.
Hotter than the hottest thing ever
Climate Discussion Nexus | January 22, 2020
So 2019 was hotter than anything ever was hot, except 2016 which was itself the hottest thing ever. We’re all going to die! Unless we don’t because it wasn’t. As Anthony Watts observes, if you measure from the depths of the natural Little Ice Age you get an upward line. But if you take a longer perspective you get ups and downs, within which our era is not remarkable. Even worse, as Watts also shows on a graph, the most credible numbers from the United States, which has the best temperature measurements in the world, show 2019 as cooler than 2005… and 2006… and 2007, 2010, 2011, 2012, 2015, 2016, 2017 and 2018. But hey, who’s counting?
Alarmists frequently assert that they rely on science whereas “deniers” rely on oil money and slippery rhetoric. But in addition to the contradictions between reasonably complete American temperature records (that, among other things, show the number of really hot days falling over the past century) and very patchy records from most of the rest of the planet, Watts raises some very basic statistical issues that the Armageddon types do not seem eager to discuss.
For instance, Watts’ Jan. 15 post objects to suspect statistical selectivity in the findings. Particularly glaring is an inconsistent baseline for comparisons because NASA’s Goddard Institute for Space Studies (GISS) clings to the coolest available period (1951-80, though without wishing to discuss why there was a cooling from around 1920 even as the atmospheric CO2 that supposedly drives temperature increased) whereas the National Oceanic and Atmospheric Administration (NOAA), equally alarmist in its views, uses 1981-2000.
His Jan. 17 post makes another point that deserves far more attention than it usually gets. He takes aim at “a press release session that featured NOAA and NASA GISS talking about how their climate data says that the world in 2019 was the second warmest ever, and the decade of 2010-2019 was the hottest ever (by a few hundredths of a degree).” But as every competent statistician knows, results can never be more accurate than inputs. And since nobody claims to be measuring temperature in hundredths of a degree outside a laboratory, there must be a lot of people within NOAA and NASA writhing in shame at this claim.
It gets worse. As we were told in high school math, and some of us even listened, if you measure two things to one decimal place and multiply them correctly, you may very well get a number with two decimal places. Thus 0.5 times 0.5 is 0.25. And that second decimal place yields an apparent increase in precision. But it is worse than apparent: it is deceptive, unless you know the two factors are exactly right. If I give you exactly half of a buck and a half, that is, exactly 0.5 times 1.5 dollars, I give you exactly 75 cents. But if the two factors are just estimates, if I try to split the leftover doughnut and a half from the meeting evenly between us, giving you about .5 times roughly 1.5, it is fatuous to say you got exactly .75 of a doughnut which beats the measly .73 you had last week.
The right procedure in such cases is not to keep two decimal places or even one. It is to round it to a whole number to accommodate the growing uncertainty as you combine uncertainties. “I got most of a stale doughnut again” is the best way to characterize what happened.
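The underlying arithmetic is easy to demonstrate. In the generic sketch below, two quantities known only to one decimal place are multiplied, and simple first-order error propagation shows how wide the honest interval really is; this is a textbook illustration, not a calculation performed by NOAA or NASA.

```python
# Generic illustration of spurious precision: two quantities known only to
# +/- 0.05 (one decimal place) yield a product whose honest uncertainty swamps
# the second decimal place. This is textbook arithmetic, not any agency's code.
a, da = 0.5, 0.05          # "about half", known to one decimal place
b, db = 1.5, 0.05          # "a doughnut and a half", same precision

product = a * b
# First-order error propagation for a product: dp/p ~= da/a + db/b
dp = product * (da / a + db / b)

print(f"Naive product: {product:.2f}")                               # 0.75 -- looks precise
print(f"Honest range:  {product - dp:.2f} to {product + dp:.2f}")    # about 0.65 to 0.85
```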
Such spurious precision is a chronic feature of climate science as of a great many things in the modern world. Thus David Middleton mocks a publication called The Anthropocene for asserting that death will get worse due to climate change including “an additional 1,603 deaths from injuries each year in the United States”; as Middleton rightly asks, “Are they sure it’s not 1,602 or 1,604?” And since the actual piece said “Global warming of 1.5 °C could result in an additional 1,603 deaths from injuries each year in the United States, an international team of researchers reported yesterday in the journal Nature Medicine” there’s an Ossa of estimated temperature rise beneath a Pelion of “could result” medical modeling that ought to have shamed the authors into saying “about 1,500”.
When it comes to global temperature, no sane person would ever claim to have measured the temperature anywhere outside a laboratory within a few hundredths of a degree. So there is no possible way that we know the temperature of the entire Earth, most of which has no temperature stations at all, to within even a few tenths of a degree let alone a few hundredths.
Putting all this legerdemain together, if that press release that galloped around the world while the statistics were pulling on their boots was not a lie then, to borrow a phrase from Damon Runyon’s Guys and Dolls, it will do until a lie comes along.