Bjorn Lomborg and John Christy Shred Climate Alarmism

Marlo Lewis, Jr. • June 6, 2019

The latest talking point of progressive politicians, pundits, and activists is that America cannot afford not to spend trillions of dollars to “solve the climate crisis” because global warming is an existential threat. As Sen. Bernie Sanders (I-VT) put it, “You cannot go too far on the issue of climate change. The future of the planet is at stake, OK?”

Abysmal Benefit-Cost Ratio

That is sham wisdom even if climate change were the terror Sen. Sanders imagines it to be. The resources available to public and private decision makers are finite. Resources allocated to “climate action” are no longer available to make mortgage payments, pay college tuitions, grow food, fund medical innovation, or build battleships. Prudent policymakers therefore not only consider the costs of policy proposals but also compare the different benefit-cost ratios of competing expenditures. As it happens, the benefit-cost ratios of carbon suppression policies are abysmal.

For example, just the direct expense of the electric sector portion of the Green New Deal would, conservatively estimated, cost $490.5 billion per year, or $3,845 per year per household, according to American Enterprise Institute economist Benjamin Zycher. Yet even complete elimination of U.S. greenhouse gas emissions would avert only 0.083°C to 0.173°C of global warming 70 years from now—a policy impact too small to discernibly affect weather patterns, crop yields, polar bear populations, or any other environmental condition people care about.
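
The per-household figure is simple division. Here is a minimal sketch for checking it; the household count (roughly 127.6 million) is an assumption introduced for illustration, not a number taken from Zycher's study.

# Back-of-envelope check of the per-household cost figure.
annual_cost = 490.5e9    # Zycher's estimated annual direct cost of the GND electric sector, in dollars
households = 127.6e6     # assumed number of U.S. households (illustrative)

cost_per_household = annual_cost / households
print(f"${cost_per_household:,.0f} per household per year")   # prints roughly $3,844, close to the cited $3,845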

The climate “benefit” over the next 10 years would be even more minuscule. Yet during that period, Zycher estimates, the annual economic cost of the GND electric sector program would be about $9 trillion. It is unwise to spend so much to achieve so little.

No Planetary Emergency

The doomsday interpretation of climate change is a political doctrine, not a scientific finding, as Danish economist Bjorn Lomborg shows in a recent series of tweets and University of Alabama in Huntsville atmospheric scientist John Christy explains in a new paper titled “Falsifying Climate Alarm.”

In the aforementioned tweets, Lomborg rebuts an op-ed by Nobel economist Joseph Stiglitz, who advocates spending trillions of dollars annually to combat climate change, which he calls “our World War III.” As evidence, Stiglitz claims that in recent years weather-related damages cost the U.S. economy 2 percent of GDP—a figure for which he gives no reference.

Lomborg deftly sets the record straight. Aon Benfield reinsurers estimate that during 2000-2017, weather-related damages cost the United States about $88 billion annually, or 0.48 percent of GDP per year, not 2 percent. More importantly, extreme weather is a natural feature of the Earth’s climate system. The vast majority of those damages would have occurred with or without climate change. “Does Stiglitz believe there is no bad weather without climate change?” Lomborg asks.

In the United States, hurricanes are the biggest cause of weather-related damages. Hurricanes have become more costly over the past 120 years but not because of any long-term change in the weather. Once historic losses are adjusted for increases in population, wealth, and the consumer price index, U.S. hurricane-related damages show no trend since 1900.
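
The normalization described above can be expressed as a product of adjustment ratios: scale each historical loss by the growth in prices, per-capita wealth, and population since the storm. The sketch below illustrates the idea; the function name and every input value are placeholders invented for the example, not actual historical loss data.

# Illustrative loss normalization: restate a historical nominal loss under today's
# prices, per-capita wealth, and population. All inputs are made-up placeholders.

def normalize_loss(nominal_loss, cpi_then, cpi_now,
                   wealth_pc_then, wealth_pc_now,
                   pop_then, pop_now):
    """Historical loss expressed in present-day societal conditions."""
    inflation = cpi_now / cpi_then
    wealth = wealth_pc_now / wealth_pc_then
    population = pop_now / pop_then
    return nominal_loss * inflation * wealth * population

# Hypothetical early-20th-century storm loss restated for the present day.
print(normalize_loss(100e6, cpi_then=17.5, cpi_now=250.0,
                     wealth_pc_then=6_000, wealth_pc_now=60_000,
                     pop_then=110e6, pop_now=330e6))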

The past three decades are generally agreed to be the warmest in the instrumental record. Yet during that period, damages due to all forms of extreme weather as a share of global GDP declined. In other words, despite there being many more people and lots more stuff in harm’s way, the relative economic impact of extreme weather is decreasing. It is difficult to reconcile that trend with claims that ours is an “unsustainable” civilization.

Lomborg provides an even more telling rebuttal in a previous tweet. Since the 1920s, atmospheric carbon dioxide (CO2) concentrations have increased from about 305 parts per million to more than 400 ppm, and global average temperatures have risen by about 1°C. Yet globally, the individual risk of dying from weather-related disasters has declined by 99 percent.

Stiglitz claims we cannot afford not to spend trillions to mitigate climate change because “our lives and our civilization as we know it is at stake, just as they were in World War II.” Lomborg notes that in the peer-reviewed literature, unchecked climate change is estimated to cost 2-4 percent of global GDP in 2100. That “is not the end of the world,” especially considering that, despite climate change, global per capita incomes in 2100 are expected to be 5-10 times larger than today.
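
A quick calculation shows why Lomborg pairs those two figures, using only the 2-4 percent damage estimate and the 5-10 times income growth cited above.

# If 2100 incomes are 5-10x today's and unchecked warming costs 2-4% of GDP,
# incomes net of climate damages remain several times larger than today's.
for growth in (5, 10):
    for damage in (0.02, 0.04):
        net = growth * (1 - damage)
        print(f"{growth}x income, {damage:.0%} damage -> {net:.1f}x today's income")

Even the least favorable pairing (5 times growth with 4 percent damages) leaves 2100 incomes 4.8 times today's.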

Ironically, in the “shared socio-economic pathways” (SSPs) literature, the richest SSP is the one that relies most on free markets and fossil fuels.

Source: Keywan Riahi et al. 2017. “This world [SSP5] places increasing faith in competitive markets, innovation and participatory societies to produce rapid technological progress and development of human capital as the path to sustainable development. . . . At the same time, the push for economic and social development is coupled with the exploitation of abundant fossil fuel resources and the adoption of resource and energy intensive lifestyles around the world.”

John Christy’s new paper, published by the Global Warming Policy Foundation, summarizes two of his recent peer-reviewed studies. In 2017, Christy and fellow atmospheric scientist Richard McNider examined 37.5 years of satellite temperature data for the global troposphere (bulk atmosphere). Christy and McNider factored out the warming effects of El Niño and the cooling effects of volcanic aerosol emissions. The underlying greenhouse warming trend—the dark line (e) in the figure below—is 0.095°C per decade, or about one-fourth the rate forecast by former NASA scientist James Hansen, whose congressional testimony launched the global warming movement in 1988.
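
In essence, removing the El Niño and volcanic influences amounts to regressing the temperature series on an ENSO index and a volcanic-aerosol proxy and reading the residual linear trend as the underlying warming. The sketch below demonstrates the idea on synthetic data; every series and coefficient in it is fabricated for illustration and is not Christy and McNider's actual procedure or data.

import numpy as np

# Synthetic illustration: regress out stand-in ENSO and volcanic signals,
# then read the leftover linear trend. All data below are fabricated.
rng = np.random.default_rng(0)
months = np.arange(450)                                         # ~37.5 years of monthly values
enso = np.sin(2 * np.pi * months / 48)                          # stand-in ENSO index
volcanic = np.where((months > 60) & (months < 90), -0.3, 0.0)   # stand-in aerosol cooling episode
trend_per_month = 0.095 / 120                                   # 0.095 deg C/decade in monthly units
temp = trend_per_month * months + 0.15 * enso + volcanic + rng.normal(0, 0.05, months.size)

# Ordinary least squares on [time, ENSO, volcanic, intercept]
X = np.column_stack([months, enso, volcanic, np.ones(months.size)])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
print(f"recovered underlying trend: {coef[0] * 120:.3f} deg C per decade")   # close to 0.095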

Christy and McNider estimate that when atmospheric carbon dioxide concentrations double, global warming will reach 1.1°C—a quantity called “transient climate response.” Christy comments:

This is not a very alarming number. If we perform the same calculation on the climate models, you get a figure of 2.31°C, which is significantly different. The models’ response to carbon dioxide is twice what we see in the real world. So the evidence indicates the consensus range for climate sensitivity is incorrect.
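
One way to read those two numbers side by side is as a simple ratio; the lines below are just that arithmetic, not Christy's calculation.

observed_tcr = 1.1    # deg C per CO2 doubling, satellite-based estimate quoted above
model_tcr = 2.31      # deg C per CO2 doubling, climate-model figure quoted above
print(f"model response exceeds the observed estimate by a factor of {model_tcr / observed_tcr:.1f}")   # about 2.1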

In 2018, Christy and economist Ross McKitrick set out to test the accuracy of climate models. They examined model projections for the layer of the tropical atmosphere between 30,000 and 40,000 feet, from 20°N to 20°S. In almost all models used by the U.N. Intergovernmental Panel on Climate Change (IPCC), such as the Canadian Climate Centre model shown below, the atmosphere warms fastest in that layer.

In 102 model runs, the average warming in the “hot spot” portion of the tropical atmosphere is 0.44°C per decade, or 2°C during 1979-2017. “However, the real-world warming is much lower; around one third of the model average,” Christy reports.
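
Put numerically, “around one third of the model average” implies an observed trend of roughly 0.15°C per decade in that layer.

model_trend = 0.44               # deg C/decade, average of the 102 model runs in the tropical hot spot
observed_fraction = 1 / 3        # Christy: real-world warming is about a third of the model average
print(f"implied observed trend: {model_trend * observed_fraction:.2f} deg C per decade")   # about 0.15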

Christy sums up the test results:

You can also easily see the difference in warming rates: the models are warming too fast. The exception is the Russian model, which has much lower sensitivity to carbon dioxide, and therefore gives projections for the end of the century that are far from alarming. The rest of them are already falsified, and their predictions for 2100 can’t be trusted. If an engineer built an airplane and said it could fly 600 miles and the thing ran out of fuel at 200 and crashed, he wouldn’t say ‘Hey, I was only off by a factor of three’. We don’t do that in engineering and real science. A factor of three is huge in the energy balance system. Yet that’s what we see in the climate models.

Statements like the following are increasingly common in popular media, academic journals, and political discourse: “The evidence that anthropogenic climate change is an existential threat to our way of life is incontrovertible.” Not so—not even close.

The Problem with Computer Models

Below are studies dealing with the problems of using computer models to simulate and project climate change.

Collins et al., 2018     Here there is a dynamical gap in our understanding. While we have conceptual models of how weather systems form and can predict their evolution over days to weeks, we do not have theories that can adequately explain the reasons for an extreme cold or warm, or wet or dry, winter at continental scales. More importantly, we do not have the ability to credibly predict such states. Likewise, we can build and run complex models of the Earth system, but we do not have adequate enough understanding of the processes and mechanisms to be able to quantitatively evaluate the predictions and projections they produce, or to understand why different models give different answers. … The global warming ‘hiatus’ provides an example of a climate event potentially related to inter-basin teleconnections. While decadal climate variations are expected, the magnitude of the recent event was unforeseen. A decadal period of intensified trade winds in the Pacific and cooler sea surface temperatures (SSTs) has been identified as a leading candidate mechanism for the global slowdown in warming.

Christy et al., 2018     [A]s new versions of the datasets are produced, trend magnitudes have changed markedly, for example the central estimate of the global trend of the mid-troposphere in Remote Sensing System’s increased 60% from +0.078 to +0.125°C decade−1, between consecutive versions 3.3 and 4.0 (Mears and Wentz 2016). … As an experiment, Mears et al. recalculated the RSS overall trend by simply truncating NOAA-14 data after 1999 (which reduced their long-term trend by 0.02 K decade−1). However, this does not address the problem that the trends of the entire NOAA-12 and −14 time series (i.e. pre-2000) are likely too positive and thus still affect the entire time series. Additionally, the evidence from the Australian and U.S. VIZ comparisons support the hypothesis that RSS contains extra warming (due to NOAA-12, −14 warming.) Overall then, this analysis suggests spurious warming in the central estimate trend of RSS of at least +0.04°C decade−1, which is consistent with results shown later based on other independent constructions for the tropical belt. … When examining all of the evidence presented here, i.e. the correlations, magnitude of errors and trend comparisons, the general conclusion is that UAH data tend to agree with (a) both unadjusted and adjusted IGRA radiosondes, (b) independently homogenized radiosonde datasets and (c) Reanalyses at a higher level, sometimes significantly so, than the other three [NOAA, RSS, UW]. … One key result here is that substantial evidence exists to show that the processed data from NOAA-12 and −14 (operating in the 1990s) were affected by spurious warming that impacted the four datasets, with UAH the least affected due to its unique merging process. RSS, NOAA and UW show considerably more warming in this period than UAH and more than the US VIZ and Australian radiosondes for the period in which the radiosonde instrumentation did not change. … [W]e estimate the global TMT trend is +0.10 ± 0.03°C decade−1. … The rate of observed warming since 1979 for the tropical atmospheric TMT layer, which we calculate also as +0.10 ± 0.03°C decade−1, is significantly less than the average of that generated by the IPCC AR5 climate model simulations. Because the model trends are on average highly significantly more positive and with a pattern in which their warmest feature appears in the latent-heat release region of the atmosphere, we would hypothesize that a misrepresentation of the basic model physics of the tropical hydrologic cycle (i.e. water vapour, precipitation physics and cloud feedbacks) is a likely candidate.

Abbott and Marohasy, 2018     While general circulation models are used by meteorological agencies around the world for rainfall forecasting, they do not generally perform well at forecasting medium-term rainfall, despite substantial efforts to enhance performance over many years. These are the same models used by the Intergovernmental Panel on Climate Change (IPCC) to forecast climate change over decades. Though recent studies suggest ANNs [artificial neural networks] have considerable application here, including to evaluate natural versus climate change over millennia, and also to better understand equilibrium climate sensitivity.

Guo et al., 2018    The snow‐albedo feedback is a crucial component in high‐altitude cryospheric change but is poorly quantified over the Third Pole, encompassing the Karakoram and Tibetan Plateau. … [I]t is noteworthy that the magnitude of the constrained strength is only half of the unconstrained model estimate for the Third Pole, suggesting that current climate models generally overestimate the feedback of spring snow change to temperature change based on the unmitigated scenario.

Scafetta et al., 2018     The period from 2000 to 2016 shows a modest warming trend that the advocates of the anthropogenic global warming theory have labeled as the “pause” or “hiatus.” These labels were chosen to indicate that the observed temperature standstill period results from an unforced internal fluctuation of the climate (e.g. by heat uptake of the deep ocean) that the computer climate models are claimed to occasionally reproduce without contradicting the anthropogenic global warming theory (AGWT) paradigm. In part 1 of this work, it was shown that the statistical analysis rejects such labels with a 95% confidence because the standstill period has lasted more than the 15 year period limit provided by the AGWT advocates themselves. Anyhow, the strong warming peak observed in 2015-2016, the “hottest year on record,” gave the impression that the temperature standstill stopped in 2014. Herein, the authors show that such a temperature peak is unrelated to anthropogenic forcing: it simply emerged from the natural fast fluctuations of the climate associated to the El Niño-Southern Oscillation (ENSO) phenomenon. By removing the ENSO signature, the authors show that the temperature trend from 2000 to 2016 clearly diverges from the general circulation model (GCM) simulations. Thus, the GCMs models used to support the AGWT [anthropogenic global warming theory] are very likely flawed. By contrast, the semi-empirical climate models proposed in 2011 and 2013 by Scafetta, which are based on a specific set of natural climatic oscillations believed to be astronomically induced plus a significantly reduced anthropogenic contribution, agree far better with the latest observations.

Kundzewicz et al., 2018     Climate models need to be improved before they can be effectively used for adaptation planning and design. Substantial reduction of the uncertainty range would require improvement of our understanding of processes implemented in models and using finer resolution of GCMs and RCMs. However, important uncertainties are unlikely to be eliminated or substantially reduced in near future (cf. Buytaert et al., 2010). Uncertainty in estimation of climate sensitivity (change of global mean temperature, corresponding to doubling atmospheric CO2 concentration) has not decreased considerably over last decades. Higher resolution of climate input for impact models requires downscaling (statistical or dynamic) of GCM outputs, adding further uncertainty. … [C]limate models do not currently simulate the water cycle at sufficiently fine resolution for attribution of catchment-scale hydrological impacts to anthropogenic climate change. It is expected that climate models and impact models will become better integrated in the future. … Calibration and validation of a hydrological model should be done before applying it for climate change impact assessment, to reduce the uncertainty of results. Yet, typically, global hydrological models are not calibrated and validated. … Model-based projections of climate change impact on water resources can largely differ. If this is the case, water managers cannot have confidence in an individual scenario or projection for the future. Then, no robust, quantitative, information can be delivered and adaptation procedures need to be developed which use identified projection ranges and uncertainty estimates. Moreover, there are important, nonclimatic, factors affecting future water resources. … As noted by Funtowicz and Ravetz (1990), in the past, science was assumed to provide “hard” results in quantitative form, in contrast to “soft” determinants of politics, that were interest-driven and value-laden. Yet, the traditional assumption of the certainty of scientific information is now recognized as unrealistic and counterproductive. Policy-makers have to make “hard” decisions, choosing between conflicting options (with commitments and stakes being the primary focus), using “soft” scientific information that is bound with considerable uncertainty. Uncertainty has been politicized in that policy-makers have their own agendas that can include the manipulation of uncertainty. Parties in a policy debate may invoke uncertainty in their arguments selectively, for their own advantage.

Lean, 2018     Climate change detection and attribution have proven unexpectedly challenging during the 21st century. Earth’s global surface temperature increased less rapidly from 2000 to 2015 than during the last half of the 20th century, even though greenhouse gas concentrations continued to increase. A probable explanation is the mitigation of anthropogenic warming by La Niña cooling and declining solar irradiance. Physical climate models overestimated recent global warming because they did not generate the observed phase of La Niña cooling and may also have underestimated cooling by declining solar irradiance. Ongoing scientific investigations continue to seek alternative explanations to account for the divergence of simulated and observed climate change in the early 21st century, which IPCC termed a “global warming hiatus.” … Understanding and communicating the causes of climate change in the next 20 years may be equally challenging. Predictions of the modulation of projected anthropogenic warming by natural processes have limited skill. The rapid warming at the end of 2015, for example, is not a resumption of anthropogenic warming but rather an amplification of ongoing warming by El Niño. Furthermore, emerging feedbacks and tipping points precipitated by, for example, melting summer Arctic sea ice may alter Earth’s global temperature in ways that even the most sophisticated physical climate models do not yet replicate.

Hunziker et al., 2018     About 40% of the observations are inappropriate for the calculation of monthly temperature means and precipitation sums due to data quality issues. These quality problems undetected with the standard quality control approach strongly affect climatological analyses, since they reduce the correlation coefficients of station pairs, deteriorate the performance of data homogenization methods, increase the spread of individual station trends, and significantly bias regional temperature trends. Our findings indicate that undetected data quality issues are included in important and frequently used observational datasets and hence may affect a high number of climatological studies. It is of utmost importance to apply comprehensive and adequate data quality control approaches on manned weather station records in order to avoid biased results and large uncertainties.

Roach et al., 2018     Consistent biases in Antarctic sea ice concentration simulated by climate models … The simulation of Antarctic sea ice in global climate models often does not agree with observations. [M]odels simulate too much loose, low-concentration sea ice cover throughout the year, and too little compact, high-concentration cover in the summer. [C]urrent sea ice thermodynamics contribute to the inadequate simulation of the low-concentration regime in many models.

Scanlon et al., 2018     The models underestimate the large decadal (2002–2014) trends in water storage relative to GRACE satellites, both decreasing trends related to human intervention and climate and increasing trends related primarily to climate variations. The poor agreement between models and GRACE underscores the challenges remaining for global models to capture human or climate impacts on global water storage trends. … Increasing TWSA [total water storage anomalies] trends are found primarily in nonirrigated basins, mostly in humid regions, and may be related to climate variations. Models also underestimate median GRACE increasing trends (1.6–2.1 km3/y) by up to a factor of 8 in GHWRMs [global hydrological and water resource models] (0.3–0.6 km3/y). Underestimation of GRACE-derived TWSA increasing trends is much greater for LSMs [global land surface models], with four of the five LSMs yielding opposite trends (i.e., median negative rather than positive trends) … Increasing GRACE trends are also found in surrounding basins, with most models yielding negative trends. Models greatly underestimate the increasing trends in Africa, particularly in southern Africa. … TWSA trends from GRACE in northeast Asia are generally increasing, but many models show decreasing trends, particularly in the Yenisei. … Subtracting the modeled human intervention contribution from the total land water storage contribution from GRACE results in an estimated climate-driven contribution of −0.44 to −0.38 mm/y. Therefore, the magnitude of the estimated climate contribution to GMSL [global mean sea level] is twice that of the human contribution and opposite in sign. While many previous studies emphasize the large contribution of human intervention to GMSL, it has been more than counteracted by climate-driven storage increase on land over the past decade. … GRACE-positive TWSA trends (71 km3/y) contribute negatively (−0.2 mm/y) to GMSL, slowing the rate of rise of GMSL, whereas models contribute positively to GMSL, increasing the rate of rise of GMSL.