
Roads and trains were shut down across the New York area Monday night and into Tuesday, and for what? It snowed in New York, but only 9.8 inches fell in Central Park after predictions of a foot and a half or more. What went wrong? Forecasters, including yours truly, decided to go all-in on one weather model: the European model (or Euro).

And the Euro was way off. Other models had this storm pegged.

Update after update, the Euro (produced by the European Centre for Medium-Range Weather Forecasts) kept predicting very high snow totals in New York. As of Monday morning’s run, the Euro was still projecting a foot and a half in the city. This consistency was too great for forecasters to ignore, especially because the Euro had been the first to jump on events such as the blizzard of 1996 and Hurricane Sandy. It also was one of the first to predict that a March 2001 storm was going to, like this one, be a bust. The Euro had a good track record.

That consistency, though, masked considerable uncertainty. The SREF (or Short-Range Ensemble Forecast), produced by the National Weather Service, collects 21 ensemble members (shown below). And Sunday night, the SREF indicated that the storm could be very different. Five of the 21 members had (at a 10-to-1 snow-to-liquid ratio) less than 10 inches of snow falling. Nine of the 21 predicted a foot or less. Only eight could have been said to support 18 or more inches of snow in New York City.

[Figure: snowfall forecasts from the 21 SREF ensemble members]

In other words, 57 percent of the SREF members Sunday night suggested the forecasts were far too gung-ho. By Monday afternoon, 11 of the 21 members were on the 10-inches-or-less train. Eight of the 21 still supported big-time snow, but they were a minority.
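To make that concrete, here is a minimal sketch (in Python) of how ensemble member counts translate into rough probabilities. The member snowfall values below are hypothetical stand-ins chosen only to reproduce the counts quoted above, not the actual Sunday-night SREF output:

```python
# Hypothetical stand-ins for the 21 SREF members' snow totals (inches,
# at a 10-to-1 snow-to-liquid ratio); chosen to match the quoted counts.
members = [4, 6, 7, 8, 9, 10, 11, 12, 12, 13, 14, 15, 16,
           18, 19, 20, 21, 22, 24, 26, 28]

n = len(members)                              # 21 members
under_10 = sum(s < 10 for s in members)       # "less than 10 inches"
foot_or_less = sum(s <= 12 for s in members)  # "a foot or less"
big_snow = sum(s >= 18 for s in members)      # "18 or more inches"

print(f"P(<10 in)  ~ {under_10 / n:.0%}")     # 5/21, about 24%
print(f"P(<=12 in) ~ {foot_or_less / n:.0%}") # 9/21, about 43%
print(f"P(>=18 in) ~ {big_snow / n:.0%}")     # 8/21, about 38%
```

Read this way, the ensemble was assigning well under a coin-flip’s chance to the blockbuster scenario.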

The SREF members were not alone in being suspicious of so much snow. In Sunday’s 7 p.m. run, all of the other major models were against the Euro.

  • The American Global Forecast System (GFS), which was recently upgraded, had only about 20 millimeters of liquid equivalent (or 8 inches of snow on a 10-to-1 ratio) falling for the storm. Although the GFS is considered inferior to the Euro by many meteorologists, the difference is probably overrated. Both models perform fairly well over the long term, as was pointed out in The New York Times this week. The GFS was showing the storm would stall too far northeast for New York to get the biggest snows. Instead, as we are seeing, those larger totals would be concentrated over Boston.
  • The GFS solution probably shouldn’t have been ignored given that it was joined by the Canadian’s global model, which had only 25 millimeters (or about 10 inches on a 10-to-1 ratio) falling as snow. The Canadian’s short-range model was slightly more pessimistic than the global. It predicted only about 20 to 25 millimeters (or 8 to 10 inches on a 10-to-1 ratio) of snow.
  • The United Kingdom’s model, which typically rates as the second-most accurate behind the Euro, was also on the little-snow train in New York. It had only 20 millimeters (or 8 inches on a 10-to-1 ratio) falling as snow.
  • Even the United States’ short-range North American Mesoscale (NAM) model was on board with smaller accumulations, though it would change its tune in later runs and agree with the Euro for a time. On Sunday night, the NAM went with about 20 millimeters of liquid equivalent, or roughly 8 inches of snow (the 10-to-1 conversion these bullets rely on is sketched just after this list).
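Here is that conversion as a minimal sketch; the 25.4 mm-per-inch factor is exact, and the 10-to-1 snow-to-liquid ratio is the rule of thumb used throughout this post (real ratios vary with temperature):

```python
def snow_inches(liquid_mm: float, ratio: float = 10.0) -> float:
    """Convert liquid-equivalent precipitation (mm) to snowfall (inches)
    at a given snow-to-liquid ratio (10-to-1 by default)."""
    MM_PER_INCH = 25.4
    return liquid_mm / MM_PER_INCH * ratio

print(round(snow_inches(20), 1))  # 7.9 -> the "about 8 inches" above
print(round(snow_inches(25), 1))  # 9.8 -> the "about 10 inches" above
```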

Put it all together, and there was plenty of evidence this storm wouldn’t be record-setting in New York. Of course, forecasters are going to miss on occasion. Forecasting weather is very difficult. Models aren’t perfect, and forecasters should be practicing meteorology and not “modelology.”

That said, there are a few lessons to be learned:

  1. I’m not sure forecasters (including amateurs like myself) did a good enough job communicating to the public that there was great uncertainty in the forecast. This has been a problem for media forecasters who have historically been too confident in predicting precipitation events. A study of TV meteorologists in Kansas City found that when they predicted with 100 percent certainty that it would rain, it didn’t one-third of the time. Forecasters typically communicate margin of error by giving a range of outcomes (10 to 12 inches of snow, for example). In this instance, I don’t think the range adequately showed the disagreement among the models. Perhaps a probabilistic forecast is better.
  2. No model is infallible. Forecasters would have been better off averaging all the model data together, even models without a stellar record (a simple blend of this storm’s guidance is sketched after this list). The Euro is king, but it’s not so good that we should ignore all other forecasts.
  3. There’s nothing wrong with changing a forecast. When the non-Euro models (except for the NAM) stayed consistent in showing about an inch or less of liquid precipitation (or 10 inches of snow on a 10-to-1 ratio) reaching New York and the Euro backed off its biggest predictions Monday afternoon, it was probably time for forecasters to change their stance. They waited too long; I’m not sure why.
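On lesson 2, here is a minimal sketch of such a blend for this storm. The non-Euro liquid-equivalent values are those cited in the bullets above; the Euro’s ~46 mm is an assumed stand-in, back-converted from its foot-and-a-half snow call:

```python
# Liquid-equivalent forecasts (mm) from Sunday evening's runs. The Euro
# value is an assumption derived from its ~18-inch call at a 10-to-1 ratio.
qpf_mm = {"Euro": 46, "GFS": 20, "Canadian global": 25,
          "Canadian short-range": 22.5, "UKMET": 20, "NAM": 20}

blend_mm = sum(qpf_mm.values()) / len(qpf_mm)
blend_snow_in = blend_mm / 25.4 * 10          # 10-to-1 snow-to-liquid ratio

print(f"Blend: ~{blend_snow_in:.0f} inches")  # ~10 inches, far from 18+
```

Even this crude equal-weight average lands close to the 9.8 inches that actually fell in Central Park.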

Meteorology deals in probabilities and uncertainty. Models, and the forecasters who use those models, aren’t going to be perfect. In this case, there was a big storm. It just so happened to be confined to eastern Long Island and southern New England. But that’ll do little to satisfy New Yorkers who expected a historic blizzard.


Note 1: Reposted from WUWT.

Note 2: “IPCC” is the Intergovernmental Panel on Climate Change, a U.N. body driving the fearmongering associated with “Catastrophic Climate Change” due to man’s activities, especially the myths concerning carbon dioxide emissions.

IPCC Scientists Knew Data and Science Inadequacies Contradicted Certainties Presented to Media, Public and Politicians, But Remained Silent

By Dr. Tim Ball, March 21, 2014.

I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. – Arthur Conan Doyle (Sherlock Holmes)

There is no more common error than to assume that, because prolonged and accurate mathematical calculations have been made, the application of the result to some fact of nature is absolutely certain. – A. N. Whitehead

The recent article by Nancy Green at WUWT is an interesting esoteric discussion about models. Realities about climate models are much more prosaic. They don’t and can’t work, because the data, the knowledge of atmospheric, oceanographic, and extraterrestrial mechanisms, and the computer capacity are all totally inadequate. Computer climate models are a waste of time and money.

Inadequacies are confirmed by the complete failure of all forecasts, predictions, projections, prognostications, or whatever they call them. It is one thing to waste time and money playing with climate models in a laboratory, where they don’t meet minimum scientific standards; it is another to use their results as the basis for public policies whose economic and social ramifications are devastating. Equally disturbing and unconscionable is the silence of scientists involved in the IPCC who know the vast difference between the scientific limitations and uncertainties and the certainties produced in the Summary for Policymakers (SPM).

IPCC scientists knew of the inadequacies from the start. Kevin Trenberth’s response to a report on inadequacies of weather data by the US National Research Council said

“It’s very clear we do not have a climate observing system…” “This may come as a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.”

This was in response to the February 3, 1999 Report that said,

“Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.”

Remember this is 11 years after Hansen’s comments of certainty to the Senate and five years after the 1995 IPCC Report. It is worse now with fewer weather stations and less data than in 1990.

Before leaked emails exposed its climate science manipulations, the Climatic Research Unit (CRU) issued a statement that said,

“GCMs are complex, three dimensional computer-based models of the atmospheric circulation. Uncertainties in our understanding of climate processes, the natural variability of the climate, and limitations of the GCMs mean that their results are not definite predictions of climate.”

Phil Jones, Director of the CRU at the time of the leaked emails and former director Tom Wigley, both IPCC members, said,

“Many of the uncertainties surrounding the causes of climate change will never be resolved because the necessary data are lacking.”

Stephen Schneider, a prominent part of the IPCC from the start, said,

“Uncertainty about feedback mechanisms is one reason why the ultimate goal of climate modeling – forecasting reliably the future of key variables such as temperature and rainfall patterns – is not realizable.”

Schneider also set the tone and raised eyebrows when he said in Discover magazine:

Scientists need to get some broader based support, to capture the public’s imagination…that, of course, entails getting loads of media coverage. So we have to offer up scary scenarios, make simplified dramatic statements, and make little mention of any doubts we may have…each of us has to decide what the right balance is between being effective and being honest.

The IPCC achieved his objective with devastating effect, because they chose effective over honest.

A major piece of evidence is the disparity between the Working Group I (WGI) (Physical Science Basis) Report, particularly the chapter on computer models, and the claims in the Summary for Policymakers (SPM) Report. Why did the scientists who participated in the WGI Report remain so silent about the disparity?

Here is the IPCC procedure:

Changes (other than grammatical or minor editorial changes) made after acceptance by the Working Group or the Panel shall be those necessary to ensure consistency with the Summary for Policymakers (SPM) or the Overview Chapter.

The Summary is written first, then the WGI report is adjusted to be consistent with it. It is like an executive publishing findings, then asking employees to produce material to justify them. The purpose is to present a completely different reality to the press and the public.

This is to ensure people, especially the media, read the SPM first. It is released well before the WGI Report, which they knew few would ever read. There is only one explanation for producing it first. David Wojick, an IPCC expert reviewer, explained:

Glaring omissions are only glaring to experts, so the “policymakers”—including the press and the public—who read the SPM will not realize they are being told only one side of a story. But the scientists who drafted the SPM know the truth, as revealed by the sometimes artful way they conceal it.

What is systematically omitted from the SPM are precisely the uncertainties and positive counter evidence that might negate the human interference theory. Instead of assessing these objections, the Summary confidently asserts just those findings that support its case. In short, this is advocacy, not assessment.

The Physical Basis of the Models

Here is a simple diagram of how the atmosphere is divided to create climate models.


Figure 1: Schematic of General Circulation Model (GCM).

The surface is covered with a grid and the atmosphere divided into layers. Computer models vary in the size of the grids and the number of layers. They claim a smaller grid provides better results. It doesn’t! If there is no data, a finer grid adds nothing. The model needs more real data for each cube, and it simply isn’t available. There are no weather stations for at least 70 percent of the surface and virtually no data above the surface. There are few records of any length anywhere; the models are built on virtually nothing. The grid is so large and crude that the models can’t include major weather features like thunderstorms, tornadoes, or even small cyclonic storm systems. The IPCC 2007 Report notes,

Despite the many improvements, numerous issues remain. Many of the important processes that determine a model’s response to changes in radiative forcing are not resolved by the model’s grid. Instead, sub-grid scale parameterizations are used to parametrize the unresolved processes, such as cloud formation and the mixing due to oceanic eddies.
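For a rough sense of scale, here is the cell count for a hypothetical grid of roughly that era’s resolution; the numbers are illustrative assumptions, not any specific model’s configuration:

```python
# Illustrative grid arithmetic, not any specific model's configuration.
lat_step = lon_step = 2.5   # degrees per grid cell (assumed)
layers = 20                 # vertical levels (assumed)

cells = int(360 / lon_step) * int(180 / lat_step) * layers
print(cells)                # 144 * 72 * 20 = 207,360 cells
```

Every one of those cells needs real observations to initialize and verify it; as the paragraph above notes, for most of the planet those observations simply don’t exist.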

O’Keefe and Kueter explain how a model works:

“The climate model is run, using standard numerical modeling techniques, by calculating the changes indicated by the model’s equations over a short increment of time—20 minutes in the most advanced GCMs—for one cell, then using the output of that cell as inputs for its neighboring cells. The process is repeated until the change in each cell around the globe has been calculated.”

Interconnections mean errors are spread and amplified. Imagine the number of calculations necessary; even at computer speed, they take a long time. The run time is a major limitation.
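A toy caricature of that stepping scheme makes the error-spreading point concrete. This is not a GCM; it is just a one-dimensional ring of cells mixed by an assumed three-point rule, showing how a single cell’s error contaminates its neighbors step by step:

```python
# A 1-D ring of "cells", each updated every step from itself and its two
# neighbours (an assumed 1/4-1/2-1/4 mixing rule, for illustration only).
N_CELLS, STEPS = 60, 50
state = [15.0] * N_CELLS   # uniform initial "temperature"
state[0] += 1.0            # seed one cell with a 1-degree error

for _ in range(STEPS):
    state = [0.25 * state[i - 1] + 0.5 * state[i] + 0.25 * state[(i + 1) % N_CELLS]
             for i in range(N_CELLS)]

contaminated = sum(abs(t - 15.0) > 1e-6 for t in state)
print(contaminated)        # dozens of cells now carry part of the error
```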

All of this takes huge amounts of computer capacity; running a full-scale GCM for a 100-year projection of future climate requires many months of time on the most advanced supercomputer. As a result, very few full-scale GCM projections are made.

A comment at Steve McIntyre’s site, Climate Audit, illustrates the problem.

Caspar Ammann said that GCMs (General Circulation Models) took about 1 day of machine time to cover 25 years. On this basis, it is obviously impossible to model the Pliocene-Pleistocene transition (say the last 2 million years) using a GCM as this would take about 219 years of computer time.
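The arithmetic behind that 219-year figure checks out:

```python
years_per_machine_day = 25    # Ammann's figure: 25 model years per machine day
simulated_years = 2_000_000   # the Pliocene-Pleistocene transition

machine_days = simulated_years / years_per_machine_day
print(machine_days / 365.25)  # ~219 years of computer time
```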

So you can only run the models if you reduce the number of variables. O’Keefe and Kueter explain.

As a result, very few full-scale GCM projections are made. Modelers have developed a variety of short cut techniques to allow them to generate more results. Since the accuracy of full GCM runs is unknown, it is not possible to estimate what impact the use of these short cuts has on the quality of model outputs.

Omission of variables allows short runs, but it also allows manipulation and moves the model further from reality. Which variables do you include? For the IPCC, only those that create the results they want. Besides, climate is constantly and widely varying, so a variable may become more or less important over time as thresholds change.

By selectively leaving out important components of the climate system, the likelihood of a human signal being identified as the cause of change is guaranteed. As William Kinninmonth, meteorologist and former head of Australia’s National Climate Centre, explains,

… current climate modeling is essentially to answer one question: how will increased atmospheric concentrations of CO2 (generated from human activity) change earth’s temperature and other climatological statistics? Neither cosmology nor vulcanology enter the equations. It should also be noted that observations related to sub-surface ocean circulation (oceanology), the prime source of internal variability, have only recently commenced on a consistent global scale. The bottom line is that IPCC’s view of climate has been through a narrow prism. It is heroic to assume that such a view is sufficient basis on which to predict future ‘climate’.

Static Climate Models in a Virtually Unknown Dynamic Atmosphere

Heroic is polite. I suggest it is deliberately wrong. Lack of data alone justifies that position; lack of knowledge about atmospheric circulation is another reason. The atmosphere is three-dimensional and dynamic, so building a computer model that even approximates reality requires far more data than exists, much greater understanding of an extremely turbulent and complex system, and computer capacity that is unavailable for the foreseeable future. As the IPCC note,

Consequently, for models to predict future climatic conditions reliably, they must simulate the current climatic state with some as yet unknown degree of fidelity. Poor model skill in simulating present climate could indicate that certain physical or dynamical processes have been misrepresented.

The history of understanding the atmosphere leaps 2,000 years from Aristotle, who knew there were three distinct climate zones, to George Hadley in the 18th century. The word climate comes from the Greek word klima, meaning slope, referring to the angle of the sun and the climate zones it creates. Aristotle’s views dominated western science until the 16th century, but it wasn’t until the 18th century that a wider, though still narrow, understanding began.

In 1735 George Hadley used the wind patterns recorded by English sailing ships to create the first 3D diagram of the circulation.


Figure 2: Hadley Cell (Northern Hemisphere)

Restricted only to the tropics, it became known as the Hadley Cell. Sadly, today we know little more than Hadley, although Willis Eschenbach has worked hard to identify its role in the transfer of heat energy. The Intergovernmental Panel on Climate Change (IPCC) illustrates the point in Chapter 8 of the 2007 Report.

The spatial resolution of the coupled ocean-atmosphere models used in the IPCC assessment is generally not high enough to resolve tropical cyclones, and especially to simulate their intensity.

The problem for climate science and modelers is that the Earth is spherical and it rotates. Revolution around the sun, with the tilt of the axis, creates the seasons, but rotation about the axis creates even bigger geophysical dynamic problems. Because of it, a simple single-cell system (Figure 3), with heated air rising at the Equator, moving to the Poles, sinking, and returning to the Equator, breaks up. The Coriolis Effect is the single biggest influence on the atmosphere caused by rotation. It dictates that anything moving across the surface appears to be deflected to the right in the Northern Hemisphere and to the left in the Southern Hemisphere. It appears that a force is pushing from the side, so people incorrectly refer to the Coriolis Force. There is no force.


Figure 3: A Simple Single Cell.

Figure 4 shows a more recent attempt to approximate what is going on.


Figure 4: A more recent model of a cross-section through the Northern Hemisphere.

Now it is the Indirect Ferrel Cell. Notice the discontinuities in the Tropopause and the Stratospheric–Tropospheric Mixing. This is important, because the IPCC doesn’t deal in its models with the critical interface between the stratosphere and a major mechanism in the upper Troposphere.

Due to the computational cost associated with the requirement of a well-resolved stratosphere, the models employed for the current assessment do not generally include the QBO.

This is just one example of model inadequacies provided by the IPCC.

What the IPCC Working Group I (The Physical Science Basis) Report Says About the Models

The following quotes (italic and inset) are under their original headings from Chapter 8 of the 2007 IPCC AR4 Report. Comments are in regular type.

8.2 Advances in Modelling

There is currently no consensus on the optimal way to divide computer resources among finer numerical grids, which allow for better simulations; greater numbers of ensemble members, which allow for better statistical estimates of uncertainty; and inclusion of a more complete set of processes (e.g., carbon feedbacks, atmospheric chemistry interactions).

Most don’t understand models or the mathematics on which they are built, a fact exploited by promoters of human-caused climate change. They are also a major part of the IPCC work not yet investigated by people who work outside climate science. Whenever outsiders investigate, as with statistics and the hockey stick, the gross and inappropriate misuses are exposed. The Wegman Report investigated the Hockey Stick fiasco, but also concluded,

We believe that there has not been a serious investigation to model the underlying process structures nor to model the present instrumented temperature record with sophisticated process models.

FAQ 8.1: How Reliable Are the Models Used to Make Projections of Future Climate Change?

Nevertheless, models still show significant errors. Although these are generally greater at smaller scales, important large-scale problems also remain. For example, deficiencies remain in the simulation of tropical precipitation, the El Niño- Southern Oscillation and the Madden-Julian Oscillation (an observed variation in tropical winds and rainfall with a time scale of 30 to 90 days).

Models continue to have significant limitations, such as in their representation of clouds, which lead to uncertainties in the magnitude and timing, as well as regional details, of predicted climate change. Nevertheless, over several decades of model development, they have consistently provided a robust and unambiguous picture of significant climate warming in response to increasing greenhouse gases.

Of course they do, because that is how they are programmed.

8.2.1.1 Numerics

In this report, various models use spectral, semi-Lagrangian, and Eulerian finite-volume and finite-difference advection schemes, although there is still no consensus on which type of scheme is best.

But how different are the results and why don’t they know which is best?

8.2.1.3 Parameterizations

The climate system includes a variety of physical processes, such as cloud processes, radiative processes and boundary-layer processes, which interact with each other on many temporal and spatial scales. Due to the limited resolutions of the models, many of these processes are not resolved adequately by the model grid and must therefore be parametrized. The differences between parametrizations are an important reason why climate model results differ.

How can parameterizations vary? The variance is evidence they are simply guessing at the conditions in each grid cell and likely choosing the values that accentuate their bias.

8.2.2.1 Numerics

Issues remain over the proper treatment of thermobaricity (nonlinear relationship of temperature, salinity and pressure to density), which means that in some isopycnic coordinate models the relative densities of, for example, Mediterranean and Antarctic Bottom Water masses are distorted. The merits of these vertical coordinate systems are still being established.

8.2.3.2 Soil Moisture Feedbacks in Climate Models

Since the TAR, there have been few assessments of the capacity of climate models to simulate observed soil moisture. Despite the tremendous effort to collect and homogenise soil moisture measurements at global scales (Robock et al., 2000), discrepancies between large-scale estimates of observed soil moisture remain. The challenge of modelling soil moisture, which naturally varies at small scales, linked to landscape characteristics, soil processes, groundwater recharge, vegetation type, etc., within climate models in a way that facilitates comparison with observed data is considerable. It is not clear how to compare climate-model simulated soil moisture with point-based or remotely sensed soil moisture. This makes assessing how well climate models simulate soil moisture, or the change in soil moisture, difficult.

Evaporation is a major transfer of energy, as latent heat, from the surface to the atmosphere. This inadequacy alone likely more than equals the change created by the human addition of CO2.

8.2.4.1 Terrestrial Cryosphere

Glaciers and ice caps, due to their relatively small scales and low likelihood of significant climate feedback at large scales, are not currently included interactively in any AOGCMs.

How big does an ice cap have to be to influence the parameterization in a grid? Greenland is an ice cap.

8.2.5 Aerosol Modelling and Atmospheric Chemistry

The global Aerosol Model Intercomparison project, AeroCom, has also been initiated in order to improve understanding of uncertainties of model estimates, and to reduce them (Kinne et al., 2003).

Interactive atmospheric chemistry components are not generally included in the models used in this report.

8.3 Evaluation of Contemporary Climate as Simulated by Coupled Global Models

Due to nonlinearities in the processes governing climate, the climate system response to perturbations depends to some extent on its basic state (Spelman and Manabe, 1984). Consequently, for models to predict future climatic conditions reliably, they must simulate the current climatic state with some as yet unknown degree of fidelity. Poor model skill in simulating present climate could indicate that certain physical or dynamical processes have been misrepresented.

They don’t even know which ones are misrepresented?

8.3.1.2 Moisture and Precipitation

For models to simulate accurately the seasonally varying pattern of precipitation, they must correctly simulate a number of processes (e.g., evapotranspiration, condensation, transport) that are difficult to evaluate at a global scale.

Precipitation forecasts (projections?) are worse than their temperature projections (forecasts).

8.3.1.3 Extratropical Storms

Our assessment is that although problems remain, climate models are improving in their simulation of extratropical cyclones.

This is their self-serving assessment. How much are they improving and from what baseline?

8.3.2 Ocean

Comparisons of the type performed here need to be made with an appreciation of the uncertainties in the historical estimates of radiative forcing and various sampling issues in the observations.

8.3.2.1 Simulation of Mean Temperature and Salinity Structure

Unfortunately, the total surface heat and water fluxes (see Supplementary Material, Figure S8.14) are not well observed.

8.3.2.2 Simulation of Circulation Features Important for Climate Response

The MOC (meridional overturning circulation) is an important component of present-day climate and many models indicate that it will change in the future (Chapter 10). Unfortunately, many aspects of this circulation are not well observed.

8.3.2.3 Summary of Oceanic Component Simulation

The temperature and salinity errors in the thermocline, while still large, have been reduced in many models.

How much reduction and why in only some models?

8.3.3 Sea Ice

The magnitude and spatial distribution of the high-latitude climate changes can be strongly affected by sea ice characteristics, but evaluation of sea ice in models is hampered by insufficient observations of some key variables (e.g., ice thickness) (see Section 4.4). Even when sea ice errors can be quantified, it is difficult to isolate their causes, which might arise from deficiencies in the representation of sea ice itself, but could also be due to flawed simulation of the atmospheric and oceanic fields at high latitudes that drive ice movement (see Sections 8.3.1, 8.3.2 and 11.3.8).

8.3.4 Land Surface

Vast areas of the land surface have little or no current data and even less historic data. These include 19 percent deserts, 20 percent mountains, 20 percent grasslands, 33 percent combined tropical and boreal forests and almost the entire Arctic and Antarctic regions.

8.3.4.1 Snow Cover

Evaluation of the land surface component in coupled models is severely limited by the lack of suitable observations.

Why? In 1971-72 George Kukla was producing estimates of varying snow cover as a factor in climate change. Satellite data is readily available for simple assessment of the changes through time.

8.3.4.2 Land Hydrology

The evaluation of the hydrological component of climate models has mainly been conducted uncoupled from AOGCMs (Bowling et al., 2003; Nijssen et al., 2003; Boone et al., 2004). This is due in part to the difficulties of evaluating runoff simulations across a range of climate models due to variations in rainfall, snowmelt and net radiation.

8.3.4.4 Carbon

Despite considerable effort since the TAR, uncertainties remain in the representation of solar radiation in climate models (Potter and Cess, 2004).

8.4.5 Atmospheric Regimes and Blocking

Blocking events are an important class of sectoral weather regimes (see Chapter 3), associated with local reversals of the mid-latitude westerlies.

There is also evidence of connections between North and South Pacific blocking and ENSO variability (e.g., Renwick, 1998; Chen and Yoon, 2002), and between North Atlantic blocks and sudden stratospheric warmings (e.g., Kodera and Chiba, 1995; Monahan et al., 2003) but these connections have not been systematically explored in AOGCMs.

Blocking was a significant phenomenon in the weather patterns as the circumpolar flow changed from zonal to meridional in 2013-14.

8.4.6 Atlantic Multi-decadal Variability

The mechanisms, however, that control the variations in the MOC are fairly different across the ensemble of AOGCMs. In most AOGCMs, the variability can be understood as a damped oceanic eigenmode that is stochastically excited by the atmosphere. In a few other AOGCMs, however, coupled interactions between the ocean and the atmosphere appear to be more important.

Translation: we don’t know.

8.4.7 El Niño-Southern Oscillation

Despite this progress, serious systematic errors in both the simulated mean climate and the natural variability persist. For example, the so-called double ITCZ problem noted by Mechoso et al. (1995; see Section 8.3.1) remains a major source of error in simulating the annual cycle in the tropics in most AOGCMs, which ultimately affects the fidelity of the simulated ENSO.

8.4.8 Madden-Julian Oscillation

The MJO (Madden and Julian, 1971) refers to the dominant mode of intra-seasonal variability in the tropical troposphere. Thus, while a model may simulate some gross characteristics of the MJO, the simulation may be deemed unsuccessful when the detailed structure of the surface fluxes is examined (e.g., Hendon, 2000).

8.4.9 Quasi-Biennial Oscillation

The Quasi-Biennial Oscillation (QBO; see Chapter 3) is a quasi-periodic wave-driven zonal mean wind reversal that dominates the low-frequency variability of the lower equatorial stratosphere (3 to 100 hPa) and affects a variety of extratropical phenomena including the strength and stability of the winter polar vortex (e.g., Baldwin et al., 2001). Due to the computational cost associated with the requirement of a well-resolved stratosphere, the models employed for the current assessment do not generally include the QBO.

8.4.10 Monsoon Variability

In short, most AOGCMs do not simulate the spatial or intra-seasonal variation of monsoon precipitation accurately.

Monsoons are defined by extreme seasonality of rainfall. They occur in many regions around the world, though most people only associate them with southern Asia. It is not clear what the IPCC mean. Regardless, these are massive systems of energy transfer from the region of energy surplus to the deficit region.

8.4.11 Shorter-Term Predictions Using Climate Models

This suggests that ongoing improvements in model formulation driven primarily by the needs of weather forecasting may lead also to more reliable climate predictions.

This appears to contradict the claim that weather and climate forecasts are different. As Norm Kalmonavitch notes,

The GCM models referred to as climate models are actually weather models, only capable of predicting weather about two weeks into the future, and as we are aware from our weather forecasts temperature predictions…

In 2008 Tim Palmer, a leading climate modeller at the European Centre for Medium-Range Weather Forecasts in Reading, England, said in New Scientist:

I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.

8.5.2 Extreme Precipitation

Sun et al. (2006) investigated the intensity of daily precipitation simulated by 18 AOGCMs, including several used in this report. They found that most of the models produce light precipitation (<10 mm day⁻¹) more often than observed, too few heavy precipitation events and too little precipitation in heavy events (>10 mm day⁻¹). The errors tend to cancel, so that the seasonal mean precipitation is fairly realistic (see Section 8.3).

Incredible: the errors cancel, and since the results appear to match reality, they must be correctly derived.

8.5.3 Tropical Cyclones

The spatial resolution of the coupled ocean-atmosphere models used in the IPCC assessment is generally not high enough to resolve tropical cyclones, and especially to simulate their intensity.

8.6.2 Interpreting the Range of Climate Sensitivity Estimates Among General Circulation Models

The climate sensitivity depends on the type of forcing agents applied to the climate system and on their geographical and vertical distributions (Allen and Ingram, 2002; Sausen et al., 2002; Joshi et al., 2003). As it is influenced by the nature and the magnitude of the feedbacks at work in the climate response, it also depends on the mean climate state (Boer and Yu, 2003). Some differences in climate sensitivity will also result simply from differences in the particular radiative forcing calculated by different radiation codes (see Sections 10.2.1 and 8.6.2.3).

Climate sensitivity has consistently declined and did so further in IPCC AR5. In fact, in the SPM for AR5 the sensitivity declined in the few weeks from the first draft to the final report.

8.6.2.2 Why Have the Model Estimates Changed Since the TAR?

The current generation of GCMs covers a range of equilibrium climate sensitivity from 2.1°C to 4.4°C (with a mean value of 3.2°C; see Table 8.2 and Box 10.2), which is quite similar to the TAR. Yet most climate models have undergone substantial developments since the TAR (probably more than between the Second Assessment Report and the TAR) that generally involve improved parametrizations of specific processes such as clouds, boundary layer or convection (see Section 8.2). In some cases, developments have also concerned numerics, dynamical cores or the coupling to new components (ocean, carbon cycle, etc.). Developing new versions of a model to improve the physical basis of parametrizations or the simulation of the current climate is at the heart of modelling group activities. The rationale for these changes is generally based upon a combination of process-level tests against observations or against cloud-resolving or large-eddy simulation models (see Section 8.2), and on the overall quality of the model simulation (see Sections 8.3 and 8.4). These developments can, and do, affect the climate sensitivity of models.

All this says is that climate models are a work in progress. However, it also acknowledges that they can only hope to improve parameterization. In reality they need more and better data, but that is not possible for current or historic data. Even if they started an adequate data collection system today it would be thirty years before it would be statistically significant.

8.6.2.3 What Explains the Current Spread in Models’ Climate Sensitivity Estimates?

The large spread in cloud radiative feedbacks leads to the conclusion that differences in cloud response are the primary source of inter-model differences in climate sensitivity (see discussion in Section 8.6.3.2.2). However, the contributions of water vapour/lapse rate and surface albedo feedbacks to sensitivity spread are non-negligible, particularly since their impact is reinforced by the mean model cloud feedback being positive and quite strong.

What does “non-negligible” mean? Is it a double negative? Apparently. Why don’t they use the term significant? They assume their inability to produce accurate results is because of clouds and water vapor. As this review shows, there are countless other factors, especially those they ignore, like the Sun. The 2001 TAR Report included a table of the forcings with a column labeled Level of Scientific Understanding (LOSU). Of the nine forcings, only two have a “high” rating (although that is their own assessment), one is “medium”, and the other six are “low”. The only difference in the 2007 AR4 Report is that the LOSU column is gone.

8.6.3.2 Clouds

Despite some advances in the understanding of the physical processes that control the cloud response to climate change and in the evaluation of some components of cloud feedbacks in current models, it is not yet possible to assess which of the model estimates of cloud feedback is the most reliable.

The cloud problem is far more complicated than this summary implies. For example, clouds function differently depending on type, thickness, percentage of water vapor, water droplets, ice crystals or snowflakes and altitude.

8.6.3.3 Cryosphere Feedbacks

A number of processes, other than surface albedo feedback, have been shown to also contribute to the polar amplification of warming in models (Alexeev, 2003, 2005; Holland and Bitz, 2003; Vavrus, 2004; Cai, 2005; Winton, 2006b). An important one is additional poleward energy transport, but contributions from local high-latitude water vapour, cloud and temperature feedbacks have also been found. The processes and their interactions are complex, however, with substantial variation between models (Winton, 2006b), and their relative importance contributing to or dampening high-latitude amplification has not yet been properly resolved.

You can’t know how much energy is transported to polar regions if you can’t determine how much is moving out of the tropics. The complete lack of data for the entire Arctic Ocean and most of the surrounding land is a major limitation.

8.6.4 How to Assess Our Relative Confidence in Feedbacks Simulated by Different Models?

A number of diagnostic tests have been proposed since the TAR (see Section 8.6.3), but few of them have been applied to a majority of the models currently in use. Moreover, it is not yet clear which tests are critical for constraining future projections. Consequently, a set of model metrics that might be used to narrow the range of plausible climate change feedbacks and climate sensitivity has yet to be developed.

The IPCC chapter on climate models appears to justify use of the models by saying they show an increase in temperature when CO2 is increased. Of course they do; that is how they’re programmed. Almost every individual component of the models has, by their admission, problems ranging from lack of data to lack of understanding of the mechanisms, and important ones are omitted because of inadequate computer capacity or priorities. The only possible conclusion is that the models were designed to prove the political position that human CO2 was a problem.

Scientists involved with producing this result knew the limitations were so severe they precluded the possibility of proving the result. This is clearly set out in their earlier comments and in the IPCC Science Report they produced. They remained silent when the SPM claimed, with high certainty, that they knew what was going on with the climate. They had to know this was wrong. They may not have known about the political agenda when they were inveigled into participating, but they had to know when the 1995 SPM was published, because Benjamin Santer exploited the SPM bias by rewriting Chapter 8 of the 1995 Report in contradiction to what the members of his chapter team had agreed. The gap widened in subsequent SPMs, but they remained silent and therefore complicit.

EWR Note: The following post is crossposted from Watts Up With That, and is available on WUWT at the link in the title below. EWR encourages you to go to WUWT and review also the commentary discussion of this post. Most of the discussion is about minor details, not about what the post says and what the post illustrates: the scare regime of “Anthropogenic Global Warming” (or “AGW”, “CAGW” or “Climate Change” as it’s now called) is OVER. Solid data now reaching out more than 17 years shows what most real scientists have known for some time: that there is no coupled effect between levels of CO2 (carbon dioxide) and global temperatures. Yes, the physics of CO2 in its pure gaseous state provides for a warming contribution, but in the context of the planet, at less than 0.04% of the earth’s atmosphere, it is not, and has never been, a determiner of global climate. Man’s contribution to that less-than-0.04%, and its effect, even doubled, is insignificant.

The AGW-Climate Change meme is, and has always been, a political construct of factions of the UN and related interests. This isn’t conspiracy theory; it is demonstrable fact supported by a wide variety of documents, many from the UN itself. I will leave the reader to conduct their own searches on this aspect – there is plenty to read. As the post below outlines, the scientific evidence is clear: there is no “Climate Change” event attributable to the impact of CO2 or man’s contribution to it. The earth is going through an interglacial period that is not as yet understood. There appear to be correlations with solar activity, and with a variety of heat exchange mechanisms involving the major heat sinks such as the oceans and the atmosphere, but the CO2 balance and man’s contribution are not part of the equations. There is no basis for extensive taxation and mitigation strategies based on falsely projected “climate change” events. It is ALL political theatre, and the question needs to be asked: who profits?

This is a long post. Some of the graphs are wider than this blog format – click on a graphic to see the whole graphic.

A Big Picture Look At “Earth’s Temperature” – Santer 17 Update

By WUWT regular “Just The Facts”

NOAA’s State of the Climate in 2008 report found that:

The simulations rule out (at the 95% level) zero trends for intervals of 15 yr or more, suggesting that an observed absence of warming of this duration is needed to create a discrepancy with the expected present-day warming rate.

In 2010 Phil Jones was asked by the BBC:

“Do you agree that from 1995 to the present there has been no statistically-significant global warming?”

Phil Jones replied:

Yes, but only just.

In 2011, the paper “Separating signal and noise in atmospheric temperature changes: The importance of timescale” by Santer et al. moved the goal posts and found that:

Because of the pronounced effect of interannual noise on decadal trends, a multi-model ensemble of anthropogenically-forced simulations displays many 10-year periods with little warming. A single decade of observational TLT data is therefore inadequate for identifying a slowly evolving anthropogenic warming signal. Our results show that temperature records of at least 17 years in length are required for identifying human effects on global-mean tropospheric temperature.

In October 2013, the Remote Sensing Systems (RSS) satellite temperature data set reached a period of 204 months/17 years for which the slope is -0.000122111 per year. For those not familiar, the RSS satellite temperature data set is similar to the University of Alabama – Huntsville (UAH) dataset that John Christy and Roy Spencer manage. Information about RSS can be found here, and the data set can be found here.
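For readers curious how such a slope is computed, here is a minimal sketch: an ordinary least-squares fit to monthly anomalies. The series below is random stand-in data, not the actual RSS TLT record:

```python
import numpy as np

rng = np.random.default_rng(0)
months = 204                                      # 17 years of monthly data
anoms = 0.25 + 0.1 * rng.standard_normal(months)  # flat series plus noise

t_years = np.arange(months) / 12.0
slope = np.polyfit(t_years, anoms, 1)[0]          # OLS trend, per year
print(f"{slope:+.6f} per year")                   # near zero for a flat series
```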

In November 2013, Dr. Robert G. Brown, Physics Department of Duke University wrote on WUWT:

This (17 years) is a non-event, just as 15 and 16 years were non-events. Non-events do not make headlines. Other non-events of the year are one of the fewest numbers of tornadoes* (especially when corrected for under-reporting in the radar-free past) in at least the recent past (if not the remote past), the lowest number of Atlantic hurricanes* since I was 2 years old (I’m 58), the continuation of the longest stretch in recorded history without a category 3 or higher hurricane making landfall in the US (in fact, I don’t recall there being a category 3 hurricane in the North Atlantic this year, although one of the ones that spun out far from land might have gotten there for a few hours). * Links added subsequently

While I must disagree with Dr. Robert G. Brown as to what one can and can’t make into a headline, I do otherwise agree wholeheartedly. Unfortunately, with mainstream media outlets like PBS running erroneous headlines like “UN Panel: ‘Extremely Likely’ Earth’s Rapid Warming Is Caused by Humans”, we are stuck reporting on average climate data. Amusingly, it has proven a quite effective method of informing the public and disproving erroneous alarmist claims and headlines, as Dr. Brown’s comment above attests.

For those not too familiar with the “Pause” in Earth’s warming, recommended reading includes:

  • “Over the past 15 years air temperatures at the Earth’s surface have been flat while greenhouse-gas emissions have continued to soar.” The Economist
  • “Global warming stopped 16 years ago, reveals Met Office report quietly released… and here is the chart to prove it.” Daily Mail
  • “Twenty-year hiatus in rising temperatures has climate scientists puzzled.” The Australian
  • “Has the rise in temperatures ‘paused’?” Guardian
  • “On Tuesday, news finally broke of a revised Met Office ‘decadal forecast’, which not only acknowledges the pause, but predicts it will continue at least until 2017.” Daily Mail
  • “RSS global satellite temperatures confirm hiatus of global warming, while the general public and mainstream press are now recognizing the AWOL truth that skeptics long ago identified… global temperatures are trending towards cooling, not accelerating higher” C3 Headlines

In terms of exactly how long the “Pause” has lasted, it depends on the data set and on what is being measured. E.g., in Werner Brozek’s recent article Statistical Significances – How Long Is “The Pause”? he showed that (a sketch of the “flat since” calculation follows the chart below):

1. For GISS, the slope is flat since September 1, 2001 or 12 years, 1 month. (goes to September 30, 2013)
2. For Hadcrut3, the slope is flat since May 1997 or 16 years, 5 months. (goes to September)
3. For a combination of GISS, Hadcrut3, UAH and RSS, the slope is flat since December 2000 or 12 years, 10 months. (goes to September)
4. For Hadcrut4, the slope is flat since December 2000 or 12 years, 10 months. (goes to September)
5. For Hadsst3, the slope is flat since November 2000 or 12 years, 11 months. (goes to September)
6. For UAH, the slope is flat since January 2005 or 8 years, 9 months. (goes to September using version 5.5)
7. For RSS, the slope is flat since November 1996 or 17 years. (goes to October)

Here’s what that looks like graphically:

Source: WoodForTrees.org – Paul Clark
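A minimal sketch of how such “flat since” dates can be derived, assuming a monthly anomaly array is available (none is bundled with this post):

```python
import numpy as np

def first_flat_start(anoms, min_len=24):
    """Return the earliest start index from which the least-squares trend
    to the end of the monthly series is non-positive, or None if every
    start still shows warming."""
    t = np.arange(len(anoms)) / 12.0
    for start in range(len(anoms) - min_len):
        if np.polyfit(t[start:], anoms[start:], 1)[0] <= 0:
            return start
    return None
```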

However, to really see the big picture on “Earth’s Temperature” we must take into account many more measurements than just surface and tropospheric temperatures. As such, the following is an overview of many of them. NASA’s Earth Observatory claims that:

“Global warming is the unusually rapid increase in Earth’s average surface temperature over the past century primarily due to the greenhouse gases released by people burning fossil fuels.”

so let us start there…

Global Surface Temperatures:

NASA’s Goddard Institute for Space Studies (GISS) Global Monthly Mean Surface Temperature Anomaly – 1996 to Present:

Source: National Aeronautics and Space Administration (NASA) Goddard Institute for Space Studies (GISS)

NOAA’s – National Climate Data Center – Annual Global Land and Ocean Temperature Anomalies:

Source: NOAA – National Climate Data Center

UK Met Office’s – Hadley Center – Climatic Research Unit (CRU) Annual Global Average Land and Ocean Temperature Anomaly;

Source: Met Office – Hadley Center

the UK Met Office – Hadley Center – Climatic Research Unit (CRU) Monthly Global Average Land Temperature;

Source: Met Office – Hadley Center

and HadCRUT4 Global, Northern and Southern Hemispheric Temperature Anomalies:

Source: University of East Anglia (UEA) – Climatic Research Unit (CRU)

The Pause is apparent in Earth’s land and surface temperature record. It is important to note that the reason the IPCC claims to be

“95% certain that humans are the “dominant cause” of global warming since the 1950s” BBC

is because prior to 1950 Anthropogenic CO2 Emissions from Fossil-Fuels were insufficient to have a significant influence on “Earth’s Temperature”, i.e. Anthropogenic CO2 Emissions from Fossil-Fuels;

Source: Carbon Dioxide Information Analysis Center

and Cumulative Anthropogenic CO2 Emissions from Fossil-Fuels:

Source: Carbon Dioxide Information Analysis Center

In May 2013, the Economist noted that:

The world added roughly 100 billion tonnes of carbon to the atmosphere between 2000 and 2010. That is about a quarter of all the CO₂ put there by humanity since 1750. And yet, as James Hansen, the head of NASA’s Goddard Institute for Space Studies, observes, “the five-year mean global temperature has been flat for a decade.”

Additionally, surface temperature records are burdened with issues of questionable siting, changes in siting, changes in equipment, changes in the number of measurement locations, modeling to fill in gaps in measurement locations, corrections to account for missing, erroneous or biased measurements, land use changes, anthropogenic waste heat and the urban heat island effect.  Thus to see the Big Picture of “Earth’s Temperature”, it also helps to look up.

Atmospheric Temperatures:

Since 1979 Earth’s “temperature” has also been measured via satellite. “The temperature measurements from space are verified by two direct and independent methods. The first involves actual in-situ measurements of the lower atmosphere made by balloon-borne observations around the world. The second uses intercalibration and comparison among identical experiments on different orbiting platforms. The result is that the satellite temperature measurements are accurate to within three one-hundredths of a degree Centigrade (0.03 C) when compared to ground-launched balloons taking measurements of the same region of the atmosphere at the same time.” NASA

Here is RSS Global Temperature Lower Troposphere (TLT) – Brightness Temperature Anomaly – 1979 to Present;

Source: Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU)

and this is the University of Alabama – Hunstville (UAH) Global Lower Atmosphere Temperature Anomalies – 1979 to Present:

Source: University of Alabama – Huntsville (UAH) – Dr. Roy Spencer

Note: Per John Christy, RSS and UAH anomalies are not comparable because they use different base periods, i.e., “RSS only uses 1979-1998 (20 years) while UAH uses the WMO standard of 1981-2010.”

The March UAH Lower Atmosphere Temperature Anomaly was 0.29 degrees C above the 30-year average, and RSS Global Lower Troposphere shows a 0.127 degrees C increase per decade.

When we look at Earth’s “canaries”, i.e. RSS Northern Polar Temperature Lower Troposphere (TLT) Brightness Temperature Anomaly;

Source: Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU)

appears to have Paused for the last 18 years and RSS Southern Polar Temperature Lower Troposphere (TLT) Brightness Temperature Anomaly;

Source: Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU)

looks like it has been on Pause for its entire record.

To this point we’ve only addressed the Lower Troposphere Temperatures; the following Temperature Anomaly plots from RSS will increase in altitude, as is illustrated here:

Here is RSS Temperature Middle Troposphere (TMT) – Brightness Temperature Anomaly – 1979 to Present;

Source: Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU)

According to Remote Sensing Systems, “For Channel (TLT) (Lower Troposphere) and Channel (TMT) (Middle Troposphere), the anomaly time series is dominated by ENSO events and slow tropospheric warming. The three primary El Niños during the past 20 years are clearly evident as peaks in the time series occurring during 1982-83, 1987-88, and 1997-98, with the most recent one being the largest.” RSS

Middle Tropospheric temperatures appear to show slow warming overlaid with the El Niño/La Niña Southern Oscillation (ENSO) cycle, including several comparatively large El Niño events. Middle Tropospheric temperatures appear to have entered The Pause with the large El Niño in 1998.

Moving higher in the atmosphere, RSS Temperature Troposphere / Stratosphere (TTS) – Brightness Temperature Anomaly – 1987 to Present;

Source: Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU)

has been in The Pause since records began in 1987, with a trend of just -0.004 °C per decade.

The 1997-98 and 2009 – 10 El Niño events are still readily apparent in the Troposphere / Stratosphere plot above, as is a spike from the 1991 eruption of Mt. Pinatubo. Note that the effect of Mt. Pinatubo is the opposite in the Lower and Middle Troposphere versus the Troposphere / Stratosphere (TTS), i.e. “Large volcanic eruptions inject sulfur gases into the stratosphere; the gases convert into submicron particles (aerosol) with an e-folding time scale of about 1 year. The climate response to large eruptions (in historical times) lasts for several (2-3) years. The aerosol cloud causes cooling at the Earth’s surface, warming in stratosphere.”
Ellen Thomas, PhD, Wesleyan University

It is interesting that, incorporating the impact of three significant surface-driven warming events, Troposphere / Stratosphere Temperatures (TTS) have been quite stable; however, there is a bit of regional variation here, e.g.:

RSS Northern Hemisphere Temperature Troposphere / Stratosphere (TTS) – Brightness Temperature Anomaly – 1987 to Present;

Source: Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU)

has been increasing by 0.047 °C per decade, whereas the RSS Southern Hemisphere Temperature Troposphere / Stratosphere (TTS) – Brightness Temperature Anomaly – 1987 to Present;

Source: Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU)

has been decreasing by 0.039 °C per decade.

Moving higher still in the atmosphere, the RSS Temperature Lower Stratosphere (TLS) – Brightness Temperature Anomaly – 1979 to Present;

Source: Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU)

“is dominated by stratospheric cooling, punctuated by dramatic warming events caused by the eruptions of El Chichon (1982) and Mt Pinatubo (1991).” RSS

The eruptions of El Chichon and Mt Pinatubo are readily apparent in the Apparent Atmospheric Transmission of Solar Radiation at Mauna Loa, Hawaii:

Source: National Oceanic and Atmospheric Administration (NOAA) – Earth System Research Laboratory (ESRL)

“The stratosphere” … “in contrast to the troposphere, is heated, as the result of near infrared absorption of solar energy at the top of the aerosol cloud, and increased infra-red absorption of long-wave radiation from the Earth’s surface.”

“The stratospheric warming in the region of the stratospheric cloud increases the latitudinal temperature gradient after an eruption at low latitudes, disturbing the stratospheric-troposphere circulation, increasing the difference in height of the troposphere between high and low latitudes, and increasing the strength of the jet stream (polar vortex, especially in the northern hemisphere). This leads to warming during the northern hemisphere winter following a tropical eruption, and this warming effect tends to be larger than the cooling effect described above.” Ellen Thomas, PHD Wesleyan University

The Lower Stratosphere experienced “dramatic warming events caused by the eruptions of El Chichon (1982) and Mt Pinatubo (1991).” RSS “The long-term, global-mean cooling of the lower stratosphere stems from two downward steps in temperature, both of which are coincident with the cessation of transient warming after the volcanic eruptions of El Chichon and Mt. Pinatubo.” … “Here we provide observational analyses that yield new insight into three key aspects of recent stratospheric climate change. First, we provide evidence that the unusual step-like behavior of global-mean stratospheric temperatures is dependent not only upon the trend but also on the temporal variability in global-mean ozone immediately following volcanic eruptions. Second, we argue that the warming/cooling pattern in global-mean temperatures following major volcanic eruptions is consistent with the competing radiative and chemical effects of volcanic eruptions on stratospheric temperature and ozone. Third, we reveal the contrasting latitudinal structures of recent stratospheric temperature and ozone trends are consistent with large-scale increases in the stratospheric overturning Brewer-Dobson circulation” David W. J. Thompson Colorado State University

Above the Stratosphere we have the Mesosphere and Thermosphere, for neither of which have I identified a current temperature time series. Of note, however, is NASA’s July 15, 2010 report, “A Puzzling Collapse of Earth’s Upper Atmosphere”, which explains that “high above Earth’s surface where the atmosphere meets space, a rarefied layer of gas called ‘the thermosphere’ recently collapsed and now is rebounding again.”

“This is the biggest contraction of the thermosphere in at least 43 years,” says John Emmert of the Naval Research Lab, lead author of a paper announcing the finding in the June 19th issue of the Geophysical Research Letters (GRL). “It’s a Space Age record.”

The collapse happened during the deep solar minimum of 2008-2009—a fact which comes as little surprise to researchers. The thermosphere always cools and contracts when solar activity is low. In this case, however, the magnitude of the collapse was two to three times greater than low solar activity could explain.

“Something is going on that we do not understand,” says Emmert.

The thermosphere ranges in altitude from 90 km to 600+ km. It is a realm of meteors, auroras and satellites, which skim through the thermosphere as they circle Earth. It is also where solar radiation makes first contact with our planet. The thermosphere intercepts extreme ultraviolet (EUV) photons from the sun before they can reach the ground. When solar activity is high, solar EUV warms the thermosphere, causing it to puff up like a marshmallow held over a camp fire. (This heating can raise temperatures as high as 1400 K—hence the name thermosphere.) When solar activity is low, the opposite happens.” NASA

In summary, “the Pause” is apparent in Earth’s atmospheric record. Lower and Middle Tropospheric temperatures appear to have warmed slowly, overlaid with the El Niño/La Niña Southern Oscillation (ENSO) cycle, including four comparatively large El Niño events, and tempered by the cooling effects of the eruptions of El Chichon (1982) and Mt Pinatubo (1991); they appear to have paused since the large El Niño in 1998. Tropospheric / Stratospheric temperatures appear to have been influenced by at least three significant surface-driven warming events, the 1997-98 El Niño and the eruptions of El Chichon in 1982 and Mt Pinatubo in 1991, but have maintained a stable overall trajectory. Stratospheric temperatures appear to have experienced two “dramatic warming events caused by the eruptions of El Chichon (1982) and Mt Pinatubo (1991)” and the “unusual step-like behavior of global-mean stratospheric temperatures”, which has resulted in significant stratospheric cooling during the last 30 years. Lastly, during the deep solar minimum of 2008-2009, “the biggest contraction of the thermosphere in at least 43 years” occurred, and “the magnitude of the collapse was two to three times greater than low solar activity could explain.”

Ocean Temperatures:

“The oceans can hold much more heat than the atmosphere. Just the top 3.2 metres of ocean holds as much heat as all the world’s air.” Commonwealth of Australia – Bureau of Meteorology
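As a rough sanity check on that figure, the equivalent ocean depth can be estimated from round textbook values; every constant below is my assumption for illustration, not a number from the BoM source:

```python
# Back-of-envelope check of the quoted claim, using approximate
# textbook values (assumptions, not BoM figures):
M_ATM   = 5.1e18   # total mass of the atmosphere, kg
CP_AIR  = 1004.0   # specific heat of air at constant pressure, J/(kg K)
RHO_SEA = 1025.0   # density of seawater, kg/m^3
CP_SEA  = 3990.0   # specific heat of seawater, J/(kg K)
A_OCEAN = 3.6e14   # ocean surface area, m^2

heat_cap_atm   = M_ATM * CP_AIR                # J/K, whole atmosphere
heat_cap_per_m = A_OCEAN * RHO_SEA * CP_SEA    # J/K per metre of ocean depth

print(f"Equivalent ocean depth: {heat_cap_atm / heat_cap_per_m:.1f} m")
# Prints ~3.5 m with these inputs, the same order as the quoted 3.2 m;
# the exact figure depends on the assumed areas and specific heats.
```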

From a surface perspective, the Hadley Center’s HadSST2 Global Sea Surface Temperature Anomaly;

climate4you.com – Ole Humlum – Professor, University of Oslo Department of Geosciences – Click the pic to view at source

NOAA’s National Climatic Data Center Global Sea Surface Temperature Anomaly;

climate4you.com – Ole Humlum – Professor, University of Oslo Department of Geosciences – Click the pic to view at source

and the Reynolds OI.v2 Global Sea Surface Temperature Anomaly

Bob Tisdale – http://bobtisdale.wordpress.com – Click the pic to view at source

all appear to be well into The Pause.

Obviously, Sea Surface Temperatures only scratch the surface; thus changes in Ocean Heat Content are important in understanding “Earth’s Temperature”. Here is NOAA’s NODC Global Ocean Heat Content from 0-700 Meters – 1955 to Present;

National Oceanic & Atmospheric Administration (NOAA) – National Oceanographic Data Center (NODC) – Click the pic to view at source

and here is the same from Ole Humlum’s valuable climate data site Climate4you.com, NODC Global Ocean Heat Content – 0-700 Meters – 1979 to Present:

climate4you.com – Ole Humlum – Professor, University of Oslo Department of Geosciences – Click the pic to view at source

It seems apparent from the plots above that Global Ocean Heat has increased over the last several decades and has not paused per se; however, the rate of increase appears to have slowed significantly since 2004.
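One simple way to quantify that slowdown is to fit separate linear trends before and after a candidate breakpoint. A sketch, again assuming a hypothetical two-column export of the NODC series:

```python
import numpy as np

# Sketch: compare OHC trends before and after a candidate breakpoint.
# Assumes a hypothetical file of (decimal_year, ohc) rows, with OHC in
# units of 10^22 joules as in the NODC 0-700 m series.
year, ohc = np.loadtxt("nodc_ohc_0_700m.txt").T  # hypothetical filename

for label, mask in [("pre-2004", year < 2004), ("post-2004", year >= 2004)]:
    slope = np.polyfit(year[mask], ohc[mask], 1)[0]
    print(f"{label}: {slope:+.3f} x 10^22 J per year")
```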

Sea Level:

“Global sea level is currently rising as a result of both ocean thermal expansion and glacier melt, with each accounting for about half of the observed sea level rise, and each caused by recent increases in global mean temperature. For the period 1961-2003, the observed sea level rise due to thermal expansion was 0.42 millimeters per year and 0.69 millimeters per year due to total glacier melt (small glaciers, ice caps, ice sheets) (IPCC 2007). Between 1993 and 2003, the contribution to sea level rise increased for both sources to 1.60 millimeters per year and 1.19 millimeters per year respectively (IPCC 2007).” Source NSIDC
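Those quoted contributions sum as follows; a quick arithmetic check using only the figures quoted above:

```python
# Arithmetic check of the NSIDC/IPCC figures quoted above (mm per year):
periods = {
    "1961-2003": {"thermal_expansion": 0.42, "glacier_melt": 0.69},
    "1993-2003": {"thermal_expansion": 1.60, "glacier_melt": 1.19},
}
for period, parts in periods.items():
    total = sum(parts.values())
    share = 100 * parts["thermal_expansion"] / total
    print(f"{period}: {total:.2f} mm/yr total ({share:.0f}% thermal expansion)")
# 1961-2003: 1.11 mm/yr (38% thermal); 1993-2003: 2.79 mm/yr (57% thermal)
```

So the total rate does split roughly in half between the two sources, and it more than doubled between the two periods.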

Global Mean Sea Level Change – 1993 to Present:

climate4you.com – Ole Humlum – Professor, University of Oslo Department of Geosciences – Click the pic to view at source

Global Mean Sea Level Change Map with a “Correction” of 0.3 mm/year, added May 5th, 2011 due to a “Glacial Isostatic Adjustment (GIA)” – 1993 to Present;

University of Colorado at Boulder – Click the pic to view at source

While it appears that Sea Level Rise has continued recently;

Wikipedia – Click the pic to view at source

it is important to note that Sea Levels were increasing at a similar pace during the first half of the 20th century, before anthropogenic CO2 emissions were sufficient to have a significant influence on “Earth’s Temperature” and Sea Level.

Snow and Ice:

A proxy often cited when measuring “Earth’s Temperature” is the amount of Snow and Ice on Earth. According to the United States Geological Survey (USGS), “The vast majority, almost 90 percent, of Earth’s ice mass is in Antarctica, while the Greenland ice cap contains 10 percent of the total global ice mass.” Source: USGS

However, there is currently no generally accepted measure of ice volume, as CryoSat is still in validation and the accuracy of measurements from GRACE is still being challenged. Sea Ice Area and Extent are cited as proxies for “Earth’s Temperature”; however, there is significant evidence that the primary influences on Sea Ice Area and Extent are in fact wind and Atmospheric Oscillations.
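Since both metrics appear in the plots below, it may help to spell out how Sea Ice “Area” and “Extent” differ. A toy sketch with made-up concentration values; the 15% cutoff matches the convention used in the NSIDC extent plots cited later:

```python
import numpy as np

# Toy sketch of the standard area/extent distinction. Extent counts the
# full cell wherever ice concentration is at least 15%; area weights
# each qualifying cell by its concentration.
concentration = np.array([[0.95, 0.40, 0.10],
                          [0.80, 0.14, 0.00]])        # made-up values, 0-1
cell_area_km2 = np.full(concentration.shape, 625.0)   # e.g. 25 km x 25 km cells

qualifies = concentration >= 0.15
extent = (cell_area_km2 * qualifies).sum()
area   = (cell_area_km2 * concentration * qualifies).sum()
print(f"extent = {extent:.0f} km^2, area = {area:.2f} km^2")  # 1875 vs 1343.75
```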

With this said, Global Sea Ice Area;

Cryosphere Today – University of Illinois – Polar Research Group – Click the pic to view at source

had its largest maximum since 1996 in 2013, and has remained stubbornly average for the entirety of 2013. Antarctic Sea Ice Extent has remained above the 1981 – 2010 “normal” range for much of the last four months;

National Snow & Ice Data Center (NSIDC) – Click the pic to view at source

we had the third most expansive Southern Sea Ice Area measured to date;

Cryosphere Today – Arctic Climate Research at the University of Illinois – Click the pic to view at source

and Southern Sea Ice Area has remained above average for almost all of the last two years:

Cryosphere Today – Arctic Climate Research at the University of Illinois – Click the pic to view at source

At the other pole Arctic Sea Ice Extent has remained within the 1981 – 2010 “normal” range for the entirety of 2013;

National Snow & Ice Data Center (NSIDC) – click to view at source

and Northern Hemisphere Sea Ice Area had its smallest decline since 2006:

Cryosphere Today – University of Illinois – Polar Research Group – Click the pic to view at source

There appears to have been a negative trend in Northern Hemisphere Sea Ice Area and Extent and a positive trend in Southern Hemisphere Sea Ice Area and Extent; thus the resultant Global Sea Ice Area trend appears to be slightly negative. However, in the last 6 years there does appear to be a Pause in Global Sea Ice Area.

In terms of land based data, here is 20 Year Northern Hemisphere Snow Cover with 1995 – 2009 Climatology from NCEP/NCAR;

Florida State University – Department of Earth, Ocean, and Atmospheric Science – Click the pic to view at source

Northern Hemisphere Snow Cover Anomalies 1966 – Present from NCEP/NCAR;

Florida State University – Department of Earth, Ocean, and Atmospheric Science – Click the pic to view at source

Northern Hemisphere Winter Snow Extent – 1967 to Present from Rutgers University;

Rutgers University – Global Snow Lab (GSL) – Click the pic to view at source

Northern Hemisphere Spring Snow Extent – 1967 to Present:


Rutgers University – Global Snow Lab (GSL) – Click the pic to view at source

Northern Hemisphere Fall Snow Extent – 1967 to Present:

Rutgers University – Global Snow Lab (GSL) – Click the pic to view at source

While none of the Snow plots offers a global perspective, when looking at the Northern Hemisphere there appears to have been a slight increase in Winter Snow Cover and Extent, a decrease in Spring Snow Extent, and no change in Fall Snow Extent over the historical record.

Based on the limited Global Ice and Snow measurements available, and noting the questionable value of Sea Ice Area and Extent as a proxy for temperature, not much inference can currently be drawn from Earth’s Ice and Snow measurements. However, there does appear to be a Pause in Global Sea Ice Area.

Conclusion:

The Pause in “Earth’s Temperature” appears in many of Earth’s observational records; it appears to extend for between 6 and 16 years, depending on the data set and what is being measured.

Additional information on “Earth’s Temperature” can be found in the WUWT Reference Pages, including the Global Temperature Page and Global Climatic History Page.

Please note that WUWT cannot vouch for the accuracy of the data/graphics within this article, nor influence the format or form of any of the graphics, as they are all linked from third party sources and WUWT is simply an aggregator. You can view each graphic at its source by simply clicking on it.

[Ed. Note: Cross posted from WUWT. This is a large, valuable, albeit technical discussion about the validity of the Global Warming meme so righteously promoted by mainstream media and a variety of catastrophic climate science advocates. The comment stream is also worthwhile for the links and discussions within.]

A Big Picture Look At “Earth’s Temperature” – “Extreme Weather” Update

By WUWT regular “Just The Facts”

Recently there have been increased efforts to link “Climate Change” and “Extreme Weather” e.g., NOAA links extreme weather to climate change CBS – July 10, 2012, “NASA scientist links climate change, extreme weather” CNN – August 6, 2012 and Get used to ‘extreme’ weather, it’s the new normal The Guardian – September 19, 2012.  Per the Guardian article, “Scientists have been warning us for years that a warmer planet would lead to more extreme weather, and now it’s arrived”. These “Extreme Weather” efforts have shifted into high gear with Sandy. Yesterday United Nations Secretary-General Ban Ki-moon said that “one of the lessons from Superstorm Sandy is the need for global action to deal with future climate shocks.” “He told the U.N. General Assembly on Friday that it is difficult to attribute any single storm to climate change, but the world already knows that “extreme weather due to climate change is the new normal.” U.N. leader: Sandy a lesson in climate change CBS – November 9, 2012

All of these claims and this “extreme weather” rhetoric seem to be predicated on the assumption that “Earth’s Temperature” has increased recently, thus causing “extreme weather” to arrive and become the “new normal”. However, does the observational data support this assumption? Let’s take a look…

Global Surface Temperatures:

Generally, when referring to Earth’s “climate” warming, proponents of the Catastrophic Anthropogenic Global Warming (CAGW) narrative refer to Earth’s Surface Temperature, e.g. “Global warming is the unusually rapid increase in Earth’s average surface temperature over the past century primarily due to the greenhouse gases released by people burning fossil fuels.” NASA Earth Observatory

As such, here’s NASA’s Goddard Institute for Space Studies (GISS) Global Monthly Mean Surface Temperature Anomaly – 1996 to Present:

National Aeronautics and Space Administration (NASA) Goddard Institute for Space Studies (GISS) – Click the pic to view at source

Looking across the last 16 years, Global Surface Temperatures do not appear to have increased much at all.

For a longer term view, UK Met Office’s – Hadley Center – Climate Research Unit (CRU) Annual Global Average Land Temperature Anomaly – 1850 to 2011;

Met Office – Hadley Center – Click the pic to view at source

and the UK Met Office – Hadley Center – Climate Research Unit (CRU) Monthly Global Average Land Temperature – 1850 to 2011:

Met Office – Hadley Center – Click the pic to view at source

Unless the arrival of “extreme weather” occurred in 1997-1998 with the well documented “very strong El Niño”, and the media is just realizing it, there does not seem to be a basis for the “extreme weather” claims in Earth’s recent Land and Surface Temperature record. There does not appear to be much recent change, and if anything the trend is down in the last few years. However, the surface temperature record is burdened with issues of questionable siting, changes in siting, changes in equipment, changes in the number of measurement locations, modeling to fill in gaps in measurement locations, corrections to account for missing, erroneous or biased measurements, and the urban heat island effect. Thus, to see the big picture on “Earth’s Temperature”, it also helps to look up.

Atmospheric Temperatures:

Since 1979 Earth’s “temperature” has also been measured via satellite. “The temperature measurements from space are verified by two direct and independent methods. The first involves actual in-situ measurements of the lower atmosphere made by balloon-borne observations around the world. The second uses intercalibration and comparison among identical experiments on different orbiting platforms. The result is that the satellite temperature measurements are accurate to within three one-hundredths of a degree Centigrade (0.03 C) when compared to ground-launched balloons taking measurements of the same region of the atmosphere at the same time.” NASA

Here is RSS Global Temperature Lower Troposphere (TLT) – Brightness Temperature Anomaly – 1979 to Present;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

and this is the University of Alabama – Huntsville (UAH) Global Lower Atmosphere Temperature Anomalies – 1979 to Present:

University of Alabama – Huntsville (UAH) – Dr. Roy Spencer – Click the pic to view at source

Note: Per John Christy, RSS and UAH anomalies are not comparable because they use different base periods, i.e., “RSS only uses 1979-1998 (20 years) while UAH uses the WMO standard of 1981-2010.”
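Re-centering both series on a common base period makes them directly comparable. A minimal sketch, assuming the monthly anomalies and their decimal years are already loaded into arrays (the names here are placeholders):

```python
import numpy as np

def rebaseline(years, anomaly, start, end):
    """Re-express an anomaly series relative to the mean of [start, end).

    Since the input is already an anomaly, subtracting its mean over the
    new base period simply re-centers it on that period.
    """
    base = anomaly[(years >= start) & (years < end)].mean()
    return anomaly - base

# e.g., shift an RSS-style series (1979-1998 base) onto the WMO-standard
# 1981-2010 base used by UAH:
# rss_rebased = rebaseline(rss_years, rss_anomaly, 1981, 2011)
```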

The September UAH Lower Atmosphere Temperature Anomaly was 0.33 degrees C above the 30-year average, and RSS Global Lower Troposphere shows a 0.133 degrees C increase per decade. “Earth’s Temperature” varies naturally by numerous degrees and has been significantly warmer than it is today:

NOAA – National Climate Data Center – Click the pic to view at source

Are we to believe that 3 or 4 tenths of a degree C warming over the last 30 years has brought us to the precipice of “extreme weather”? Seems implausible. Maybe there are significant regional variations that portended the arrival of “extreme weather”?

Looking at the RSS Northern Hemisphere Temperature Lower Troposphere (TLT) Brightness Temperature Anomaly;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

and RSS Southern Hemisphere Temperature Lower Troposphere (TLT) Brightness Temperature Anomaly;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

neither seems indicative of warming that would have caused “extreme weather” to arrive.

Furthermore, RSS Southern Polar Temperature Lower Troposphere (TLT) Brightness Temperature Anomaly;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

is currently negative, showing a 0.013 K per decade decrease. Should we assume that Antarctica is experiencing less “extreme weather” at the moment?…

To this point we’ve only addressed Lower Troposphere Temperatures, but one never knows where this “extreme weather” might be coming from, so the following Temperature Anomaly plots from RSS move progressively higher in altitude, as illustrated here:

Here is RSS Temperature Middle Troposphere (TMT) – Brightness Temperature Anomaly – 1979 to Present;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

According to Remote Sensing Systems, “For Channel (TLT) (Lower Troposphere) and Channel (TMT) (Middle Troposphere), the anomaly time series is dominated by ENSO events and slow tropospheric warming. The three primary El Niños during the past 20 years are clearly evident as peaks in the time series occurring during 1982-83, 1987-88, and 1997-98, with the most recent one being the largest.” RSS

Middle Tropospheric temperatures appear to show slow warming overlaid with the El Niño/La Niña Southern Oscillation (ENSO) cycle, including several comparatively large El Niño events. Middle Tropospheric temperatures appear to have flattened since the large El Niño in 1998 and offer no indication of a recent change in Earth’s Temperature that could cause “extreme weather” to become the “new normal”.

Moving higher in the atmosphere, RSS Temperature Troposphere / Stratosphere (TTS) – Brightness Temperature Anomaly – 1987 to Present;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

has been flat since 1987, with a trend of just -0.008 K per decade. Perhaps this is the “new normal”?…

The 1997-98 and 2009-10 El Niño events are still readily apparent in the Troposphere / Stratosphere plot above, as is a spike from the 1991 eruption of Mt. Pinatubo. Note that the effect of Mt. Pinatubo is the opposite in the Lower and Middle Troposphere versus the Troposphere / Stratosphere (TTS), i.e. “Large volcanic eruptions inject sulfur gases into the stratosphere; the gases convert into submicron particles (aerosol) with an e-folding time scale of about 1 year. The climate response to large eruptions (in historical times) lasts for several (2-3) years. The aerosol cloud causes cooling at the Earth’s surface, warming in stratosphere.” Ellen Thomas, PhD, Wesleyan University
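The “e-folding time scale of about 1 year” in that quote implies a simple exponential decay of the aerosol loading, which lines up with the quoted 2-3 year climate response; a quick illustration:

```python
import math

# With an e-folding time of 1 year (taken from the quote above), the
# aerosol loading decays as exp(-t / tau_e): only ~5% remains after
# three years, consistent with a 2-3 year climate response.
tau_e = 1.0  # e-folding time in years, per the quote
for t in (1, 2, 3):
    print(f"after {t} yr: {math.exp(-t / tau_e):.0%} of aerosol remains")
# after 1 yr: 37%; after 2 yr: 14%; after 3 yr: 5%
```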

It is interesting that, even incorporating the impact of three significant surface-driven warming events, Troposphere / Stratosphere Temperatures (TTS) have been quite stable; however, there is some regional variation, e.g.:

RSS Northern Hemisphere Temperature Troposphere / Stratosphere (TTS) – Brightness Temperature Anomaly – 1987 to Present;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

has been increasing by 0.044 K per decade, whereas the RSS Southern Hemisphere Temperature Troposphere / Stratosphere (TTS) – Brightness Temperature Anomaly – 1987 to Present;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

has been decreasing by 0.061 K per decade. However, Southern Hemisphere Troposphere / Stratosphere Temperature does show a significant increase in 2012; perhaps it is this increase that caused “extreme weather” to arrive? Or maybe not…

Moving higher still in the atmosphere, the RSS Temperature Lower Stratosphere (TLS) – Brightness Temperature Anomaly – 1979 to Present;

Remote Sensing Systems (RSS) – Microwave Sounding Units (MSU) – Click the pic to view at source

“is dominated by stratospheric cooling, punctuated by dramatic warming events caused by the eruptions of El Chichon (1982) and Mt Pinatubo (1991).” RSS

The eruptions of El Chichon and Mt Pinatubo are readily apparent in the Apparent Atmospheric Transmission of Solar Radiation at Mauna Loa, Hawaii:

National Oceanic and Atmospheric Administration (NOAA) – Earth System Research Laboratory (ESRL) – Click the pic to view at source

“The stratosphere” … “in contrast to the troposphere, is heated, as the result of near infrared absorption of solar energy at the top of the aerosol cloud, and increased infra-red absorption of long-wave radiation from the Earth’s surface.”

“The stratospheric warming in the region of the stratospheric cloud increases the latitudinal temperature gradient after an eruption at low latitudes, disturbing the stratospheric-troposphere circulation, increasing the difference in height of the troposphere between high and low latitudes, and increasing the strength of the jet stream (polar vortex, especially in the northern hemisphere). This leads to warming during the northern hemisphere winter following a tropical eruption, and this warming effect tends to be larger than the cooling effect described above.” Ellen Thomas, PhD, Wesleyan University

The Lower Stratosphere experienced “dramatic warming events caused by the eruptions of El Chichon (1982) and Mt Pinatubo (1991).” RSS “The long-term, global-mean cooling of the lower stratosphere stems from two downward steps in temperature, both of which are coincident with the cessation of transient warming after the volcanic eruptions of El Chichon and Mt. Pinatubo.” … “Here we provide observational analyses that yield new insight into three key aspects of recent stratospheric climate change. First, we provide evidence that the unusual step-like behavior of global-mean stratospheric temperatures is dependent not only upon the trend but also on the temporal variability in global-mean ozone immediately following volcanic eruptions. Second, we argue that the warming/cooling pattern in global-mean temperatures following major volcanic eruptions is consistent with the competing radiative and chemical effects of volcanic eruptions on stratospheric temperature and ozone. Third, we reveal the contrasting latitudinal structures of recent stratospheric temperature and ozone trends are consistent with large-scale increases in the stratospheric overturning Brewer-Dobson circulation” David W. J. Thompson Colorado State University

Above the Stratosphere we have the Mesosphere and Thermosphere, for neither of which have I identified a current temperature time series. Of note, however, is NASA’s July 15, 2010 report, “A Puzzling Collapse of Earth’s Upper Atmosphere”, which explains that “high above Earth’s surface where the atmosphere meets space, a rarefied layer of gas called ‘the thermosphere’ recently collapsed and now is rebounding again.”

“This is the biggest contraction of the thermosphere in at least 43 years,” says John Emmert of the Naval Research Lab, lead author of a paper announcing the finding in the June 19th issue of the Geophysical Research Letters (GRL). “It’s a Space Age record.”

The collapse happened during the deep solar minimum of 2008-2009—a fact which comes as little surprise to researchers. The thermosphere always cools and contracts when solar activity is low. In this case, however, the magnitude of the collapse was two to three times greater than low solar activity could explain.

“Something is going on that we do not understand,” says Emmert.

The thermosphere ranges in altitude from 90 km to 600+ km. It is a realm of meteors, auroras and satellites, which skim through the thermosphere as they circle Earth. It is also where solar radiation makes first contact with our planet. The thermosphere intercepts extreme ultraviolet (EUV) photons from the sun before they can reach the ground. When solar activity is high, solar EUV warms the thermosphere, causing it to puff up like a marshmallow held over a camp fire. (This heating can raise temperatures as high as 1400 K—hence the name thermosphere.) When solar activity is low, the opposite happens.” NASA

In summary, Earth’s Lower and Middle Troposphere appear to have warmed slowly, overlaid with the El Niño/La Niña Southern Oscillation (ENSO) cycle, including four comparatively large El Niño events, and tempered by the cooling effects of the eruptions of El Chichon (1982) and Mt Pinatubo (1991). Lower and Middle Tropospheric temperatures appear to have flattened since the large El Niño in 1998 and offer no indication of changes that could be causing “extreme weather”. Tropospheric / Stratospheric temperatures appear to have been influenced by at least three significant surface-driven warming events, the 1997-98 El Niño and the eruptions of El Chichon in 1982 and Mt Pinatubo in 1991, but have maintained a stable overall trajectory. Stratospheric temperatures appear to have experienced two “dramatic warming events caused by the eruptions of El Chichon (1982) and Mt Pinatubo (1991)” and the “unusual step-like behavior of global-mean stratospheric temperatures”, which has resulted in significant stratospheric cooling during the last 30 years. Lastly, during the deep solar minimum of 2008-2009, “the biggest contraction of the thermosphere in at least 43 years” occurred, and “the magnitude of the collapse was two to three times greater than low solar activity could explain.” Unless someone can demonstrate a causative relationship between “Climate Change”, the collapse of the thermosphere and “Extreme Weather”, there does not seem to be any support within the atmospheric temperature records for the “extreme weather” arrival and “new normal” rhetoric.

Ocean Temperatures:

“The oceans can hold much more heat than the atmosphere. Just the top 3.2 metres of ocean holds as much heat as all the world’s air.” Commonwealth of Australia – Bureau of Meteorology

As such, changes in Ocean Heat Content are important in understanding “Earth’s Temperature”. Here is NOAA’s NODC Global Ocean Heat Content from 0-700 Meters – 1955 to Present;

National Oceanic & Atmospheric Administration (NOAA) – National Oceanographic Data Center (NODC) – Click the pic to view at source

and here is the same from Ole Humlum’s valuable climate data site Climate4you.com, NODC Global Ocean Heat Content – 0-700 Meters – 1979 to Present:

climate4you.com – Ole Humlum – Professor, University of Oslo Department of Geosciences – Click the pic to view at source

It seems apparent from the plots above that Global Ocean Heat has increased over the last several decades; however, it does not appear to show a recent increase that could lead to “extreme weather”. Furthermore, in his recent article Bob Tisdale demonstrated that “sea surface temperatures for Sandy’s path haven’t warmed in 70+ years” WUWT.

Sea Level:

“Global sea level is currently rising as a result of both ocean thermal expansion and glacier melt, with each accounting for about half of the observed sea level rise, and each caused by recent increases in global mean temperature. For the period 1961-2003, the observed sea level rise due to thermal expansion was 0.42 millimeters per year and 0.69 millimeters per year due to total glacier melt (small glaciers, ice caps, ice sheets) (IPCC 2007). Between 1993 and 2003, the contribution to sea level rise increased for both sources to 1.60 millimeters per year and 1.19 millimeters per year respectively (IPCC 2007).” Source NSIDC

Global Mean Sea Level Change – 1993 to Present:

climate4you.com – Ole Humlum – Professor, University of Oslo Department of Geosciences – Click the pic to view at source

Global Mean Sea Level Change Map with a “Correction” of 0.3 mm/year, added May 5th, 2011 due to a “Glacial Isostatic Adjustment (GIA)” – 1993 to Present:

University of Colorado at Boulder – Click the pic to view at source

It seems doubtful that “extreme weather” arrived because of the 5.5 centimeter increase in Sea Level since 1993. Sandy’s storm surge “topped out at 14 feet (4.3 meters)” per the Huffington Post; would Sandy have been less extreme if the surge had only been 4.245 meters?…

Snow and Ice:

A proxy often cited when measuring “Earth’s Temperature” is the amount of Snow and Ice on Earth. According to the United States Geological Survey (USGS), “The vast majority, almost 90 percent, of Earth’s ice mass is in Antarctica, while the Greenland ice cap contains 10 percent of the total global ice mass.” Source: USGS

However, there is currently no generally accepted measure of ice volume, as CryoSat is still in validation and the accuracy of measurements from GRACE is still being challenged. Sea Ice Area and Extent are cited as proxies for “Earth’s Temperature”; however, there is significant evidence that the primary influences on Sea Ice Area and Extent are in fact wind and Atmospheric Oscillations. With this said, here are

Global, Arctic & Antarctic Sea Ice Area from 1979 to Present;

climate4you.com – Ole Humlum – Professor, University of Oslo Department of Geosciences – Click the pic to view at source

Global Sea Ice Area Anomaly – 1979 to Present:

Cryosphere Today – Arctic Climate Research at the University of Illinois – Click the pic to view at source

Northern Hemisphere Sea Ice Area Anomaly, 1979 to Present;

Cryosphere Today – Arctic Climate Research at the University of Illinois – Click the pic to view at source

Southern Hemisphere Sea Ice Area Anomaly, 1979 to Present;

Cryosphere Today – Arctic Climate Research at the University of Illinois – Click the pic to view at source

Arctic Sea Ice Extent – 15% or greater

National Snow & Ice Data Center (NSIDC) – click to view at source

Antarctic Sea Ice Extent – 15% or Greater

National Snow & Ice Data Center (NSIDC) – Click the pic to view at source

There appears to have been a negative trend in Northern Hemisphere Sea Ice Area and Extent and a positive trend in Southern Hemisphere Sea Ice Area and Extent; thus the resultant Global Sea Ice Area trend appears to be slightly negative.

In terms of land based data, here is 20 Year Northern Hemisphere Snow Cover with 1995 – 2009 Climatology from NCEP/NCAR;

Florida State University – Department of Earth, Ocean, and Atmospheric Science – Click the pic to view at source

Northern Hemisphere Snow Cover Anomalies 1966 – Present from NCEP/NCAR;

Florida State University – Department of Earth, Ocean, and Atmospheric Science – Click the pic to view at source

Northern Hemisphere Winter Snow Extent – 1967 to Present from Rutgers University;

Rutgers University – Global Snow Lab (GSL) – Click the pic to view at source

Northern Hemisphere Spring Snow Extent – 1967 to Present:

Rutgers University – Global Snow Lab (GSL) – Click the pic to view at source

Northern Hemisphere Fall Snow Extent – 1967 to Present:

Rutgers University – Global Snow Lab (GSL) – Click the pic to view at source

While none of the Snow plots offers a global perspective, when looking at the Northern Hemisphere there appears to have been a slight increase in Winter Snow Cover and Extent, a decrease in Spring Snow Extent, and no change in Fall Snow Extent over the historical record.

Based on the limited Global Ice and Snow measurements available, and noting the questionable value of Sea Ice Area and Extent as a proxy for temperature, not much inference can currently be drawn from Earth’s Ice and Snow measurements. However, there does not appear to be any evidence of change in Earth’s Ice and Snow measurements indicative of the arrival of “Extreme Weather”.

Conclusion:

There is no evidence of a recent increase in “Earth’s Temperature” due to “Climate Change,” which could have caused “Extreme Weather” to arrive and become the “new normal”. Claims and rhetoric that recent “Extreme Weather” is caused by or associated with “Climate Change” are not supported by the observational data.

Additional information on “Earth’s Temperature” can be found in the WUWT Reference Pages, including the Global Temperature Page and Global Climatic History Page.

Please note that WUWT cannot vouch for the accuracy of the data/graphics within this article, nor influence the format or form of any of the graphics, as they are all linked from third party sources and WUWT is simply an aggregator. You can view each graphic at its source by simply clicking on it.

by Anthony Watts (originally posted April 29, 2011 on WUWT).

In times of tragedy, there always seem to be hucksters about, trying to use that tragedy to sell a position, a product, or a belief. In ancient times, tragedy was the impetus to appease the gods and to embrace religion to save yourself. In light of this article on the Daily Caller, Center for American Progress blames Republicans for devastating tornadoes, it seems some opportunists just can’t break the pattern of huckster behavior in the face of disaster.

I can’t think of a more disgusting example of political opportunism than what we witnessed today from The Center for American Progress via their Think Progress blog, as well as the New York Times op-ed piece that suggests predicting severe weather is little more than a guessing game. Certified Consulting Meteorologist Mike Smith of Wichita, KS based WeatherData Inc. said of the NYT piece:

The cruelty of this particular April, in the number of tornadoes recorded, is without equal in the United States.

This may or may not be true; the statement is at least premature. On March 8th, the NWS Storm Prediction Center changed its methodology in a way that allows more reports of tornadoes and other severe storms to be logged (see first note here). We don’t know yet whether this is a record April.

Tornadoes in particular, researchers say, straddle the line between the known and the profoundly unknowable.

“There’s a large crapshoot aspect,” said Kevin Trenberth, a senior scientist at the National Center for Atmospheric Research in Boulder, Colo.

To add to the mix, Peter Gleick says at the Huffington Post: “More extreme and violent climate is a direct consequence of human-caused climate change (whether or not we can determine if these particular tornado outbreaks were caused or worsened by climate change).”

In the Think Progress piece, again, Dr. Trenberth is quoted:

“Given that global warming is unequivocal,” climate scientist Kevin Trenberth cautioned the American Meteorological Society in January of this year, “the null hypothesis should be that all weather events are affected by global warming rather than the inane statements along the lines of ‘of course we cannot attribute any particular weather event to global warming.’”

It should also be noted that during that AMS conference in January, Dr. Trenberth called people who disagreed with that view “deniers” in front of hundreds of scientists; even after being called out on the issue, he left the hateful term intact in his speech. Clearly, he is a man with a bias. From my perspective, these articles citing Trenberth are opportunistic political hucksterism at its finest. Unfortunately, many of these bastions of left-leaning opinionators don’t bother to cite some inconvenient facts, leaving their claims on par with the superstitions that were part of our dark past.

First, let’s look at the claim of tornadoes being on the increase, in parallel with the climate change that is claimed. In my previous essay Severe weather more common? Data shows otherwise I cited this graph from the National Climatic Data Center:

Obviously, when NCDC tallies the number of F3-F5 tornadoes from this recent outbreak and gets around to updating that graph, there will be an uptick at the end in 2011 that is on par with or even higher than the famous 1974 tornado outbreak. The point, though, is that despite the 1974 uptick, the trend was down.

The NYT article says:

The population of the South grew by 14.3 percent over the last decade, according to the Census Bureau, compared with 9.7 percent for the nation as a whole. Of those states hardest hit by tornadoes this year, some were among the fastest growing, notably Texas and North Carolina.

Let’s look at trends of tornado-related deaths versus population. From Harold Brooks, a research meteorologist with the NOAA National Severe Storms Laboratory in Norman, Oklahoma, we have this graph:

Source: NOAA’s US Severe Weather Blog, SPC, Norman Oklahoma

Let’s look at other figures. Today, Dr. Roger Pielke Junior got an updated graph from Harold Brooks at NOAA to bring it to 2010:

That graph is a testament to the improved lead times, accuracy, and dissemination of severe weather warnings by the National Weather Service, whose members did an outstanding job during this severe weather event. CCM Mike Smith, in his book Warnings: The True Story of How Science Tamed the Weather, talks about the vast improvements we’ve witnessed since the early days of severe weather forecasting. He writes today of the recent outbreak:

There is no question that the current storm warning program, a collaborative effort of the National Weather Service, private sector weather companies like AccuWeather, broadcast meteorologists, and local emergency managers, has saved hundreds of lives during these recent storms through excellent forecasts and warnings. This image shows the tornado warning (red hatched area) for Birmingham that was issued more than 20 minutes before the tornado arrived.

Can the warning program be improved? Certainly. The National Weather Service’s new dual-polarization radar will improve flash flood warnings and will incrementally improve warnings of tornadoes that occur after dark.

But in the immediate aftermath of these tragic storms we seem to have learned two things: People need to respond to today’s highly accurate warnings. For some reason, the media (see examples here and here) seems determined to downplay the quality of the warnings, which may have the effect of driving down response rates.

Second, they must have a place to take shelter. Most mobile home parks and many homes in the South do not have underground shelters or safe rooms. Mobile home parks and housing developments should look to constructing these in the future.

With 30 minutes of advance warning in this case, and many other advance warnings during this outbreak, plus the supersaturation of live television coverage, plus the fact that weeks in advance my colleague Joe D’Aleo, co-founder of the Weather Channel and now at Weatherbell LLC, discussed the likelihood of a super-outbreak of severe weather occurring due to the juxtaposition of cold air from snowpack in the northern plains with warm moist air in the south, it would seem Dr. Trenberth’s claim of “a large crapshoot aspect” doesn’t hold up. The death toll issue seems to be shelter, not lack of forecasts, warnings, or awareness. People knew the storms were coming; they just had few options for shelters that would survive F3-F5 tornado intensity.

The attempts at linking the tornado outbreak this week to “global warming” have been roundly criticized in the meteorological community. Just yesterday there was a denouncement of the tornadoes-to-global-warming link in this story from Physorg.com:

“If you look at the past 60 years of data, the number of tornadoes is increasing significantly, but it’s agreed upon by the tornado community that it’s not a real increase,” said Grady Dixon, assistant professor of meteorology and climatology at Mississippi State University.

“It’s having to do with better (weather tracking) technology, more population, the fact that the population is better educated and more aware. So we’re seeing them more often,” Dixon said.

But he said it would be “a terrible mistake” to relate the up-tick to climate change.

Anticipating this sort of nonsense in the current political climate that seeks to blame humans for the weather, last month the National Weather Association, representing thousands of operational meteorologists, forecasters, and television and radio meteorologists in the United States, adopted its first-ever position statement on climate change and severe weather events. They state:

Any given weather event, or series of events, should not be construed as evidence of climate change.

The NWA emphasizes that no single weather event or series of events should be construed as evidence of a climate trend. Daily weather is subject to extreme events due to its natural variability. It is only the occurrence of these events over decades that determines a climate trend.

No clearer statement could be rendered. It mirrors what a NOAA scientist at the Storm Prediction Center said yesterday to Fox News:

Greg Carbin, the warning coordination meteorologist at NOAA’s Storm Prediction Center in Norman, Oklahoma, said warming trends do create more of the fuel that tornadoes require, such as moisture, but that they also deprive tornadoes of another essential ingredient: wind shear.

“We know we have a warming going on,” Carbin told Fox News in an interview Thursday, but added: “There really is no scientific consensus or connection [between global warming and tornadic activity]….Jumping from a large-scale event like global warming to relatively small-scale events like tornadoes is a huge leap across a variety of scales.”

Asked if climate change should be “acquitted” in a jury trial where it stood charged with responsibility for tornadoes, Carbin replied: “I would say that is the right verdict, yes.” Because there is no direct connection as yet established between the two? “That’s correct,” Carbin replied.

Historically, there have been many tornado outbreaks that occurred well before climate change was on anyone’s radar. Here are a few:

1908 Southeast tornado outbreak 324 fatalities, ≥1,720 injuries

1920 Palm Sunday tornado outbreak ≥380 fatalities, ≥1215 injuries

1925 Tri-State tornado ≥747 fatalities, ≥2298 injuries

1932 Deep South tornado outbreak  ≥330 fatalities, 2145 injuries

1952 Arkansas-Tennessee tornado outbreak 208 fatalities

1965 Palm Sunday tornado outbreak 256 fatalities

April 3-4 1974 Super Outbreak 315 fatalities

All of these occurred before “climate change” was even on the political radar. What caused those, if “global warming” is to blame? The real cause is La Niña, and as NOAAwatch.gov indicates on its page with the helpful meter, we are in a La Niña cycle of ocean temperature in the Pacific.

Here’s what it looks like on satellite measurements. Notice the cool blue:

The U.S. Climate Prediction Center talks about the reason for such outbreaks in relation to ocean temperature cycles:

What impacts do El Niño and La Niña have on tornado activity across the country?

Since a strong jet stream is an important ingredient for severe weather, the position of the jet stream helps to determine the regions more likely to experience tornadoes. Contrasting El Niño and La Niña winters, the jet stream over the United States is considerably different. During El Niño the jet stream is oriented from west to east across the southern portion of the United States. Thus, this region becomes more susceptible to severe weather outbreaks. During La Niña the jet stream and severe weather is likely to be farther north.

Note the collision zone in the US southeast during La Niña patterns.

Finally, let’s examine the claims of global warming being linked to the tornado outbreak. If this were true, we’d expect the globe to be warmer, right?

Thunderstorms (and all weather, for that matter) form in the troposphere, the layer of the atmosphere closest to the surface, extending up to the stratosphere.

Image: weatherquestions.com – click for details

Dr. Roy Spencer, climate scientist at the University of Alabama, Huntsville, tracks the temperature of the troposphere. The university system with which he tracks the temperature daily is inoperable due to the storms. People who were watching it prior to this event know the current global tropospheric temperature is lower this April than the norm, but we can’t show it today. The last global value he plotted showed this:

Dr. Roy Spencer – drroyspencer.com – UAH Lower Troposphere Temperature, 1979 through March 2011 – Click the pic to view at source
The global temperature anomaly of the troposphere today is about the same as it was in 1979. If there’s any global warming in the troposphere, it must be a figment of an overactive imagination on the part of people who seek to link it to the recent tornado tragedy.

Dr. Roy Spencer sums it up pretty well on his blog today:

MORE Tornadoes from Global Warming? That’s a Joke, Right?

It is well known that strong to violent tornado activity in the U.S. has decreased markedly since statistics began in the 1950s, which has also been a period of average warming. So, if anything, global warming causes FEWER tornado outbreaks…not more. In other words, more violent tornadoes would, if anything, be a sign of “global cooling”, not “global warming”.

Anyone who claims more tornadoes are caused by global warming is either misinformed, pandering, or delusional.

The people who seek to link this tragedy to the political movement of climate change should be ashamed of themselves. The only “deniers” here are the ones who deny all the long-established counter-evidence to their bogus claims for political gain.

——————

For those who wish to help with this tragedy there are options:

There’s a service called safeandwell.org which can help you get status on relatives and friends who may be affected.

There are several ways to register or look for messages from those affected by a disaster:

  • From a computer, visit www.redcross.org and click on the “List Yourself or Search Registrants” link under “How to Get Help.”
  • From a smart phone, visit www.redcross.org/safeandwell.
  • Call 1-800-RED CROSS (1-800-733-2767) to register.

There is of course financial help needed for the relief efforts of the American Red Cross. Text REDCROSS to 90999 to donate $10 to relief efforts from your cell phone bill. Or visit the main website.

by Anthony Watts (originally published April 19, 2011 on WUWT)

Dr. Roger Pielke Jr  on his Blog, April 18th writes:

A new analysis of floods around the world has been called to my attention. The new analysis is contrary to conventional wisdom but consistent with the scientific literature on global trends in peak streamflows. Is it possible that floods are not increasing or even in decline while most people have come to believe the opposite?

Bouziotas et al. presented a paper at the EGU a few weeks ago (PDF) and concluded:

Analysis of trends and of aggregated time series on climatic (30-year) scale does not indicate consistent trends worldwide. Despite common perception, in general, the detected trends are more negative (less intense floods in most recent years) than positive. Similarly, Svensson et al. (2005) and Di Baldassarre et al. (2010) did not find systematical change neither in flood increasing or decreasing numbers nor change in flood magnitudes in their analysis.

Note the phrase I highlighted: “Despite common perception”.  I was very pleased to see that in context with a conclusion from real data.
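For the technically inclined, the kind of analysis Bouziotas et al. describe (trend detection on series aggregated to a climatic, 30-year scale) can be sketched as follows; the annual-peak series below is purely synthetic, for illustration only:

```python
import numpy as np

# Sketch: aggregate an annual peak-streamflow series to non-overlapping
# 30-year (climatic-scale) means, then look at the sign of the change
# between successive windows. The data here is synthetic.
rng = np.random.default_rng(0)
annual_peaks = rng.gamma(shape=2.0, scale=500.0, size=90)  # stand-in data

climatic_means = annual_peaks.reshape(3, 30).mean(axis=1)
print("30-year means:", np.round(climatic_means, 1))
print("sign of change between windows:", np.sign(np.diff(climatic_means)))
```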

That “common perception” is central to the theme of “global climate disruption”, a phrase started by John P. Holdren in this presentation, and one of the new buzzwords, after “global warming” and “climate change”, used to convey alarm.

Like Holdren, many people who subscribe to doomsday scenarios related to AGW seem to think that severe weather is happening more frequently. For someone whose perception is not steeped in the history of television technology, web technology, and mass media, which have been my domains of avocation and business, I can see how one might think this. I’ve touched on this subject before, but it bears repeating again and in more detail.

Let’s consider how we might come to think that severe weather is more frequent than before. Using this Wikipedia timeline as a start, I’ve created a timeline that tracks the earliest communications to the present, adding also severe weather events of note and weather and news technology improvements for context.

  • Prior to 3500BC – Communication was carried out through paintings of indigenous tribes.
  • 3500s BC – The Sumerians develop cuneiform writing and the Egyptians develop hieroglyphic writing
  • 16th century BC – The Phoenicians develop an alphabet
  • AD 26-37 – Roman Emperor Tiberius rules the empire from island of Capri by signaling messages with metal mirrors to reflect the sun
  • 105 – Tsai Lun invents paper
  • 7th century – Hindu-Malayan empires write legal documents on copper plate scrolls, and write other documents on more perishable media
  • 751 – Paper is introduced to the Muslim world after the Battle of Talas
  • 1305 – The Chinese develop wooden block movable type printing
  • 1450 – Johannes Gutenberg finishes a printing press with metal movable type
  • 1520 – Ships on Ferdinand Magellan‘s voyage signal to each other by firing cannon and raising flags.
  • 1776 The Pointe-à-Pitre hurricane was at one point the deadliest Atlantic hurricane on record. At least 6,000 fatalities occurred on Guadeloupe, which was a higher death toll than any known hurricane before it. It also struck Louisiana, but there was no warning nor knowledge of the deaths on Guadeloupe when it did. It also affected Antigua and Martinique early in its duration.
  • 1780 – The Great Hurricane of 1780, also known as Hurricane San Calixto is considered the deadliest Atlantic tropical cyclone of all time. About 22,000 people died when the storm swept over Martinique, St. Eustatius and Barbados between October 10 and October 16. Thousands of deaths also occurred offshore. Reports of this hurricane took weeks to reach US newspapers of the era.
  • 1793 – Claude Chappe establishes the first long-distance semaphore telegraph line
  • 1812 – The Aug. 19, 1812 New Orleans Hurricane didn’t appear in the Daily National Intelligencer (Washington, DC) until late September. Daily National Intelligencer, Sept. 22, 1812, p. 3: “Dreadful Hurricane. The following letters present an account of the ravages of one of those terrific storms to which the Southern extreme of our continent is so subject. Extract of a letter from Gen. Wilkinson, dated New Orleans, August 22.”
  • 1831 – Joseph Henry proposes and builds an electric telegraph
  • 1835 – Samuel Morse develops the Morse code
  • 1843 – Samuel Morse builds the first long distance electric telegraph line
  • 1844 – Charles Fenerty produces paper from a wood pulp, eliminating rag paper which was in limited supply
  • 1849 – Associated Press organizes Nova Scotia pony express to carry latest European news for New York newspapers
  • 1851 – The New York Times newspaper founded
  • 1876 – Alexander Graham Bell and Thomas A. Watson exhibit an electric telephone in Boston
  • 1877 – Thomas Edison patents the phonograph
  • 1889 – Almon Strowger patents the direct dial telephone
  • 1901 – Guglielmo Marconi transmits radio signals from Cornwall to Newfoundland
  • 1906 – Reginald Fessenden used a synchronous rotary-spark transmitter for the first radio program broadcast, from Ocean Bluff-Brant Rock, Massachusetts. Ships at sea heard a broadcast that included Fessenden playing O Holy Night on the violin and reading a passage from the Bible.
  • 1914 – Teletype introduced as a news tool. The Associated Press introduced the “telegraph typewriter” or teletype into newsrooms in 1914, making transmission of entire ready-to-read news stories available worldwide.
  • 1920 – The first radio news program was broadcast August 31, 1920 by station 8MK in Detroit, Michigan, which survives today as all-news format station WWJ under ownership of the CBS network.
  • 1925 – John Logie Baird transmits the first television signal
  • 1928 – NBC completed the first permanent coast-to-coast radio network in the United States, linked by telephone circuits
  • 1935 – Associated Press launched the Wirephoto network, which allowed transmission of news photographs over telephone lines on the day they were taken.
  • 1942 – Hedy Lamarr and George Antheil invent frequency hopping spread spectrum communication technique
  • 1946 – The DuMont Television Network, which had begun experimental broadcasts before the war, launched what Newsweek called “the country’s first permanent commercial television network” on August 15, 1946
  • 1947 – Douglas H. Ring and W. Rae Young of Bell Labs proposed a cell-based approach, which led to “cellular phones”
  • 1947 – July 27th. The WSR-1 weather surveillance radar, cobbled together from spare parts of the Navy AN/APS-2F radar, was put into service in Norfolk, NE. It was later replaced by improved models WSR-3 and WSR-4
  • 1948 – Network TV news begins. Launched in February 1948 by NBC, Camel Newsreel Theatre was a 10-minute program anchored by John Cameron Swayze, and featured newsreels from Movietone News. CBS soon followed suit in May 1948 with a 15-minute program, CBS-TV News, anchored by Douglas Edwards and subsequently renamed Douglas Edwards with the News.
  • 1948 – The first successful “tornado forecast” was issued, successfully predicting the 1948 Tinker Air Force Base tornadoes, two tornadoes which struck Tinker Air Force Base in Oklahoma City, Oklahoma on March 20 and March 25.
  • 1953 – Donald Staggs, an electrical engineer working for the Illinois State Water Survey, made the first recorded radar observation of a “hook echo” associated with a tornadic thunderstorm.
  • 1957 – The WSR-57, the first “modern” weather radar, is commissioned by the U.S. Weather Bureau
  • 1958 – Chester Carlson presents the first photocopier suitable for office use
  • 1960 – TIROS-1, the first successful weather satellite and the first of a series of Television Infrared Observation Satellites, was launched at 6:40 AM EST on April 1, 1960 from Cape Canaveral, Florida.
  • 1962 – The first satellite television signal was relayed from Europe to the Telstar satellite over North America.
  • 1963 – First geosynchronous communications satellite is launched, 17 years after Arthur C. Clarke‘s article
  • 1963 – CBS Evening News establishes the standard 30-minute network news broadcast. On September 2, 1963, the show expanded from 15 to 30 minutes.
  • 1966 – Charles Kao realizes that silica-based optical waveguides offer a practical way to transmit light via total internal reflection
  • 1967 – The National Hurricane Center is established in the Miami, FL National Weather Service Forecast Office.
  • 1969 – The first hosts of ARPANET, Internet‘s ancestor, are connected.
  • 1969 – August 14-22 Hurricane Camille, a Category 5 storm, gets widespread network news coverage from correspondents “on the scene”.
  • 1969 – CompuServe, an early dialup text-based bulletin board system, is launched in Columbus, Ohio, initially serving just that city.
  • 1971 – Erna Schneider Hoover invented a computerized switching system for telephone traffic.
  • 1971 – Ray Tomlinson is generally credited as having sent the first email across a network, initiating the use of the “@” sign to separate the names of the user and the user’s machine.
  • 1972 – Radio Shack stores introduce “The Weather Cube”, the first mass-marketed weather alert radio (page 77 here), allowing citizens to get weather forecasts and bulletins in their homes for only $14.95
  • 1974 April 3rd – WCPO-TV in Cincinnati carries the “Sayler Park Tornado” live on television as it was crossing the Ohio river. It was part of the biggest tornado super outbreak in history. It is the largest tornado outbreak on record for a single 24-hour period. From April 3 to April 4, 1974, there were 148 tornadoes confirmed in 13 US states. Lack of timely warnings demonstrated the need for an expanded NOAA weather radio warning system.
  • 1974 – The first Synchronous Meteorological Satellite SMS-1 was launched May 17, followed later by GOES-1 in 1975.
  • 1974 – The WSR-74, the second modern radar system, is put into service at selected National Weather Service offices in the United States and exported to other countries.
  • 1975 – The Altair 8800, the world’s first home computer kit, was introduced in the January edition of Popular Electronics
  • 1975-1976 – NOAA Weather Radio network expanded from about 50 transmitters to 330, with a goal of reaching 70 percent of the populace with storm warning broadcasts.
  • 1977 – Radio Shack introduces a weather radio with built in automatic alerting that will sound off when the National Weather Service issues an alert on the new expanded NOAA Weather Radio network with over 100 stations. Page 145 here
  • 1977 – The Apple II, one of the first highly successful mass-produced home microcomputers was introduced.
  • 1978 – NOAA Weather Radio receivers with automatic audio insertion capabilities for radio and TV audio began to become widely installed.
  • 1979 – The first commercially automated cellular network (1G) is launched in Japan by NTT, initially in the metropolitan area of Tokyo. Within five years, the NTT network had been expanded to cover the whole population of Japan, becoming the first nationwide 1G network.
  • 1980 – Cable News Network (CNN) is founded by Ted Turner. Upon its launch, CNN was the first channel to provide 24-hour television news coverage, and the first all-news television channel in the United States.
  • 1980 – A heat wave hits much of the United States, killing as many as 1,250 people in one of the deadliest heat waves in history.
  • 1981 – Home satellite dishes and receivers on C-band start to become widely available.
  • 1981 – The IBM Personal Computer (IBM model number 5150) is introduced on August 12, 1981; it set a standard for x86 systems still in use today.
  • 1982 – May 2nd: The Weather Channel (TWC) is launched by John Coleman and Joe D’Aleo with 24-hour broadcasts of computerized weather forecasts and weather-related news.
  • 1983 – Sony releases the first consumer camcorder, the Betamovie BMC-100P.
  • 1983 – America Online (then Control Video Corporation, Vienna, Virginia) debuts as a nationwide bulletin board system featuring email.
  • 1983 – The first 1G cellular telephone network launched in the USA was Chicago-based Ameritech using the Motorola DynaTAC mobile phone.
  • 1984 – The Apple Macintosh computer, with a built in graphical interface, was announced. The Macintosh was introduced by the now famous US$1.5 million Ridley Scott television commercial, “1984“. The commercial most notably aired during the third quarter of Super Bowl XVIII on 22 January 1984 and is now considered a “watershed event”.
  • 1985 – Panasonic, RCA, and Hitachi begin producing camcorders that record to full-sized VHS cassettes and offer up to 3 hours of record time. TV news soon begins to air video of news and weather events submitted by members of the public.
  • 1986 – July 18th: KARE-TV in Minneapolis dispatches a news helicopter to catch video of a tornado in progress, carried live at 5:13 PM during its news broadcast.
  • 1988 – Doppler Radar goes national – the construction of a network consisting of 10 cm (4 in) wavelength radars, called NEXRAD or WSR-88D (Weather Service Radar 1988 Doppler), was started.
  • 1989 – Tim Berners-Lee and Robert Cailliau built the prototype system which became the World Wide Web at CERN
  • 1989 – August: Sony announces the ProMavica (Magnetic Video Camera) electronic still camera, considered the first widely available electronic camera able to load images to a computer via floppy disk.
  • 1991 – Anders Olsson transmits solitary waves through an optical fiber with a data rate of 32 billion bits per second.
  • 1991 – The 1991 Perfect Storm hits New England as a Category 1 hurricane and causes $1 billion in damage. Covered widely on TV and in print, it later became a movie starring George Clooney.
  • 1992 – Neil Papworth sends the first SMS (or text message).
  • 1992 – August 16–28: Hurricane Andrew, spotted at sea with weather satellites, is given nearly continuous coverage on CNN and other network news outlets as it approaches Florida. Live TV news via satellite as well as some Internet coverage is offered. It was the first Category 5 hurricane imaged on NEXRAD.
  • 1993 – The Great Mississippi Flood is carried on network television as levees breached; millions of viewers watched the flood in real time and near real time.
  • 1994 – Home satellite service DirecTV launched on June 17th
  • 1994 – An initiative by Vice President Gore raised the NOAA Weather Radio warning coverage to 95 percent of the US populace.
  • 1995 – The Weather Underground website was launched
  • 1995 – DSL (Digital Subscriber Line) began to be implemented in the USA
  • 1996 – Home satellite service Dish Network launched on March 4th
  • 1996 – The Internet2 organization is created.
  • 1996 – Fox News Channel is launched on October 7, 1996 with 24-hour news coverage.
  • 1996 – The movie “Twister” is released on May 10, showing the drama and science of severe weather chasing in the US Midwest.
  • 1999 – Dr. Kevin Trenberth posts a report and web essay titled The Extreme Weather Events of 1997 and 1998, citing “global greenhouse warming” as a cause. Trenberth recognizes “wider coverage” but dismisses it, saying: “While we are indeed exposed to more and ever-wider coverage of the weather, the nature of some of the records being broken suggests a deeper explanation: that real changes are under way.”
  • 2002 – The Google News page is launched in March. It was later updated so that users can request e-mail “alerts” on various keyword topics by subscribing to Google News Alerts.
  • 2004 – December: A freak snowstorm hits the southernmost parts of Texas and Louisiana in the hours leading up to December 25, dumping snow on regions that do not normally see winter snowfall, in what is called the 2004 Christmas Eve Snowstorm.
  • 2004 – DSL becomes widely adopted in the USA, making broadband Internet connections affordable for most homes.
  • 2004 – On November 19, the website “Real Climate” is introduced, backed by Fenton Communications, to sell the idea of climate change from “real scientists”.
  • 2004 – December: The website “Climate Audit” is launched.
  • 2005 – August: Hurricane Katrina caused catastrophic damage along the Gulf Coast of the United States, forcing the effective abandonment of southeastern Louisiana (including New Orleans) for up to 2 months and damaging oil wells that sent gas prices in the U.S. to an all-time record high. Katrina killed at least 1,836 people and caused at least $75 billion US in damages, making it one of the costliest natural disasters of all time. TV viewers worldwide watched the storm strike in real time; Internet coverage was also timely and widespread.
  • 2006 – Al Gore’s movie An Inconvenient Truth premiered at the 2006 Sundance Film Festival, opening in New York City and Los Angeles on May 24. It went on to limited theater release and home DVD release. It was the first entertainment film about global warming as a “crisis”, with Hurricane Katrina prominently featured as a “result” of global warming.
  • 2006 – The short instant-messaging service Twitter was launched July 15, 2006.
  • 2006 – November 17th, Watts Up With That was launched.
  • 2007 – The iPhone, with graphics and Twitter instant-messaging capabilities, was released on June 29, 2007.
  • 2007 – The reality show “Storm Chasers” debuts on the Discovery Channel on October 17, 2007, showing severe weather pursuit as entertainment.
  • 2007 – On October 10th, in Dimmock v Secretary of State for Education and Skills, Al Gore’s An Inconvenient Truth was challenged in a UK court and found to have nine factual errors. It was the first time “science as movie” had been legally challenged.
  • 2008 – The Super Tuesday tornado outbreak was a deadly outbreak affecting the Southern United States and the lower Ohio Valley from February 5 to February 6, 2008. With more than 80 confirmed tornadoes and 58 deaths, it was the deadliest US outbreak since the May 31, 1985 outbreak that killed 76 across Ohio and Pennsylvania. It was widely covered live on US media.
  • 2010 – A heat wave in Russia was widely reported by global media as a direct result of “global warming”. Scientific research from NOAA released later in 2010 and 2011 showed that claim to be false.
  • 2011 – On January 4th, the Pew Research Center released a poll showing that the Internet had surpassed television as the preferred source for news, especially among younger people.
  • 2011 – March: Notice of an earthquake off the coast of Japan was blogged in near real-time thanks to a USGS email alert before TV news media picked up the story, followed by a tsunami warning. A Japanese TV news helicopter with a live feed was dispatched and showed the tsunami live as it approached the coast of Japan and hit the beaches. Carried by every major global news outlet plus live-streamed on the Internet, it was the first time a tsunami of this magnitude was seen live on global television before it impacted land.

Compare the reach and speed of communications and news reporting at the beginning of this timeline to that around the beginning of the 20th century. Then compare that to the beginning of the 21st century, and again to what we’ve seen in the last 10 years.

With such global coverage, instant messaging, and Internet-enabled phones with cameras, is it any wonder that nothing related to severe weather or disaster escapes our notice anymore? Without considering this technological change in our society, it would certainly seem as if severe weather events and disasters are becoming much more frequent.

To borrow and modify a famous phrase from James Carville:

It’s the technology, stupid.

This speaks to the phrase “despite common perception” that I highlighted at the beginning. The rapidly rising curve of weather-tracking and communications technology feeds our “common perception” of severe weather events. The reality of severe weather frequency, though, is different: while we may see more of it, that is because there are millions more eyes, ears, cameras, and networks than ever before.

1. There are fewer tornadoes in the USA

2. Nor is global hurricane frequency rising: 12-month running sums of hurricane frequency (chart by Dr. Ryan N. Maue, FSU). A short sketch of the running-sum computation follows this list.

3. And now, back to the original seed for this long thread: no trend in global flooding events:

Destructive floods observed in the last decade all over the world have led to record high material damage. The conventional belief is that the increasing cost of floods is associated with increasing human development on flood plains (Pielke & Downton, 2000). However, the question remains as to whether or not the frequency and/or magnitude of flooding is also increasing and, if so, whether it is in response to climate variability and change.

Several scenarios of future climate indicate a likelihood of increased intense precipitation and flood hazard. However, observations to date provide no conclusive and general proof as to how climate change affects flood behaviour.
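
For readers unfamiliar with the “12-month running sums” in Maue’s hurricane chart (item 2 above), here is a minimal sketch of that computation in Python. The monthly storm counts below are synthetic placeholders, not Maue’s actual data.

```python
# Minimal sketch: a 12-month running sum of monthly hurricane counts.
# The series here is synthetic; Maue's chart uses real global records.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
months = pd.date_range("1980-01", periods=420, freq="MS")
monthly_counts = pd.Series(rng.poisson(4, len(months)), index=months)

# Each point is the total number of storms over the preceding 12
# months, which removes the seasonal cycle and exposes any trend.
running_sum = monthly_counts.rolling(window=12).sum()
print(running_sum.dropna().head())
```

Summing over a trailing 12-month window removes the strong seasonal cycle in storm counts, so any long-term trend, or its absence, stands out.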

Finally, this parting note.

Our world has seen the explosion of TV news networks, Internet news websites, personal cameras and recording technology, and smartphones with cameras, along with the ability to submit a photo, movie, or live video feed from virtually anywhere, anytime. Weather and disaster reporting is now instant and on the scene, to the point where a tornado live on TV is becoming a ho-hum event. Yet one set of elusive phenomena still hasn’t seen an increase in credible reporting and documentation:

UFOs, the Loch Ness Monster, and Bigfoot.

We still haven’t seen anything credible from the millions of extra electronic eyes and ears out there, and people still marvel over old grainy images. You’d think if they were on the increase, we’d know about it. 😉

Crossposted from WattsUpWithThat. If you are a science buff, and a weather/climate buff especially, you should be visiting WUWT regularly, the world’s most widely-read climate site.

Stunning map of NOAA data showing 56 years of tornado tracks sheds light on the folly of linking “global warming” to severe weather
by Anthony Watts, WUWT

…has been turned into a stunning image of the United States. Each line represents an individual tornado, while the brightness of the line represents its intensity on the Fujita Scale. The result, rendered by John Nelson of the IDV User Experience, shows some interesting things, especially the timeline bar graph that accompanies the map, which shows that the majority of US tornado-related deaths and injuries (prior to the 2011 outbreak, which isn’t in this dataset) happened in the 1950s to the 1970s. This is a testament to NEXRAD Doppler radar, improved forecasting, and better warning systems combined with improved media coverage.

Here’s the data description; the big map of the CONUS follows below.

The National Weather Service (NWS) Storm Prediction Center (SPC) routinely collects reports of severe weather and compiles them, with public access, from the database called SeverePlot (Hart and Janish 1999) with a Geographic Information System (GIS). The composite SVRGIS information is made available to the public primarily in .zip files of approximately 50MB size. The files located at the access point contain track information regarding known tornadoes during the period 1950 to 2006. Although available to all, the data provided may be of particular value to weather professionals and students of meteorological sciences. An instructional manual is provided on how to build and develop a basic severe weather report GIS database in ArcGIS and is located at the technical documentation site contained in this metadata catalog.
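
As a rough illustration of what building and rendering such a track database involves, here is a minimal sketch in Python using geopandas and matplotlib (not the tools Nelson used; his rendering adds glow effects and the timeline bar graph). The shapefile name and the “MAG” Fujita-rating column are assumptions about the SVRGIS export, not confirmed field names.

```python
# Minimal sketch: draw tornado tracks, brighter for stronger tornadoes.
# Assumes the SVRGIS zip has been downloaded and extracted; the file
# name and the "MAG" (Fujita rating) column are assumptions, not
# confirmed field names from the actual export.
import geopandas as gpd
import matplotlib.pyplot as plt

tracks = gpd.read_file("svrgis_tornado_tracks_1950_2006.shp")  # hypothetical name

fig, ax = plt.subplots(figsize=(12, 7), facecolor="black")
ax.set_axis_off()

for rating, group in tracks.groupby("MAG"):
    f = max(int(rating), 0)              # guard against unrated (-9) entries
    shade = 0.25 + 0.12 * f              # F0 dim grey .. F5 near white
    group.plot(ax=ax, color=(shade, shade, shade), linewidth=0.3 + 0.25 * f)

plt.savefig("tornado_tracks.png", dpi=200, facecolor="black")
```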

It is also worth noting that the distribution of strong versus weak tornadoes (rated by the Fujita scale) is greatly lopsided, with the weakest tornadoes far outnumbering the strong killer F5 tornadoes (such as we saw in 1974 and 2011, both cooler La Niña years) by at least an order of magnitude:
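
Continuing the sketch above, this lopsidedness is easy to check from the same (assumed) attribute table:

```python
# Count tracks per Fujita rating; expect counts to fall off sharply
# from F0/F1 (tens of thousands) to F5 (a few dozen). "MAG" is the
# same hypothetical column name as in the sketch above.
counts = tracks[tracks["MAG"] >= 0]["MAG"].value_counts().sort_index()
print(counts)
```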

And here’s the entire map; click for a very hi-resolution version:

Mike Smith covers a lot of the history contained in this data set in his book Warnings: The True Story of How Science Tamed the Weather.

He talks about the vast improvements we’ve witnessed since the early days of severe weather forecasting, and the book is well worth a read if you want to understand severe weather in the USA and how detection and warning methods have evolved. He has another book just out (reviewed by Pielke Sr.) that explains the failure of this system in Joplin in 2011.

In Mike Smith’s first book, “Warnings: The True Story of How Science Tamed the Weather,” we learned the only thing separating American society from triple-digit fatalities from tornadoes, weather-related plane crashes, and hurricanes is the storm warning system that was carefully crafted over the last 50 years. That acclaimed book, as one reviewer put it, “made meteorologists the most unlikely heroes of recent literature.” But, what if the warning system failed to provide a clear, timely notice of a major storm? Tragically, that scenario played out in Joplin, Missouri, on May 22, 2011. As a wedding, a high school graduation, and shopping trips were in progress, an invisible monster storm was developing west of the city. When it arrived, many were caught unaware. One hundred sixty-one perished and one thousand were injured. “When the Sirens Were Silent” is the gripping story of the Joplin tornado. It recounts that horrible day with a goal of insuring this does not happen again.

Of course, alarmists like Peter Gleick (who knows little about operational meteorology and is prone to law-breaking) like to tell us severe weather (and days like Joplin) is a consequence of global warming, saying at the Huffington Post:

“More extreme and violent climate is a direct consequence of human-caused climate change (whether or not we can determine if these particular tornado outbreaks were caused or worsened by climate change).”

But in this story from Physorg.com:

“If you look at the past 60 years of data, the number of tornadoes is increasing significantly, but it’s agreed upon by the tornado community that it’s not a real increase,” said Grady Dixon, assistant professor of meteorology and climatology at Mississippi State University.

“It’s having to do with better (weather tracking) technology, more population, the fact that the population is better educated and more aware. So we’re seeing them more often,” Dixon said.

But he said it would be “a terrible mistake” to relate the up-tick to climate change.

Again, for a full understanding I urge readers to click, read, and distribute these two WUWT essays:

The folly of linking tornado outbreaks to “climate change”

Why it seems that severe weather is “getting worse” when the data shows otherwise – a historical perspective
