Thursday, November 1, 2018

The IPCC Special Report

The Intergovernmental Panel on Climate Change (IPCC) was created by the United Nations in 1988, as climate scientists worldwide started to lobby for more action on global warming.  Every five or six years, the IPCC releases an Assessment Report consisting of three 1000-page books covering the current state of the physical science, the impacts and how to adapt to them, and how to mitigate the warming itself.  The fifth and most recent Assessment Report was completed in 2014.  The IPCC does not fund climate research; rather, it reports on papers that have been published and summarizes the results in an organized manner so that the general public can access the information.  The Paris Agreement in 2015 commissioned the IPCC to look at the effects of a world where global warming reached 1.5ºC (2.7ºF) above pre-industrial levels, and how different the world would be if the warming stopped at 2ºC (3.6ºF) instead.  On October 6, the IPCC released a special report called Global Warming of 1.5 °C: an IPCC special report on the impacts of global warming of 1.5 °C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty.  As with all the IPCC reports, it was accompanied by a summary for policymakers, which I am summarizing here.  Policymakers in democracies are ultimately answerable to voters, of course, so it is important that voters understand the general conclusions as well.

The report begins by stating that “Human activities are estimated to have caused approximately 1.0°C of global warming above pre-industrial levels, with a likely range of 0.8°C to 1.2°C. Global warming is likely to reach 1.5°C between 2030 and 2052 if it continues to increase at the current rate.”  This estimate is comparable to what has been published in the relatively recent past.  For example, a 2017 paper written by a team led by James Hansen showed that the rate of temperature increase, using an eleven-year running mean to smooth out natural variability, has been essentially linear since 1970 at a value of 0.18ºC per decade.  A continuation of this rate would put the world over the 1.5ºC threshold in the 2040s and over 2.5ºC by the end of the century, but as Hansen further explained this past month, there are reasons to think the rate of warming will accelerate in the meantime.  Why do these numbers matter?  The effects of continued warming are potentially numerous.  While some of these remain a subject of ongoing scientific debate, we can talk about rising sea levels with a high amount of confidence because we know where these sea levels were the last time the Earth experienced similar temperatures.  The last time the Earth was as warm as it is now, for example, was during the Eemian interglacial period, from 130,000 to 115,000 years ago.  Sea levels then stood 6 to 9 meters (about 20 to 30 feet) higher than they are now.  I live on Long Island, to the east of New York City.  There would not be much of Long Island left if the oceans rose that high.  But if the temperature increase does accelerate without being checked, temperatures could reach 3ºC (5.4ºF) above pre-industrial levels by 2100.  The last time temperatures were this high was during the Pliocene Epoch, about three million years ago, and sea levels were about 25 m (80 feet) higher then.  Sea levels would take quite a bit of time (centuries at least) to rise by that much, but if warmer temperatures are sustained for a prolonged period of time, the rise will become progressively harder to stop.  And given the number of people in the world who live close to a coast, the degree of upheaval will be massive.
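As a sanity check on those dates, the linear extrapolation is easy to spell out.  Here is a minimal sketch in Python; the 1.0ºC baseline and the 0.18ºC-per-decade rate are the figures quoted above, while the assumption that the trend stays linear is mine (and, as Hansen notes, warming may instead accelerate):

    # Back-of-the-envelope extrapolation of a linear warming trend.
    # Assumes ~1.0 C of warming as of 2017 and a constant 0.18 C/decade rate.
    baseline_year = 2017
    baseline_warming = 1.0      # degrees C above pre-industrial
    rate_per_year = 0.018       # 0.18 C per decade

    def crossing_year(threshold_c):
        """Year at which the trend line reaches a given warming threshold."""
        return baseline_year + (threshold_c - baseline_warming) / rate_per_year

    print(round(crossing_year(1.5)))    # ~2045, i.e. the 2040s
    print(round(baseline_warming + rate_per_year * (2100 - baseline_year), 1))  # ~2.5 C by 2100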

But is there still some hope of keeping temperatures from rising above 1.5ºC, or at least 2ºC?  According to the recent report, “warming from anthropogenic emissions from the pre-industrial period to the present will persist for centuries to millennia and will continue to cause further long-term changes in the climate system, such as sea level rise, with associated impacts (high confidence), but these emissions alone are unlikely to cause global warming of 1.5°C (medium confidence).”  Some degree of continued warming is inevitable regardless of how quickly carbon dioxide emissions are reduced, mainly because carbon dioxide has a half-life in the atmosphere of about fifty years.  In other words, half the carbon dioxide emitted in 1968 is still in the air today.  So it will take time to clean up the atmosphere fully even in the best-case scenario.  But the best-case scenario can still keep warming under 1.5ºC — provided that the people of the world act with a sense of urgency.
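To see what that fifty-year half-life implies, the decay can be written out as a one-line function.  This is a sketch of the simple approximation used above; real carbon-cycle chemistry leaves a long-lived tail that decays much more slowly:

    # Fraction of an emitted CO2 pulse still airborne after t years,
    # under the simple 50-year half-life approximation quoted above.
    def fraction_remaining(t_years, half_life=50.0):
        return 0.5 ** (t_years / half_life)

    print(fraction_remaining(50))    # 0.5   -> half of 1968's CO2 still aloft in 2018
    print(fraction_remaining(150))   # 0.125 -> an eighth would remain after 150 years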

The report discusses a series of emissions reduction scenarios, along with their likelihood of getting global temperatures to stabilize with warming below 1.5ºC by 2100.  The presented scenario that keeps warming under the Paris Agreement’s preferred limit involves reducing global carbon dioxide emissions to net zero before 2055, while beginning aggressive reductions in the warming contributions from other sources by 2030.  Black carbon (soot) aerosols and gases like methane do not contribute as much as carbon dioxide to global warming, but they still contribute a substantial amount and cannot be overlooked.

What kind of sea level rise can we expect to see in the short term?  “By 2100, global mean sea level rise is projected to be around 0.1 meter lower with global warming of 1.5°C compared to 2°C (medium confidence). Sea level will continue to rise well beyond 2100 (high confidence), and the magnitude and rate of this rise depends on future emission pathways. A slower rate of sea level rise enables greater opportunities for adaptation in the human and ecological systems of small islands, low-lying coastal areas and deltas (medium confidence).”  The error bars are substantial with sea level rise, but two points need to be made.  The first is that some degree of rise is inevitable, as the ice sheets are still adjusting to the warming that has already happened.  The second is that the amount of sea level rise will ultimately depend on the temperature at which the Earth stabilizes, and the time it takes to get to stabilization.

What other changes besides sea level can we expect to see in a warming world, and how much difference will limiting the warming to 1.5ºC make compared to a warming of 2ºC?  The report states that “climate models project robust differences in regional climate characteristics between present-day and global warming of 1.5°C, and between 1.5°C and 2°C. These differences include increases in: mean temperature in most land and ocean regions (high confidence), hot extremes in most inhabited regions (high confidence), heavy precipitation in several regions (medium confidence), and the probability of drought and precipitation deficits in some regions (medium confidence).”  This point deals with changes that are already being observed, including in the northeastern United States: the entire northeast has seen an increase in precipitation over the last 30 years relative to the first half of the 1900s, and there are large stretches of New England where the increase exceeds 10%.  The physical explanation for this is that higher temperatures mean more water vapor gets evaporated, and more water vapor going up means more precipitation coming down.  These trends will strengthen as the Earth gets warmer, but can be mitigated if the temperature is stabilized.  Other impacts that will grow with a temperature increase of 1.5ºC, and grow further at 2.0ºC, include harm to biodiversity and ecosystems, rising ocean temperature and acidity, and climate-related risks to health, livelihoods, food security, water supply, human security, and economic growth.  And the difficulty of adapting to the changes, naturally, will increase as well.
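The water-vapor argument can be made quantitative with the Clausius–Clapeyron relation, a standard piece of atmospheric physics rather than anything specific to the report.  It gives the fractional growth of the saturation vapor pressure e_s with temperature:

    \frac{1}{e_s}\frac{de_s}{dT} = \frac{L_v}{R_v T^2} \approx \frac{2.5\times10^{6}\ \mathrm{J/kg}}{(461\ \mathrm{J\,kg^{-1}\,K^{-1}})(288\ \mathrm{K})^{2}} \approx 0.065\ \mathrm{K^{-1}}

In other words, the atmosphere can hold roughly 6-7% more water vapor for every degree Celsius of warming near the surface, which is why heavy-precipitation trends track temperature so closely.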

One graph in the report lays out the risks to “people, economies and ecosystems across sectors and regions” as the world warms.  Some risks are closer than others; for example, the risk of severe impacts to warm water corals is already high and almost certain to become very high.  The risk of coastal flooding becomes high already at 1.5ºC of warming, and the risk of major flooding from rivers becomes high with a warming of 2ºC.  The risks of severe damage to crop yields and of heat-related mortality also become high at a warming of 2ºC.

While the IPCC acknowledges that any pathway to keeping warming below 1.5ºC will require swift and definitive action, the report outlines several possible pathways.  One involves major innovations in efficiency that enable rapid decarbonization and make it possible to reduce carbon dioxide levels in the atmosphere simply by adding trees.  The second involves a general change in people’s consumption patterns, along with technological innovations in artificial means of removing carbon from the atmosphere.  The third is a combination of the two, and the fourth is a high initial overshoot of 1.5ºC that is overcome by technological innovations in carbon removal.  “Avoiding overshoot and reliance on future large-scale deployment of carbon dioxide removal (CDR) can only be achieved if global CO2 emissions start to decline well before 2030 (high confidence).”  As the old proverb puts it, a stitch in time saves nine.  The sooner we act to control temperatures, the easier it will be.  And it is also much cheaper to plant a tree than to invest heavily in new technology and hope it quickly reaches a point where it can work on a global scale.

The catch in all this, of course, is that forests require land, cheap solar and wind power require land (at least at present), and food for a growing population rapidly approaching eight billion people requires land.  The world will need to walk a very delicate tightrope.  The task is daunting, and will require creative thinking on the part of many people across the globe.  It is still very possible, though, given sufficient will.




Wednesday, September 5, 2018

The debate over Antarctic Ice

Scientists know with plenty of confidence what is happening to the average temperature at the Earth’s surface, but the effects of global warming at specific locations are not always understood as clearly.  One location that continues to be a subject of extensive research and considerable debate is the continent of Antarctica.  You might expect that a large landmass covered with ice would be experiencing uniformly large losses of ice in a warming world, but the situation is more complicated and nuanced than that.  The most recent (2013) report of the Intergovernmental Panel on Climate Change (IPCC), the international body that reports on the state of climate science to world governments and to the public, concluded with high confidence that the Antarctic Ice Sheet on the whole has been losing ice.  The ice loss comes primarily from the Northern Antarctic Peninsula and the Amundsen Sea sector of West Antarctica (see Figure 1, taken from the IPCC report Climate Change 2013: The Physical Science Basis).  As a consequence, the lost ice is contributing to an acceleration in the rise of sea levels globally.  Note, however, that much of the land mass of Antarctica appears to be gaining some ice, and some sections of the continent are not presently warming.  This conclusion reflected the majority of the studies that measured ice changes in Antarctica, but a 2015 study based on satellite data drew a very different conclusion.  The debate over the difference in observed results continues three years later, but a pair of significant papers that came out in June might point toward a resolution.


Figure 1.  West Antarctica is the left side of this illustration, with the Northern Antarctic Peninsula sticking up on the top left.

The major disruption to the consensus opinion of the Antarctic Ice Sheets came from a paper by a research team based at the NASA Goddard Space Flight Center (NASA/GSFC) in Greenbelt, MD.  Led by Jay Zwally, who in 2002 had published a significant paper about the disintegration of the Greenland Ice Sheet, the team looked at data that appeared to show that the Antarctic Ice Sheet was in fact gaining more mass than it was losing.  Much of the data for the study came from the Ice, Cloud, and Land Elevation Satellite (ICESat), a NASA satellite that operated between 2003 and 2009 and used the reflection of laser light to measure changes in height on the Earth’s surface, but other satellite data were used as well.  The data indicated that the height of the ice sheets was rising, not falling, and Zwally and his team concluded from this that the Antarctic Ice Sheets were experiencing a net gain in mass.  This conclusion caused a bit of a stir, to say the least.  Many climate change skeptics posted articles like this one about the result, taking it as proof that global warming was not as bad as “alarmists” in the IPCC were suggesting.  Zwally himself objected to his team’s research being used in such a manner.  At any rate, if the conclusions of the paper held up, it would not change the rate of observed sea level rise; it would, as commentator Jonathan Bamber from the University of Bristol pointed out in a guest post for the RealClimate blog, “make closing the sea level budget a whole lot harder (that is, making the sum of the sinks and sources match the observed rate of sea level rise).”  Ultimately, what that meant was that more research was necessary to understand what caused the discrepancies between what the plurality of research papers cited in the IPCC report had concluded and what Zwally and his team determined from their satellite data.

In their June 14 issue this year, Nature (the UK’s leading weekly journal of news and major findings within the scientific community) presented a series of papers on Antarctica.  Included in this series was “Mass balance of the Antarctic Ice Sheet from 1992 to 2017,” written collaboratively by a very large number of scientists working under the banner of the Ice sheet Mass Balance Inter-comparison Exercise (IMBIE).  The IMBIE team compiled 24 different estimates of ice sheet balance.  Seven used satellite measurements of surface height, as the Zwally study did, while fifteen used measurements of the strength of the Earth’s gravity at different locations.  The remaining two used the input-output method, where measurements of incoming snowfall are compared against the size of icebergs that have broken off.  The consensus conclusion was that Antarctica has lost ice after all, causing a sea level rise of 7.6 ± 3.9 millimeters globally over the last 25 years.  The findings of the Zwally paper were acknowledged in the appendix, but it was clear that the Zwally paper was an outlier compared to the other studies.

But if the Zwally study’s status as an outlier comes from something that makes its conclusions inaccurate, it still needs to be determined what the source of that inaccuracy may be.  The week after the IMBIE paper was published in Nature, Science (a major weekly publication like Nature, put out by the American Association for the Advancement of Science) published an article from a research team led by Valentina Barletta of the Technical University of Denmark that looked at how bedrock in West Antarctica rises as the weight of the ice above it diminishes due to melting.  (The weight of the ice puts enough pressure on the underlying rock to make it compress, and removing some of the ice causes the rock to decompress.  Both present-day ice melting, and past melting of ice as the Earth emerged from the last Ice Age, can contribute to present-day decompression.)  The team put GPS devices at the top of the bedrock at six different stations in West Antarctica, and found that the bedrock was rising at an average rate of 41 mm per year — much higher than had been anticipated.  This finding suggests that ice mass loss in the West Antarctic Ice Sheet to date has been underestimated.  But on a more optimistic note, it also suggests that the sheet is more stable than previously thought, and less likely to shed a very large amount of ice into the world’s oceans in a quick but catastrophic collapse.

So how do the two papers tie in to each other?  Zwally’s paper used a series of different values for the rate of bedrock rise, obtained from previous studies at different locations in Antarctica.  For West Antarctica as a whole, the rate of rise cited by Zwally was 26 mm per year.  If the Barletta study’s findings are correct, then the Zwally paper underestimated the bedrock rise by 15 mm/year.  Consequently, in order for the ice height measurements to be correct, an additional 15 mm per year of ice would be melting (or not accumulating in the first place) in West Antarctica.  Now there are some caveats here.  The Barletta study only examined a narrow part of Antarctica, not the continent as a whole.  Further studies on bedrock rise in the rest of the continent are likely forthcoming, however, and if they show that the Zwally study used estimates for the bedrock rise that are too low across the whole of Antarctica, then the conclusions of that paper can be reconciled with other research without needing to explain an error in either the ice height measurements from ICESat or any of the other data that resulted in different conclusions.
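The arithmetic connecting the two papers is straightforward bookkeeping, because a satellite altimeter sees the total surface height, which is bedrock plus ice.  Here is a minimal sketch using the two rates quoted above (illustrative only; neither analysis actually reduces to a single subtraction):

    # Altimetry measures total surface elevation change: bedrock uplift + ice change.
    bedrock_zwally = 26.0     # mm/yr of bedrock rise assumed for West Antarctica
    bedrock_barletta = 41.0   # mm/yr of bedrock rise measured at the GPS stations

    # For a fixed satellite measurement, a larger bedrock correction means the
    # inferred ice-height change shrinks by the difference:
    correction = bedrock_barletta - bedrock_zwally
    print(correction)         # 15 mm/yr less ice gain (or more loss) than inferred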

The study of the ice sheets in Antarctica is ongoing.  While apparent discrepancies in the findings of different research groups using different methods cannot presently be definitively explained, there is reason to hope that this puzzle will be solved in the relatively near future.


Saturday, June 23, 2018

Dr. Hansen Goes to Washington, Thirty Years Later

On June 23, 1988, Dr. James Hansen testified before a Senate Committee in Washington that a recent rising trend in global mean surface temperatures had exceeded the point where natural causes could adequately explain it, and was instead the result of a buildup of carbon dioxide in the atmosphere due to human activity.  Hansen was the director of the Goddard Institute for Space Studies (GISS), a NASA facility situated in New York City.  GISS had been founded in 1961 for the initial purpose of studying planetary atmospheres, and like most of the scientists who worked at GISS in the eighties, Hansen’s background was in planetary science.  He co-wrote one of the definitive works on the scattering of radiation in planetary atmospheres, and he was one of the first scientists to realize that Venus once had oceans but lost them to a runaway greenhouse effect.  His understanding of the greenhouse effect, coupled with the knowledge that levels of carbon dioxide were rising on Earth, eventually led him to focus his research on our own planet.  But Hansen didn’t go to Washington to present a brand new scientific breakthrough; the paper that he and his team at GISS were working on at the time extended the work published in previous papers on the present and future state of the Earth’s climate and the role of feedbacks in climate change.  Instead, he went to Washington to tell our government that action was required to prevent the worst consequences of global warming, including rising sea levels and more frequent droughts, from happening.  This had the effect of turning global warming into a political issue, and as such it has, despite clear scientific evidence of its reality, remained controversial for three decades.  

The paper that Hansen and the GISS team were preparing to publish at the time of Hansen’s trip to Washington is titled “Global Climate Changes as Forecast by Goddard Institute for Space Studies Three-Dimensional Model.”  (It would be published in the Journal of Geophysical Research in August.)  As the title suggests, the paper used a computer model to simulate temperatures over a hundred-year period, based on projected increases in carbon dioxide and other “greenhouse gases.”   Other properties not known to influence or be influenced by temperature were held fixed.  The simulations were done for three different scenarios.  To quote from the paper: “Scenario A assumed that growth rates of trace gas emissions typical of the 1970s and 1980s will continue indefinitely; the assumed annual growth averages about 1.5% of current emissions, so that the net greenhouse forcing increases exponentially.  Scenario B has decreasing trace gas growth rates, such that the annual increase of the greenhouse gas climate forcing remains approximately constant at the present level.  Scenario C reduces trace gas growth between 1990 and 2000 such that the greenhouse climate forcing ceases to increase after 2000.”  The paper goes on to say that “Scenario B is perhaps the most plausible of the three cases,” on the grounds that resource limitations and environmental concerns would make the growth rate of Scenario A increasingly hard to sustain.  And indeed, an assessment published last year showed that Scenario B would have levels of carbon dioxide in the atmosphere of 401 parts per million (ppm) in 2016, close to the real value of 404 ppm.  The scenario did not anticipate the strong reduction in ozone-depleting gases that also add to the greenhouse effect, so the overall imbalance in the planet’s energy budget was overestimated by about 10%.
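The three scenarios are distinguished by the shape of the greenhouse forcing curve over time.  The following sketch captures only those shapes, with made-up units and rates; it is not the forcing model from the 1988 paper:

    import math

    def forcing_A(t):
        # emissions growth continues indefinitely -> forcing grows exponentially
        return math.exp(0.015 * t)

    def forcing_B(t):
        # constant annual increment of forcing -> linear growth
        return 1.0 + 0.015 * t

    def forcing_C(t, cutoff=12):
        # growth reduced in the 1990s so forcing stops increasing after ~2000
        return forcing_B(min(t, cutoff))

    for t in (0, 25, 50):   # years after 1988
        print(t, round(forcing_A(t), 2), round(forcing_B(t), 2), round(forcing_C(t), 2))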



Figure 1

Figure 1 (taken directly from the 1988 paper) shows the modeled forecast of how global temperatures would evolve, and Figure 2 shows the actual result from the GISS temperature record.  Scenario B predicts a warming of a little over 1.1ºC relative to the 1951-1980 mean by 2020.  A recent analysis of the GISS temperature record by Hansen and several collaborators (he retired from GISS in 2013) indicates that an 11-year running mean of the temperature data since 1970 is remarkably linear; given a trend of 0.18ºC per decade, the actual temperature rise relative to the 1951-1980 mean is approaching 0.9ºC.  So the model projection somewhat overestimated reality over the last thirty years.  The GISS model used in 1988 had a higher sensitivity of global temperatures to greenhouse gas concentrations than most current models do, but of course current models have the benefit of thirty more years of data.
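For anyone who wants to reproduce the smoothing, an 11-year centered running mean takes only a few lines of NumPy.  This is a minimal sketch; the synthetic series below is a stand-in for the actual GISS annual anomalies:

    import numpy as np

    def running_mean(series, window=11):
        """Centered moving average; smooths out El Nino-scale variability."""
        kernel = np.ones(window) / window
        return np.convolve(series, kernel, mode="valid")

    # Stand-in data: a 0.018 C/yr trend plus year-to-year noise (not the real record)
    years = np.arange(1970, 2019)
    anomalies = 0.018 * (years - 1970) + np.random.normal(0.0, 0.1, years.size)

    smoothed = running_mean(anomalies)
    print(smoothed.round(2))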

Figure 2

One aspect of the model’s predictions that often gets overlooked is the natural variability in global mean temperature.  Scenario B clearly does not show a steadily increasing curve, but an alternating series of periods of rapidly rising temperature and periods of relative stability.  The paper included a caveat that the variability in the model slightly underestimated the real, observed natural variability.  I have commented in a previous blog post that the real temperature record looks more like a stepladder than a smooth curve, and that this is not out of line at all with what models have predicted.  The model suggests that this stepladder pattern will continue at least into the next decade, and I see no reason to think it won’t.  People will suggest, as was done during an extended period of relative stability in global mean temperatures during the decade of the 2000s, that global warming has either stopped or gone on “hiatus.”  Do not be fooled by that.


Thirty years ago today, the term “global warming” entered the public consciousness.  It was already well-known to climate scientists, having been coined by Wallace Broecker in a seminal 1975 paper.  The theory behind it was well understood, and by the late 1980s its signal could be clearly seen in the temperature record.  That signal has only gotten stronger, as Jim Hansen and many other climate scientists have predicted it would.  But once Hansen went to Washington and said that something needs to be done about it, global warming became a political issue at least as much as a scientific one.  By the time I first went to GISS as a graduate student in 1996, Hansen had become fed up with the inertia in the political system; he felt that people in Washington simply weren’t listening.  But if he was content at the time to go back to doing the science and communicating the results, that had changed by 2006 when the government started to interfere with his ability to discuss his results publicly.  Since then he has remained very active on both the political and scientific fronts, and he currently heads the Climate Science, Awareness, and Solutions Program at Columbia University’s Earth Institute.  And much more work needs to be done.  The scientific debate on global warming was already more or less resolved by 1988, but thirty years later, politicians (especially here in the United States) are still too bogged down debating whether it is happening to get down to actually doing something about it.  In the meantime, the planet has continued to warm.


Friday, March 30, 2018

Heat waves in the future

When people talk about the consequences of global warming, the first thing that usually gets discussed is rising sea levels.  However, the rising temperatures come with some significant direct consequences as well.  Rising temperatures by themselves will lead to an increase in dangerous heatwaves, but they will also allow more water to remain in the vapor phase in the air.  This increased humidity will make the human body’s removal of heat by perspiration more difficult.  The National Weather Service (NWS) quantifies this using a measure called the heat index.  A recent paper written by a research team led by UK scientist Tom Matthews used a modeling study to analyze changes in heat index as temperature changes, and came to the unfortunate conclusion that the heat index will increase more quickly than the temperature.  A second effect is that, due largely to the urban heat island effect, cities will warm up faster than their surrounding areas.  This puts large groups of people at a heightened risk of exposure to dangerous heat.  A paper from this past month, written by a team led by UC-Irvine professor Simon Papalexiou, uses the existing temperature record to show that the hottest temperature of the year at different locations has been increasing overall at a slightly higher rate than the annual mean temperature, and that for many cities the rate is dramatically higher.  Taken together, these papers show an increasingly serious vulnerability of the world’s major urban centers to disruptive and dangerous hot spells.

The NWS issues an excessive heat warning when the heat index, not the temperature itself, exceeds 105ºF for at least two days.  (The Papalexiou paper identifies this number as 115ºF, but that is a misprint.)  The heat index combines temperature with humidity (the amount of water vapor in the atmosphere).  The more water vapor in the air at a given temperature, the harder it is for the human body to shed heat via perspiration.  In other words, you will have a harder time cooling down on a high-humidity day than you will on a day with the same temperature but low humidity.  The heat index, then, is essentially the temperature that dry air would need to have in order to pose the same risk of overheating as the actual weather conditions.  The amount of water vapor the atmosphere can hold increases exponentially with temperature, which means that a general increase in temperature will produce an even greater increase in the heat index.  Matthews and his colleagues used a computer simulation of how heat index varies with temperature to predict that the global heat stress burden — defined as the surface area over which the heat index exceeds the 105ºF threshold, multiplied by the number of days per year that area exceeds the threshold and the number of people within the area — will rise at a rate that is three times larger than the rise in global mean temperature.  Under a temperature rise of 1.5ºC (2.7ºF) relative to pre-industrial temperatures (the recommended limit adopted by the Paris Agreement), the heat stress burden will be nearly 6 times greater than it was during the reference period from 1979-2005.  At 2ºC warming, the heat stress burden will increase by a factor of 12.
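For the curious, the NWS computes the heat index from temperature and relative humidity with the Rothfusz regression.  A sketch is below; note that the full NWS procedure applies additional adjustments at low humidity and at the edges of the formula’s validity, which are omitted here:

    def heat_index_f(T, RH):
        """NWS Rothfusz regression: T in deg F, RH in percent (valid when HI > ~80F)."""
        return (-42.379 + 2.04901523*T + 10.14333127*RH
                - 0.22475541*T*RH - 6.83783e-3*T*T - 5.481717e-2*RH*RH
                + 1.22874e-3*T*T*RH + 8.5282e-4*T*RH*RH - 1.99e-6*T*T*RH*RH)

    # 96F air at 65% humidity already feels like ~121F -- well past the 105F threshold
    print(round(heat_index_f(96, 65)))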

The cities of Karachi, Pakistan (with a population of nearly 15 million) and Kolkata (formerly Calcutta), India (with a metropolitan area that contains 14 million people) suffered under a killer heat wave in 2015 in which temperatures exceeded their warmest values in at least 36 years.  The Matthews paper predicts that under 1.5ºC warming, Kolkata would see one day a year on average with a heat index as high as what was experienced in 2015, and Karachi would see one such day every 3.7 years.  A number of major cities have the potential for large effects, but none more so than Lagos, Nigeria; its heat stress burden could increase by a factor of 1000, given both rising temperatures and likely population increases, from a global temperature rise of just 1.5ºC.  Furthermore, a rise of 1.5ºC will expose 350 million more people globally to heat stress at least once a year.

Papalexiou and his team used data from 8848 stations in the Global Historical Climatology Network — the same temperature database on which NASA and NOAA base their temperature records — to assess trends in the hottest temperature of the year in a given region, along with their statistical significance.  They found that 80% of the land area with sufficient data has a positive trend in the hottest temperature of the year over the last half century, with nearly half of that area having a linear trend in excess of 0.20ºC per decade.  The global mean temperature over that time has increased at a rate of about 0.18ºC per decade, but globally the mean hottest temperature of the year at a given location has increased by about 0.25ºC per decade.  The largest area of significant increase extends across central and eastern Europe, although some other parts of the world show large increases too.
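Station trends like these are ordinary least-squares fits to each year’s maximum temperature.  A minimal sketch follows; the synthetic series stands in for a single GHCN station, and the paper’s significance testing is skipped:

    import numpy as np

    years = np.arange(1968, 2018)
    # Hottest temperature of each year at one station: an imposed 0.02 C/yr
    # trend plus noise, as synthetic stand-in data.
    tmax = 35.0 + 0.02 * (years - years[0]) + np.random.normal(0.0, 1.0, years.size)

    slope, intercept = np.polyfit(years, tmax, 1)
    print(f"trend: {10 * slope:.2f} C per decade")   # ~0.20 with these inputs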

The study becomes most unsettling when it discusses changes in the hottest temperature of the year in large cities.  For example, over the last fifty years, the hottest temperature of the year in the city of Paris has risen by an average of 0.96ºC per decade.  Over the past thirty years, Houston’s hottest temperature of the year has risen by 0.99ºC per decade.  The primary cause of the large rise in temperatures in cities relative to warming globally is the urban heat island effect; cities absorb more radiation than the countryside does, and are warmer as a result.  This effect needs to be carefully accounted for when assessing the warming of the globe as a whole, but more than half the world’s population lives in urban areas, and those people feel the extra urban warmth directly.

To summarize, even the optimistic aim of the Paris Agreement to stabilize temperatures at 1.5ºC will not spare millions of people from increased exposure to dangerous heat.  The combination of global warming, rising populations in urban areas, and the extra heat that urban infrastructure absorbs will magnify the stress on tropical and subtropical megacities over the coming century, with dangerous consequences for the people that live there.  And if the warming continues beyond 1.5ºC, the heat stress will become common in more temperate areas as well.


Thursday, February 15, 2018

The Global Temperature Records

Every January, the global mean temperatures for the previous year are announced.  Four institutions worldwide maintain running temperature records, and sometimes they draw slightly different conclusions.  That was true in 2017, as one institution called it the second warmest year on record while the other three called it the third.  (Last year, I had predicted it would be the third warmest year.)  So who are these four institutions, and why don’t their records produce uniform agreement?

Two of the four institutions that maintain temperature records, NASA’s Goddard Institute for Space Studies (GISS) in New York and the National Oceanic and Atmospheric Administration (NOAA, headquartered in Silver Spring, Maryland), are U.S. government agencies.  The other two are the British Meteorological Office (the Met Office) and the Japan Meteorological Agency.  These records, based primarily on thermometer readings from many weather stations and ships worldwide, enable analysis of local temperature changes over time as well as changes in the global mean temperature.  When talking about global warming in general terms, though, the annual variations in global mean temperature are usually presented.  NASA GISS reported a global mean temperature anomaly of 0.90ºC (1.62ºF) relative to the mean 1951-1980 temperature, their second largest anomaly behind 2016.  NOAA had a result of 0.82ºC, the Met Office 0.70ºC, and Japan 0.73ºC, when normalized to their 1951-1980 means.  These three agencies ranked 2017 third in temperature, behind 2016 and 2015.  The numbers differ because of the methodologies that the four organizations use.  A detailed explanation of the different methodologies can be found here.  I will summarize the NASA GISS methodology by saying that it is more aggressive in counting ocean temperatures in the northern part of the world, where direct measurements are scarce.  The rationale behind this approach is that this part of the world has been the most sensitive to the influence of global warming so far.  This also explains why the GISS record tends to have higher numbers for the global mean temperatures than the other records do.

You might also be wondering why temperatures are expressed in terms of an anomaly, rather than a specific value for the temperature.  The reason for that is that it is easier to measure the change in temperature at a specific location, after accounting for factors like the urban heat island effect or proximity to ship engines that might skew the specific value of the temperature, than it is to define the specific value of temperature at a site unambiguously over time.  Furthermore, in regions where coverage is sparse, the average change in measured temperature is more likely to accurately represent that of the full region than the average measured temperature itself.
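Re-baselining is also what makes records with different reference periods comparable, as in the figure below.  Here is a minimal sketch of the normalization (toy numbers, not any agency’s actual series):

    import numpy as np

    years = np.arange(1940, 2018)
    anoms = 0.01 * (years - 1970) + np.random.normal(0.0, 0.05, years.size)  # toy data

    # Shift the series so that the 1951-1980 mean defines zero, GISS-style.
    ref = (years >= 1951) & (years <= 1980)
    rebased = anoms - anoms[ref].mean()
    print(round(rebased[-1], 2))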
Figure 1

Figure 1 shows the plot, from 1890 to the present, of global mean temperature anomalies as determined by all four agencies normalized to their mean values between 1951 and 1980.  (GISS is black, NOAA is red, the Met Office is green, and Japan is blue.)  While they don’t agree on the exact numbers, they certainly do reflect the variations in temperature very similarly, in terms of both year-to-year shifts and the underlying long-term trend.


Regardless of which data set you look at, a very important conclusion can be drawn.  As 2017 was not significantly influenced by an El Niño event like 2015 and 2016 were, it is the warmest year on record without such an influence.  As I’ve discussed in a previous blog entry, the El Niño/La Niña cycle is the primary driver of natural variability in the temperature record.  So keeping natural variability in mind, the sharp short-term surge in global mean temperatures that became noticeable in 2014 is continuing.


Tuesday, January 16, 2018

When Other Things Take Precedence, Again


Much like when I posted about the tension in Charlottesville back in August, I want to discuss something here that isn’t directly about energy or global warming, but did dominate the news throughout this past fall and ultimately affected a class I taught in the fall semester.  I am talking about the series of sexual assault and harassment scandals that started with movie producer Harvey Weinstein and spread to a number of people in entertainment and politics.  Up to this point, I have mostly refrained from talking about this particular issue.  I could sense from the beginning that it would wind up involving people who have done things that I have admired, and I wanted to take some time to think about it before weighing in.  I think it is worth bringing up in this blog because the most prominent political figure affected by this scandal, Minnesota senator Al Franken, has been a very eloquent speaker on the issue of global warming.  Franken, of course, resigned his position after a very damning photograph came out showing him engaged in a thoughtless prank directed at a sleeping female comedian, followed by credible accusations from a number of other women of inappropriate behavior such as forcibly kissing them.

I know that plenty of people feel that Franken has been denied his right to due process, but I want to start by saying that I don't believe this is ultimately about whether Franken is innocent (he's not) or whether he has committed any action that is irredeemably bad or criminal (he hasn’t).  The issue to me is whether or not Franken has lost the authority to speak effectively as a statesman on the issues of our time.  As this is a blog about global warming, I want to present as a case in point a discussion Franken had on the Senate floor this past June with the United States Secretary of Energy, Rick Perry.



I teach a class on Energy and the Environment at St. Joseph's College in Patchogue, New York.  When I saw this clip I had every intention of showing it to my class at the end of the fall semester, when I discuss how to respond to skeptical arguments against the idea that global warming is happening and human activity is responsible for it.  The clip demonstrates very well that it's important to understand how the scientific method works, and that you don't need to be a scientist to be able to do that.  (Keep in mind that before he became a senator, Al Franken made a living writing jokes for Saturday Night Live.)  But as you can probably guess, I couldn't show this to my students.  Franken’s behavior in other regards had become too much of a distraction, and presenting him as a champion of good sense no longer seemed prudent.  I was angry and disappointed, but it was not the fault of Franken’s political opponents, or his female Democratic colleagues in the Senate who collectively asked him to resign, that this happened.  And it was certainly not the fault of the women who have spoken out about Franken’s behavior towards them.  Regardless of how well he may have spoken on a variety of different issues, Al Franken forfeited his voice.  He did not have it taken away from him.

But the important detail that I had wanted to emphasize from this exchange was not that Franken possessed any special, irreplaceable gift for speaking about global warming or other issues.  In fact, the opposite is true.  Anyone whose education has lasted at least as far as high school can understand how the scientific method works in roughly the time Franken spent explaining it to Secretary Perry.  This isn’t difficult.  Scientists critically evaluate each other’s work all the time, just as they have been doing for centuries.  What holds up to scrutiny is preserved, and what doesn’t is disregarded.  The system isn’t necessarily perfect, but Perry recommended in this clip that the best response to research that has survived decades of scientific scrutiny and led to an uncomfortable conclusion is not to act on the uncomfortable conclusion, but to subject it to more scrutiny (and in a highly subjective setting at that).  No, Secretary Perry, there is nothing whatsoever wrong with being a skeptic.  But truly being skeptical requires not only demanding evidence and critically evaluating it, but also accepting the evidence once it has withstood the scrutiny.  The scientific process is based on healthy skepticism, and good science endures because of it.  This is something that everybody can understand well enough to defend.

As the smoke starts to clear from this ordeal, there are things to hope for.  I do hope that Al Franken gets the opportunity to redeem himself in the not-too-distant future.  I also hope all people can learn to speak as well in defense of the scientific method, and the difficult conclusions it sometimes leads to, as Franken did in his exchange with Rick Perry.  It's necessary, and it’s actually not very hard.


Tuesday, January 9, 2018

Last Year's Predictions

With 2017 officially over, I thought I would look back at a couple of predictions I made regarding our current administration’s energy policy and the state of our climate, and see how they played out.  Prediction is an important part not only of science but of analysis in general.  If the insights are solid, they should hold up not only to scrutiny at the time they are made but also to the tests that time puts them through.  Sometimes the unexpected still happens, of course, in which case you re-evaluate your previous opinions and beliefs and learn from what you got wrong.

I’ll start with a Facebook post I made the week after the 2016 election, based on a conversation I had with a colleague:

“One of the physics professors at Adelphi asked me how I thought the election would affect our energy production (and subsequently our CO2 emissions). I have a few thoughts that I figured I'd share.

1. For economic reasons, the coal industry has shrunk over the last decade. For economic reasons, the coal industry will not have a resurgence (all statements to the contrary by the President-elect notwithstanding).

2. The natural gas industry will meet with a friendly reception. Not getting into the debate on fracking for the time being (like it or not, there will be a lot of it), methane burns cleaner than other fossil fuels but it causes a significant climate impact when released to the air directly. There is plenty of scientific data at this point to suggest that the EPA's current methane emissions estimates -- and keep in mind, this is Obama's EPA -- are too low by close to a factor of two. This is a problem for all sources of methane, including the gas industry. A big issue that needs to be addressed is the combination of a surge in production with some seriously old infrastructure. (I explained it to my students this way: More than half the existing gas pipelines are older than I am, and as much as I hate to admit it, I'm not that young anymore.) Trump says he's all about infrastructure. We'll see.

3. Renewables will get a less friendly reception, but the good news is that they are in a position to survive that -- utility-scale solar is already cost-competitive with coal and gas. And if storage batteries can be made cheaply, they may even thrive.

4. Where renewables will face the most trouble is government-funded research. In September, I had the great pleasure of taking my class to Brookhaven Laboratory to see the solar-related projects there. I worry that projects like that are very vulnerable now.

5. The status quo will hold where CO2 emissions are concerned. While basic reality dictates that we cut our emissions with a sense of urgency, and that won't happen, CO2 emissions have actually dropped a little in recent years and I don't see them going back up.”

Regarding the first point, the economic health of the coal industry has not improved any in the past year.   In fact, according to Lazard's most recent analysis of the levelized cost of energy, the cost of coal has remained steady over the past year while the costs of natural gas, utility-scale solar, and wind have all dropped.  So coal’s foothold in the energy sector is indeed getting more and more tenuous.

I was a bit off on the second point, however.  I would have thought that anybody interested in defending fossil fuels would sing the praises of natural gas very loudly, given that it is cleaner than coal (it emits about half as much carbon dioxide per unit of energy released) and that it is presently cheaper than coal, with a widening gap in price between the two.  But that is not what happened.  Instead, a Notice of Proposed Rulemaking submitted by the Department of Energy in September suggested offering tax breaks to power plants that could maintain 90 days’ worth of fuel on-site.  This proposal attempted to tilt the energy market away from natural gas plants, whose fuel supplies are generally piped in as needed, and towards coal and nuclear plants.  Just yesterday, though, the Federal Energy Regulatory Commission rejected the proposal; it made for bad economics, in addition to terrible environmental policy.  Lobbying for the coal industry does score political points in states like Pennsylvania and Ohio that were critical to Trump’s electoral victory in 2016, however, so at this point I would expect the President to make similar attempts to prop up coal in the near future.  Moving ahead to the last sentence in my second point, Trump has yet to make a serious move on infrastructure.  So I guess we'll still see.

The third point was pretty accurate on the whole.  Utility-scale solar and wind remain competitive, and their costs continue to drop.  The price of utility-scale solar with battery storage dropped substantially, from $92 per megawatt-hour to $82 per megawatt-hour.  Ultimately the battery storage is necessary for renewables to overcome the obstacle of intermittent generation (meaning you don't get electricity from a solar panel when there is no sun or from a windmill when there is no wind).  A fully clean energy sector simply cannot happen without it.  There are no obvious policy obstacles standing in the way of renewables right now; in fact, the only real obstacle is the price of natural gas.

Thankfully, there has been no major push to cut funding for renewables-related research to date.  I do still worry, though, that those projects are vulnerable.

I should have specified in my fifth point whether I was talking about global emissions of carbon dioxide, or specifically American emissions.  According to the recent estimates of the Global Carbon Project, American emissions in 2017 dropped slightly, by approximately 0.4%.  Global carbon dioxide emissions have unfortunately gone up, however, primarily due to an uptick in the Chinese economy.  This underscores the need for more urgent and aggressive action across the board in reducing carbon emissions.  It’s pretty clear that our current administration won’t lead that charge, but some states are stepping in to fill the void.

The next prediction I would like to talk about comes from the blog post I made titled Breaking the “Icy Silence.”  The post discusses the drop in temperatures in the later part of 2016 that corresponded with the end of the very strong El Niño event that contributed to three straight years of record warmth.

“As for the drop, the 2015-2016 El Niño has certainly ended, but the present state is closer to neutral than to a full-blown La Niña event.  This suggests that ENSO-neutral conditions presently result in a temperature anomaly at, or maybe a little bit above, 0.80ºC.  Were this state of general neutrality to continue for the rest of the year, 2017 would wind up comfortably being the third warmest on record, but that ultimately depends on whether or not a strong La Niña ultimately happens.”

A strong La Niña event did not materialize early in 2017.  Instead a weak El Niño emerged in late spring, making it look for a while like 2017 might wind up being the second warmest year on record.  But the El Niño quickly dissipated, and now it looks like the La Niña is finally happening.  As of November (the December data needs to be processed and won’t be available until the middle of the month), 2017 had a mean temperature anomaly of approximately 0.84ºC relative to the twentieth century mean — comfortably the third highest on record.  NOAA is predicting a weak to moderate La Niña event that will last through the winter.  I’m going to predict that this will cool the air off enough to make 2018 the fourth warmest year on record when all is said and done.  We’ll see how that prediction looks this time next year.


Tuesday, January 2, 2018

December's Foggy Freeze

I wanted to talk today about the recent weather here on Long Island and in the northeastern United States as a whole.  The area is experiencing an extended cold spell, with the current forecast calling for a fairly significant weather event later on in the week followed by even colder temperatures.  I will wait until the end of this month to see how January plays out, but I think it's worth discussing the cold December we've had in the context of our weather history over the last half-century.

I looked at temperature records for the month of December going back to 1965 from the weather station located at LaGuardia Airport, using the Weather Underground site.  The mean temperature for December 2017 was 36°F.  How does that compare to Decembers of previous years?  The graph below plots the monthly mean December temperatures since 1965, with the most recent data point being December 2017.  As you can see, this past December was indeed cool compared to recent Decembers.  It wasn't the coldest December on record, though; nor would it have been that abnormal relative to the early part of the data set.   The decadal means for December temperatures were 36.6ºF from 1965-1969, 37.3ºF for the seventies, and 36.4ºF for the eighties.  In other words, it wasn’t that long ago when this December would have been considered very close to average.  Temperatures have risen noticeably since then, however.  The decadal mean December temperature was 39.9ºF in the nineties and 38.6ºF in the 2000s.  This decade, even given this past month, has seen a mean December temperature of 41.1ºF.  It is also worth pointing out that December 2015, with an average temperature of 51°F, was far more anomalously high than 2017 was anomalously low.  (My vegetable garden was still going strong that December, and I even had broccoli growing into the new year.)
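Decadal means like these are a short computation once the monthly means are assembled.  A quick sketch is below; the synthetic series stands in for the LaGuardia December record:

    import numpy as np

    years = np.arange(1965, 2018)
    # December mean temperature per year: a gentle warming trend plus noise,
    # as synthetic stand-in data (deg F).
    dec_temps = 36.5 + 0.08 * (years - 1965) + np.random.normal(0.0, 3.0, years.size)

    for decade_start in (1970, 1980, 1990, 2000, 2010):
        mask = (years >= decade_start) & (years < decade_start + 10)
        print(decade_start, round(dec_temps[mask].mean(), 1))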

So I think a little perspective is required when discussing the current cold spell.  Weather happens, and sometimes things get cold in the winter regardless of the overall trends in temperature.
