We're still on track for 2 doublings of greenhouse gas concentrations by 2100

Joe Romm at Climate Progress today reported on a new International Energy Agency report showing that carbon dioxide emissions hit a new record of 31.6 billion tons (CO2) in 2011.  The key point that Romm makes is that these emissions increases are consistent with the MIT no-policy case, and that means two doublings of greenhouse gas concentrations by 2100.

Students of climate science know that each doubling will likely yield about 3 Celsius degrees of warming, so two doublings means committing the earth to 6 Celsius degrees of warming.  As Cold Cash, Cool Climate points out, such increases will push the earth well outside the comfortable range in which humanity evolved (the envelope defined by the 2 Celsius degree warming limit), and will raise the possibility of nasty positive feedbacks leading to even higher warming (like releasing methane hydrates from the ocean floor, melting permafrost, and burning peat bogs).
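The arithmetic behind that claim follows from the roughly logarithmic relationship between concentrations and warming.  Here's a minimal sketch in Python, with the 3 degrees C per doubling sensitivity as an assumed input:

```python
import math

def equilibrium_warming(conc_ratio, sensitivity_per_doubling=3.0):
    """Equilibrium warming (degrees C) for a given ratio of CO2-equivalent
    concentration to the preindustrial level, assuming warming scales with
    the base-2 logarithm of concentration."""
    return sensitivity_per_doubling * math.log2(conc_ratio)

# Two doublings means a concentration ratio of 4:
print(equilibrium_warming(4.0))  # 6.0 degrees C
```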

At least US carbon dioxide emissions are down 7.7% since 2006, but those improvements were offset by emissions increases in China and other developing countries.  Time to get serious about global emissions reductions!

People believe weird things: Case in point, Federal spending and taxes in the Obama years

http://www.marketwatch.com/story/obama-spending-binge-never-happened-2012-05-22?pagenumber=1

http://tpmdc.talkingpointsmemo.com/2012/05/federal-deficit-barack-obama-spending-stimulus-budget-historic-trends.php

http://www.theatlantic.com/business/archive/2012/05/federal-spending-taxes-and-deficits-are-lower-today-than-when-obama-took-office/257256/

http://www.nytimes.com/interactive/2009/06/09/business/economy/20090610-leonhardt-graphic.html

http://www.nytimes.com/2009/06/10/business/economy/10leonhardt.html

http://www.theatlantic.com/business/archive/2012/02/the-big-deficit-lie-every-gop-debt-plan-leaves-us-with-more-debt/253501/

http://tpmdc.talkingpointsmemo.com/2012/03/government-jobs-bouyed-bushs-economy-and-sunk-obamas-chart.php

http://www.washingtonpost.com/blogs/ezra-klein/post/the-reality-behind-obama-and-bushs-spending-binge/2012/05/25/gJQAK8ItpU_blog.html?hpid=z3

My first post on CSRwire introducing Cold Cash, Cool Climate is now up

CSRwire just posted the first of 11 short essays that I adapted from material in Cold Cash, Cool Climate: Science-Based Advice for Ecological Entrepreneurs.  This essay summarizes the main findings from climate science and explains why I chose to write for entrepreneurs and not the usual policy crowd.

The main page that will contain a list of all the posts as they appear is here.

Review of Cold Cash, Cool Climate by Rick Piltz of Climate Science Watch

Rick Piltz of Climate Science Watch today posted a review of Cold Cash, Cool Climate.  Here’s his top line summary:

Jonathan Koomey’s new book, Cold Cash, Cool Climate: Science-Based Advice for Ecological Entrepreneurs, offers a concise, compelling analysis of why innovative entrepreneurial approaches are needed in order to limit global climate change, and to improve the quality of life while doing so.  Koomey’s analysis has more integrity than those who promote energy alternatives while evading the daunting constraints that follow from climate science, and opens into a creative and integrative way of thinking about paths forward.  Highly recommended.

Rick has asked me four follow-on questions that I’ll address in some upcoming blog posts, so stay tuned.

Update:  On Tuesday, May 22, 2012, Joe Romm reposted the review on Climate Progress.

Yet another example of corporations putting profits ahead of the public interest

Nicholas Kristof wrote a summary article this past Sunday about a series in the Chicago Tribune showing how flame retardants came to be in furniture and other household products.  This is yet another example of corporate interests putting their own profits ahead of the public interest, and it’s no surprise to anyone who has followed the tobacco industry (Brandt 2007) or the disinformers’ tactics to derail climate science (Oreskes and Conway 2010).

Of course, I don’t object to profits (I love them as well as the next business person), but when companies force costs onto society that are not paid by customers of their products, society as a whole is damaged (even though the perpetrators make out like bandits). It’s both unfair and inefficient, and if we can avoid it, there’s no legitimate justification for allowing this to happen for climate, consumer products, or anything else.  In addition, such behavior damages people’s faith in the free enterprise system, which is another systemic cost of such bad behavior.  Of course the only way to enforce rules to correct such problems is through government, which is why I talk about “The central problem of governance” and “What kind of government do we want?” in Chapter 7 of Cold Cash, Cool Climate.

References

Brandt, Allan M. 2007. The Cigarette Century:  The Rise, Fall, and Deadly Persistence of the Product that Defined America. New York, NY: Basic Books.

Koomey, Jonathan G. 2011. Cold Cash, Cool Climate:  Science-Based Advice for Ecological Entrepreneurs. Burlingame, CA: Analytics Press.

Oreskes, Naomi, and Eric M. Conway. 2010. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. New York, NY: Bloomsbury Press.

"The end of fish" is one more reason to believe humans are powerful enough to affect the planet

The Washington Post today summarizes findings on the decline in global fish stocks from a recent World Wildlife Fund (WWF) study, a situation that Daniel Pauly at the University of British Columbia has called “the end of fish”.  The report documents declines in many different species on both land and sea, but the Post’s article focuses on fish.  Figure 10 (page 19) in the booklet summarizing the report tells the story vividly, portraying the expansion of humanity’s ability to scour the entire ocean for fish.

There are still those who doubt that humanity is powerful enough to affect the natural environment in irreversible ways, but the WWF report is one more strong piece of evidence to the contrary.

References

Donner, Simon D. 2011. “Making the climate a part of the human world."  The Bulletin of the American Meteorological Society.  vol. 92, no. 10. October. pp. 1297-1302. [http://journals.ametsoc.org/doi/pdf/10.1175/2011BAMS3219.1]

WWF. 2012. Living Planet Report 2012:  Biodiversity, Biocapacity, and Better Choices. Washington, DC: World Wildlife Fund.   [http://wwf.panda.org/about_our_earth/all_publications/living_planet_report/]

A brief synopsis of Cold Cash, Cool Climate on CSR Wire, with more to come

Corporate Social Responsibility Wire (CSR Wire) just posted a short synopsis of Cold Cash, Cool Climate in advance of a series of eleven blog posts of mine that will appear there in coming weeks.  In those posts I’m boiling down the key lessons from the book in bite-sized chunks, and I’m hopeful they will be useful to a wide audience.   I will write a short post on Koomey.com each time one of the articles appears on CSR Wire.

Here’s the intro to the synopsis:

Jonathan Koomey’s book, Cold Cash, Cool Climate provides a more robust and detailed set of arguments for entrepreneurs who want to be green – or for consultants to use in building the business case for sustainability. (Stay tuned for our new series with Koomey starting next week!)
A scientist and entrepreneur, Koomey brings the full weight of his experience to Cold Cash, Cool Climate. In cogent, fact-filled prose, his chapters on climate science lay out the case for action now, providing a welcome antidote to the hysterical propaganda of the climate cranks – like the Heartland Institute, whose latest campaign of intellectual perversion compares climate change believers to mass murderers.

Subtleties in comparing fossil fuel reserves to CO2 concentrations

As reported in Climate Progress, climatologist James Hansen has done comparisons of fossil fuel reserves from two sources to concentrations of carbon dioxide in the atmosphere, using a conversion factor of 2.12 GtC per ppm of CO2.  For example, see Figure 1 from Hansen’s more detailed writeup:

Figure 1. CO2 emissions by fossil fuels (1 ppm CO2 ~ 2.12 GtC, where ppm is parts per million of CO2 in air and GtC is gigatons of carbon). Alternative estimates of reserves and potentially recoverable resources are from EIA (2011) and GAC (2011).

Such comparisons are fraught with pitfalls.  In this case, Hansen appears to be comparing the carbon content in fuels with CO2 concentrations in the atmosphere assuming that all of the carbon from burning the fuel would go into the atmosphere. This is fine as long as it is clear what is being assumed, but this detail is not included on the graph, in the caption, or in the paper I link to above.  (There may be a more detailed paper of which I’m not aware, so if anyone knows of it, please send me a link.  I’ve also tried to find the EIA (2011) and GAC (2011) references without success.)

In reality, some fraction of emitted carbon (called the “airborne fraction”) will stay in the atmosphere and the rest will be absorbed by the oceans and biota on land.  The exact value of the airborne fraction varies.  Right now it’s roughly 50%, but if we continue on our current path, it will rise.  For example, in the MIT no-policy case, the airborne fraction measured cumulatively over the 21st century would be about 65%, which demonstrates the declining ability of the oceans to take up carbon in a high-emissions world.

It would be better when making graphs like Figure 1 to assume an airborne fraction (say 65% for a world in which we exploit many fossil fuel reserves) and to note that the conversion to ppm assumes that factor.  Otherwise this way of presenting the data, which has much to recommend it, may be misleading to those unfamiliar with the subtleties.
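To make the conversion concrete, here's a small Python sketch using the 2.12 GtC per ppm factor from Hansen's caption.  The 1000 GtC input is just an illustrative round number, and the airborne fractions are the 100% (implicit in Figure 1) and 65% values discussed above:

```python
GTC_PER_PPM = 2.12  # gigatons of carbon per ppm of CO2 in the atmosphere

def ppm_increase(cumulative_emissions_gtc, airborne_fraction):
    """CO2 concentration increase (ppm) implied by cumulative carbon
    emissions (GtC), given an assumed airborne fraction."""
    return cumulative_emissions_gtc * airborne_fraction / GTC_PER_PPM

# Illustrative: 1000 GtC of cumulative emissions
print(round(ppm_increase(1000, airborne_fraction=1.0)))   # ~472 ppm if all carbon stays airborne
print(round(ppm_increase(1000, airborne_fraction=0.65)))  # ~307 ppm at a 65% airborne fraction
```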

In my post on the illusion of fossil fuel abundance, I use the lower-bound reserve and resource estimates from the IIASA Global Energy Assessment to make different but related points, but don’t make the link to concentrations explicitly.  I’m working now to add ppm estimates to give the context in the graphs for that analysis.

An interesting example of innovative low-power sensing technology

According to MIT News, researchers at MIT have created a sensor using carbon nanotubes that can tell grocers just how ripe their fruit is.  This is an interesting example of innovative low-power sensor technology that will help enable a revolution in applications of information technology to business (as I describe in this article in Technology Review).

Here are the first few paragraphs of the article:

Every year, U.S. supermarkets lose roughly 10 percent of their fruits and vegetables to spoilage, according to the Department of Agriculture. To help combat those losses, MIT chemistry professor Timothy Swager and his students have built a new sensor that could help grocers and food distributors better monitor their produce.

The new sensors, described in the journal Angewandte Chemie, can detect tiny amounts of ethylene, a gas that promotes ripening in plants. Swager envisions the inexpensive sensors attached to cardboard boxes of produce and scanned with a handheld device that would reveal the contents’ ripeness. That way, grocers would know when to put certain items on sale to move them before they get too ripe.

“If we can create equipment that will help grocery stores manage things more precisely, and maybe lower their losses by 30 percent, that would be huge,” says Swager, the John D. MacArthur Professor of Chemistry.

This example is one way that information technology can help us better match energy services demanded with those supplied, as I describe in Chapter 6 of Cold Cash, Cool Climate.

A review of why new coal plant construction has slowed to a crawl in the US

Dan Lashof compiles findings from some studies analyzing why the utility industry in the US has shown little recent interest in building new coal plants.  The main causes relate to market forces, not EPA’s recently announced new source performance standards (although those will have some effect if natural gas prices reverse their recent decline and coal becomes economically competitive for generating electricity again).  These reasons overlap strongly with the reasons why utilities are now retiring many very old coal plants.

Comparing Hansen et al's 1981 projection of climate change with actual history

Real Climate has a brief post comparing the temperature projections contained in a 1981 paper in Science [1] to what actually happened, showing that the projections underestimated actual warming from greenhouse gas emissions by about 30%.  It’s still good agreement for a long-term forecast, and it shows that the initial projections based on physical principles have more or less come to pass.  As the article states:

…a projection from 1981 for rising temperatures in a major science journal, at a time that the temperature rise was not yet obvious in the observations, has been found to agree well with the observations since then, underestimating the observed trend by about 30%, and easily beating naive predictions of no-change or a linear continuation of trends. It is also a nice example of a statement based on theory that could be falsified and up to now has withstood the test.

This is a wonderful example of retrospective comparison of a forecast with actual history, something that is rarely done [2].  Now some enterprising Ph.D. student needs to dissect the Hansen analysis, decompose the drivers of emissions and GHG concentrations into their component parts using the methods in [3], compare them to historical developments at a detailed level, determine where Hansen et al. got it right and where they got it wrong, and they’ll have a terrific thesis on their hands.

Here’s the key graph:

References

1.  J. Hansen, D. Johnson, A. Lacis, S. Lebedeff, P. Lee, D. Rind, and G. Russell, “Climate Impact of Increasing Atmospheric Carbon Dioxide”, Science, vol. 213, 1981, pp. 957-966.

2.  Koomey, Jonathan G., Paul Craig, Ashok Gadgil, and David Lorenzetti. 2003. “Improving long-range energy modeling:  A plea for historical retrospectives."  The Energy Journal (also LBNL-52448).  vol. 24, no. 4. October. pp. 75-92.   Email me for a copy

3.  Hummel, Holmes. 2006. Interpreting Global Energy and Emission Scenarios:  Methods for Understanding and Communicating Policy Insights. Thesis, Interdisciplinary Program on Environment and Resources, Stanford University. [http://www.holmeshummel.net/Dissertation.htm]

Why fossil fuel abundance is an illusion

In a blog post published on April 12, 2012, Dan Lashof of NRDC makes it clear that we’ll run out of the earth’s ability to absorb greenhouse gases long before we run out of fossil fuels.  In this blog post I’ll show why he’s exactly right.

In Cold Cash, Cool Climate, I explore this question quantitatively, using the latest fossil fuel resource estimates from IIASA’s Global Energy Assessment.  To do that, I estimate lower bounds to global fossil fuel reserves and resources (which together make up what’s called the “resource base”, our best estimate of how many fossil fuel resources we have, not including exotic supplies like methane hydrates and other occurrences of hard to extract deposits).  Reserves are well known deposits that can be extracted at current prices and technologies, while resources are somewhat more speculative, but resources become reserves over time as exploration advances and technology improves.

I focus here on the lower bounds to make an important point:  Even with estimates of the fossil fuel resource base at the low end of what the literature says, the amount of carbon embodied in just the conventional sources of these fuels is vastly larger than the amount of fuel assumed to be burned in the MIT no-policy case (which is a reasonable assessment of our “business-as-usual” future, assuming no major efforts to wean ourselves off of fossil fuels).

Figure 1:  Lower bound estimates of fossil fuel reserves compared to fossil carbon emissions in the MIT no-policy case

As shown in Figure 1, the lower bound estimate of the amount of carbon contained in all fossil fuels excluding exotic resources like methane hydrates is almost 10,000 billion metric tons of carbon, or roughly 6 times the amount that would be emitted from fossil fuel burning in the MIT no-policy case from 2000 to 2100. Just the resource base for conventional gas, oil, and coal would cover the fossil emissions in the no-policy case more than five times over.  And if we were to consume the conventional oil and gas resource base plus the coal reserves, we’d only need to use about 10% of the coal resources to reach the emissions in the no-policy case.  “Peak oil” won’t help much with this problem, as coal reserves are so vast.

I conclude from this comparison that there’s virtually no chance that resource constraints would provide a brake on carbon emissions in this century, and the emissions in the MIT no-policy case are below what could be expected if we were to burn even a quarter of our entire conventional resource base in the next ninety years.

When considering the challenge of keeping greenhouse gas concentrations to a level that would allow no more than a 2 Celsius degree increase in global temperatures, the situation is even more difficult.  If we take the 2 degree C warming limit seriously, we’ll need to limit CO2 emissions from all sources to no more than 315 billion tons of carbon (about 1150 billion tons of CO2) from 2000 until 2049, which implies a rapid phase out of fossil fuels (as well as other sources of GHGs).  That means that we can’t even burn the currently existing stock of proved reserves of fossil fuels and remain under the 2 degrees C warming limit (see Figure 2).  Recall that the proved reserves data from the GEA represent a lower bound, so the comparison is even more stark.  We’ll need to keep a significant fraction of our proved reserves from being burned, or we’ll need to figure out a way to sequester carbon safely (which is not currently feasible on the scales needed, though it has been demonstrated in some applications).
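The carbon-to-CO2 conversion in that budget is just the ratio of molecular weights (44/12).  A one-line check in Python, assuming only the 315 GtC figure from the text:

```python
CO2_PER_CARBON = 44.0 / 12.0  # mass ratio of CO2 to the carbon it contains

budget_gtc = 315  # allowable 2000-2049 emissions (GtC) for the 2 degrees C limit
budget_gtco2 = budget_gtc * CO2_PER_CARBON
print(round(budget_gtco2))  # 1155 GtCO2, i.e. "about 1150 billion tons of CO2"
```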

Figure 2:  Lower bound estimates of fossil fuel reserves compared to the MIT level 1 case and a safer climate case that would keep global temperatures from rising more than 2 degrees C from preindustrial times

One implication of these results is that the current estimated value of fossil fuel reserves (as capitalized in the stock prices of fossil fuel companies) is an illusion, as Dave Roberts of Grist points out.  We quite literally can’t burn it all and continue the orderly development of human civilization, so the trillions of dollars of “value” in those reserves is a mirage (and a major impediment to progress on this problem, given how hard the fossil fuel industry is fighting to preserve its profits).

Comment period for EPA greenhouse gas rules on new power plants opens--get your comments in!

As I discussed here, the EPA announced in March that they would be promulgating new rules on the greenhouse gas emissions associated with new power plants that would effectively prohibit construction of new conventional coal plants without carbon capture and storage.  Such plants were already feeling pressure from cheap natural gas and lower electricity demand, so there are few in the pipeline now.   As Dave Roberts of Grist points out, the regulations will prevent an upsurge in new coal plant orders if gas gets more expensive–that’s their most important function.

Climate Progress has an article summarizing the regulations and announcing the beginning of the public comment period.  If you have time (and it takes only a few minutes) go here and register your comments on NRDC’s web site.  They promise to forward your comments on to the EPA and to your senators and congressperson. The time for action is now!

"The computing trend that will change everything", in Technology Review online today

One of my professors in graduate school, John Holdren, had a goal of writing a paper out of every talk he gave.  Neither he nor I always accomplished that goal, but his advice stuck with me as something to which to aspire.

After I gave my talk on computing trends at the VERGE conference in DC on March 15th, 2012, I was offered the opportunity to write a short article for a special issue of Technology Review.  That article appeared online today as part of an entire issue devoted to how information technology and energy interact, and the process of writing it helped me refine my thinking even more.  Packing a lot of ideas into an 800-word article isn’t easy, and in the process I learned as much as I did in boiling my computing trends work down to a 10-minute talk at VERGE.

The main focus of the article is on implications for our ability to collect and use data in real time, at increasingly finer levels of disaggregation.  I’m convinced that sensors and controls that harvest ambient energy flows offer a chance to transform how we interact with the universe, and I’m excited to see what new capabilities they will enable in the years ahead.

Here are the opening four paragraphs:

The performance of computers has shown remarkable and steady growth, doubling every year and a half since the 1970s. What most folks don’t know, however, is that the electrical efficiency of computing (the number of computations that can be completed per kilowatt-hour of electricity used) has also doubled every year and a half since the dawn of the computer age.
Laptops and mobile phones owe their existence to this trend, which has led to rapid reductions in the power consumed by battery-powered computing devices. The most important future effect is that the power needed to perform a task requiring a fixed number of computations will continue to fall by half every 1.5 years (or a factor of 100 every decade). As a result, even smaller and less power-intensive computing devices will proliferate, paving the way for new mobile computing and communications applications that vastly increase our ability to collect and use data in real time.
As one of many examples of what is becoming possible using ultra-low-power computing, consider the wireless no-battery sensors created by Joshua R. Smith of the University of Washington. These sensors harvest energy from stray television and radio signals and transmit data from a weather station to an indoor display every five seconds. They use so little power (50 microwatts, on average) that they don’t need any other power source.
Harvesting background energy flows, including ambient light, motion, or heat, opens up the possibility of mobile sensors operating indefinitely with no external power source, and that means an explosion of available data. Mobile sensors expand the promise of what Erik Brynjolfsson, a professor of management at MIT, calls “nanodata,” or customized fine-grained data describing in detail the characteristics of individuals, transactions, and information flows.
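The “factor of 100 every decade” in the excerpt follows directly from the 1.5-year doubling time; a quick sanity check in Python:

```python
doubling_time_years = 1.5
years_per_decade = 10

# Compound the doublings that fit in a decade:
decade_factor = 2 ** (years_per_decade / doubling_time_years)
print(round(decade_factor))  # ~102, i.e. roughly a factor of 100 per decade
```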

Video of my talk at the VERGE conference on "The computing trend that will change everything"

I gave a talk on the implications of the trends in the energy efficiency of computing at the VERGE conference in Arlington, VA on March 15, 2012.  I just received the video and have embedded it below.  It’s what the VERGE folks call a “one great idea” talk, and preparing for this event made me really synthesize my thinking (because I had to boil it down to 10 minutes or less).  Looking back there are a couple of things I’d change, but it’s not a bad summary of my current thinking on the topic.
