"The end of fish" is one more reason to believe humans are powerful enough to affect the planet

The Washington Post today summarizes findings on the decline in global fish stocks from a recent World Wildlife Fund study, a situation that Daniel Pauly at the University of British Columbia has called “the end of fish”.  The report documents declines in many different species on both land and sea, but the Post’s article focuses on fish.  Figure 10 (page 19) in the booklet summarizing the report tells the story vividly, portraying the expansion of humanity’s ability to scour the entire ocean for fish.

There are still those who doubt that humanity is powerful enough to affect the natural environment in irreversible ways, but the WWF report is one more strong piece of evidence to the contrary.

References

Donner, Simon D. 2011. “Making the climate a part of the human world."  The Bulletin of the American Meteorological Society.  vol. 92, no. 10. October. pp. 1297-1302. [http://journals.ametsoc.org/doi/pdf/10.1175/2011BAMS3219.1]

WWF. 2012. Living Planet Report 2012:  Biodiversity, Biocapacity, and Better Choices. Washington, DC: World Wildlife Fund.   [http://wwf.panda.org/about_our_earth/all_publications/living_planet_report/]

A brief synopsis of Cold Cash, Cool Climate on CSR Wire, with more to come

Corporate Social Responsibility Wire (CSR Wire) just posted a short synopsis of Cold Cash, Cool Climate in advance of a series of eleven blog posts of mine that will appear there in coming weeks.  In those posts I’m boiling down the key lessons from the book in bite-sized chunks, and I’m hopeful they will be useful to a wide audience.   I will write a short post on Koomey.com each time one of the articles appears on CSR Wire.

Here’s the intro to the synopsis:

Jonathan Koomey’s book, Cold Cash, Cool Climate provides a more robust and detailed set of arguments for entrepreneurs who want to be green – or for consultants to use in building the business case for sustainability. (Stay tuned for our new series with Koomey starting next week!)
A scientist and entrepreneur, Koomey brings the full weight of his experience to Cold Cash, Cool Climate. In cogent, fact-filled prose, his chapters on climate science lay out the case for action now, providing a welcome antidote to the hysterical propaganda of the climate cranks – like the Heartland Institute, whose latest campaign of intellectual perversion compares climate change believers to mass murderers.

Subtleties in comparing fossil fuel reserves to CO2 concentrations

As reported in Climate Progress, climatologist James Hansen has done comparisons of fossil fuel reserves from two sources to concentrations of carbon dioxide in the atmosphere, using a conversion factor of 2.12 GtC per ppm of CO2.  For example, see Figure 1 from Hansen’s more detailed writeup:

Figure 1. CO2 emissions by fossil fuels (1 ppm CO2 ~ 2.12 GtC, where ppm is parts per million of CO2 in air and GtC is gigatons of carbon). Alternative estimates of reserves and potentially recoverable resources are from EIA (2011) and GAC (2011).

Such comparisons are fraught with pitfalls.  In this case, Hansen appears to be comparing the carbon content in fuels with CO2 concentrations in the atmosphere assuming that all of the carbon from burning the fuel would go into the atmosphere. This is fine as long as it is clear what is being assumed, but this detail is not included on the graph, in the caption, or in the paper I link to above.  (There may be a more detailed paper of which I’m not aware, so if anyone knows of it, please send me a link.  I’ve also tried to find the EIA (2011) and GAC (2011) references without success.)

In reality, some fraction of emitted carbon (called the “airborne fraction”) will stay in the atmosphere, and the rest will be absorbed by the oceans and biota on land.  The exact value of the airborne fraction varies.  Right now it’s roughly 50%, but if we continue on our current path, it will rise.  For example, in the MIT no-policy case, the airborne fraction measured cumulatively in the 21st century would be about 65%, which demonstrates the declining ability of the oceans to take up carbon in a high-emissions world.

It would be better when making graphs like Figure 1 to assume an airborne fraction (say, 65% for a world in which we exploit many fossil fuel reserves) and to note that the conversion to ppm assumes that factor.  Otherwise this way of presenting the data, which has much to recommend it, may be misleading to those unfamiliar with the subtleties.
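To make the arithmetic concrete, here’s a minimal sketch of the conversion discussed above, with the airborne fraction made explicit. The 2.12 GtC/ppm factor comes from Hansen’s figure caption; the 65% airborne fraction is the cumulative MIT no-policy value mentioned earlier. The function name is mine, chosen for illustration:

```python
# Convert cumulative carbon emissions (GtC) into the resulting increase in
# atmospheric CO2 concentration (ppm), with the airborne fraction explicit.
GTC_PER_PPM = 2.12  # conversion factor from Hansen's Figure 1 caption

def emissions_to_ppm(emissions_gtc, airborne_fraction=0.65):
    """ppm increase from emitting `emissions_gtc` gigatons of carbon,
    assuming `airborne_fraction` of it stays in the atmosphere."""
    return emissions_gtc * airborne_fraction / GTC_PER_PPM

# Burning fuel containing 500 GtC, with 65% staying airborne:
print(round(emissions_to_ppm(500), 1))
# The implicit assumption in the graph (all carbon stays airborne):
print(round(emissions_to_ppm(500, airborne_fraction=1.0), 1))
```

With the airborne fraction at 1.0 the function reproduces the simple division implied by the graph; dropping it to 65% shrinks the concentration increase by about a third, which is exactly the distinction the caption should flag.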

In my post on the illusion of fossil fuel abundance, I use the lower-bound reserve and resource estimates from the IIASA Global Energy Assessment to make different but related points, but don’t make the link to concentrations explicitly.  I’m working now to add ppm estimates to give the context in the graphs for that analysis.

An interesting example of innovative low-power sensing technology

According to MIT News, researchers at MIT have created a sensor using carbon nanotubes that can tell grocers just how ripe their fruit is.  This is an interesting example of innovative low-power sensor technology that will help enable a revolution in applications of information technology to business (as I describe in this article in Technology Review).

Here are the first few paragraphs of the article:

Every year, U.S. supermarkets lose roughly 10 percent of their fruits and vegetables to spoilage, according to the Department of Agriculture. To help combat those losses, MIT chemistry professor Timothy Swager and his students have built a new sensor that could help grocers and food distributors better monitor their produce.

The new sensors, described in the journal Angewandte Chemie, can detect tiny amounts of ethylene, a gas that promotes ripening in plants. Swager envisions the inexpensive sensors attached to cardboard boxes of produce and scanned with a handheld device that would reveal the contents’ ripeness. That way, grocers would know when to put certain items on sale to move them before they get too ripe.

“If we can create equipment that will help grocery stores manage things more precisely, and maybe lower their losses by 30 percent, that would be huge,” says Swager, the John D. MacArthur Professor of Chemistry.

This example is one way that information technology can help us better match energy services demanded with those supplied, as I describe in Chapter 6 of Cold Cash, Cool Climate.

A review of why new coal plant construction has slowed to a crawl in the US

Dan Lashof compiles findings from some studies analyzing why the utility industry in the US has shown little recent interest in building new coal plants.  The main causes relate to market forces, not EPA’s recently announced new source performance standards (although those will have some effect if natural gas prices reverse their recent decline and coal becomes economically competitive for generating electricity again).  These reasons overlap strongly with the reasons why utilities are now retiring many very old coal plants.

Comparing Hansen et al's 1981 projection of climate change with actual history

Real Climate has a brief post comparing the temperature projections contained in a paper in Science from 1981 [1] to what actually happened, showing that the projections underestimated actual warming from greenhouse gas emissions by about 30%.  It’s still good agreement for a long-term forecast, and it shows that the initial projections based on physical principles have more or less come to pass.  As the article states:

…a projection from 1981 for rising temperatures in a major science journal, at a time that the temperature rise was not yet obvious in the observations, has been found to agree well with the observations since then, underestimating the observed trend by about 30%, and easily beating naive predictions of no-change or a linear continuation of trends. It is also a nice example of a statement based on theory that could be falsified and up to now has withstood the test.

This is a wonderful example of retrospective comparison of a forecast with actual history, something that is rarely done [2].  Now some enterprising Ph.D. student needs to dissect the Hansen analysis, decompose the drivers of emissions and GHG concentrations into their component parts using the methods in [3], compare them to historical developments at a detailed level, determine where Hansen et al. got it right and where they got it wrong, and they’ll have a terrific thesis on their hands.

Here’s the key graph:

References

1.  J. Hansen, D. Johnson, A. Lacis, S. Lebedeff, P. Lee, D. Rind, and G. Russell, “Climate Impact of Increasing Atmospheric Carbon Dioxide.”  Science, vol. 213, 1981, pp. 957-966. DOI.

2.  Koomey, Jonathan G., Paul Craig, Ashok Gadgil, and David Lorenzetti. 2003. “Improving long-range energy modeling:  A plea for historical retrospectives."  The Energy Journal (also LBNL-52448).  vol. 24, no. 4. October. pp. 75-92.   Email me for a copy

3.  Hummel, Holmes. 2006. Interpreting Global Energy and Emission Scenarios:  Methods for Understanding and Communicating Policy Insights. Thesis, Interdisciplinary Program on Environment and Resources, Stanford University. [http://www.holmeshummel.net/Dissertation.htm]

Why fossil fuel abundance is an illusion

In a blog post published on April 12, 2012, Dan Lashof of NRDC makes it clear that we’ll run out of the earth’s ability to absorb greenhouse gases long before we run out of fossil fuels.  In this blog post I’ll show why he’s exactly right.

In Cold Cash, Cool Climate, I explore this question quantitatively, using the latest fossil fuel resource estimates from IIASA’s Global Energy Assessment.  To do that, I estimate lower bounds to global fossil fuel reserves and resources (which together make up what’s called the “resource base”, our best estimate of how many fossil fuel resources we have, not including exotic supplies like methane hydrates and other occurrences of hard to extract deposits).  Reserves are well known deposits that can be extracted at current prices and technologies, while resources are somewhat more speculative, but resources become reserves over time as exploration advances and technology improves.

I focus here on the lower bounds to make an important point:  Even with estimates of the fossil fuel resource base at the low end of what the literature says, the amount of carbon embodied in just the conventional sources of these fuels is vastly larger than the amount of fuel assumed to be burned in the MIT no-policy case (which is a reasonable assessment of our “business-as-usual” future, assuming no major efforts to wean ourselves off of fossil fuels).

Figure 1:  Lower bound estimates of fossil fuel reserves compared to fossil carbon emissions in the MIT’s no-policy case

As shown in Figure 1, the lower bound estimate of the amount of carbon contained in all fossil fuels excluding exotic resources like methane hydrates is almost 10,000 billion metric tons of carbon, or roughly 6 times the amount that would be emitted from fossil fuel burning in the MIT no-policy case from 2000 to 2100. Just the resource base for conventional gas, oil, and coal would cover the fossil emissions in the no-policy case more than five times over.  And if we were to consume the conventional oil and gas resource base plus the coal reserves, we’d only need to use about 10% of the coal resources to reach the emissions in the no-policy case.  “Peak oil” won’t help much with this problem, as coal reserves are so vast.

I conclude from this comparison that there’s virtually no chance that resource constraints would provide a brake on carbon emissions in this century, and the emissions in the MIT no-policy case are below what could be expected if we were to burn even a quarter of our entire conventional resource base in the next ninety years.

When considering the challenge of keeping greenhouse gas concentrations to a level that would allow no more than a 2 degree Celsius increase in global temperatures, the situation is even more difficult.  If we take the 2 degree C warming limit seriously, we’ll need to limit CO2 emissions from all sources to no more than 315 billion tons of carbon (about 1150 billion tons of CO2) from 2000 until 2049, which implies a rapid phase-out of fossil fuels (as well as other sources of GHGs).  That means that we can’t even burn the currently existing stock of proved reserves of fossil fuels and remain under the 2 degree C warming limit (see Figure 2).  Recall that the proved reserves data from the GEA represent a lower bound, so the comparison is even more stark.  We’ll need to keep a significant fraction of our proved reserves from being burned, or we’ll need to figure out a way to sequester carbon safely (which is not currently feasible on the scales needed, though it has been proven in some applications).
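The carbon-to-CO2 conversion in that budget is just the ratio of molecular weights (44 g/mol for CO2 versus 12 g/mol for carbon). A quick sketch confirms the rounding above:

```python
# Carbon (GtC) converts to carbon dioxide (Gt CO2) via the molecular
# weight ratio of CO2 (44 g/mol) to carbon (12 g/mol), about 3.67.
CO2_PER_C = 44.0 / 12.0

budget_gtc = 315  # 2000-2049 carbon budget for the 2 degree C limit
budget_gtco2 = budget_gtc * CO2_PER_C
print(round(budget_gtco2))  # ~1155, i.e. "about 1150 billion tons of CO2"
```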

Figure 2:  Lower bound estimates of fossil fuel reserves compared to the MIT level 1 case and a safer climate case that would keep global temperatures from rising more than 2 degrees C from preindustrial times

One implication of these results is that the current estimated value of fossil fuel reserves (as capitalized in the stock prices of fossil fuel companies) is an illusion, as Dave Roberts of Grist points out.  We quite literally can’t burn it all and continue the orderly development of human civilization, so the trillions of dollars of “value” in those reserves is a mirage (and a major impediment to progress on this problem, given how hard the fossil fuel industry is fighting to preserve its profits).

Comment period for EPA greenhouse gas rules on new power plants opens--get your comments in!

As I discussed here, the EPA announced in March that they would be promulgating new rules on the greenhouse gas emissions associated with new power plants that would effectively prohibit construction of new conventional coal plants without carbon capture and storage.  Such plants were already feeling pressure from cheap natural gas and lower electricity demand, so there are few in the pipeline now.   As Dave Roberts of Grist points out, the regulations will prevent an upsurge in new coal plant orders if gas gets more expensive–that’s their most important function.

Climate Progress has an article summarizing the regulations and announcing the beginning of the public comment period.  If you have time (and it takes only a few minutes) go here and register your comments on NRDC’s web site.  They promise to forward your comments on to the EPA and to your senators and congressperson. The time for action is now!

"The computing trend that will change everything", in Technology Review online today

One of my professors in graduate school, John Holdren, had a goal of writing a paper out of every talk he gave.  Neither he nor I always accomplished that goal, but his advice stuck with me as something to which to aspire.

After I gave my talk on computing trends at the VERGE conference in DC on March 15th, 2012, I was offered the opportunity to write a short article for a special issue of Technology Review.  That article appeared online today as part of an entire issue devoted to how information technology and energy interact, and the process of writing it helped me refine my thinking even more.  Packing a lot of ideas into an 800-word article isn’t easy, and in the process I learned as much as I did in boiling my computing trends work down to a 10-minute talk at VERGE.

The main focus of the article is on implications for our ability to collect and use data in real time, at increasingly finer levels of disaggregation.  I’m convinced that sensors and controls that harvest ambient energy flows offer a chance to transform how we interact with the universe, and I’m excited to see what new capabilities they will enable in the years ahead.

Here are the opening four paragraphs:

The performance of computers has shown remarkable and steady growth, doubling every year and a half since the 1970s. What most folks don’t know, however, is that the electrical efficiency of computing (the number of computations that can be completed per kilowatt-hour of electricity used) has also doubled every year and a half since the dawn of the computer age.
Laptops and mobile phones owe their existence to this trend, which has led to rapid reductions in the power consumed by battery-powered computing devices. The most important future effect is that the power needed to perform a task requiring a fixed number of computations will continue to fall by half every 1.5 years (or a factor of 100 every decade). As a result, even smaller and less power-intensive computing devices will proliferate, paving the way for new mobile computing and communications applications that vastly increase our ability to collect and use data in real time.
As one of many examples of what is becoming possible using ultra-low-power computing, consider the wireless no-battery sensors created by Joshua R. Smith of the University of Washington. These sensors harvest energy from stray television and radio signals and transmit data from a weather station to an indoor display every five seconds. They use so little power (50 microwatts, on average) that they don’t need any other power source.
Harvesting background energy flows, including ambient light, motion, or heat, opens up the possibility of mobile sensors operating indefinitely with no external power source, and that means an explosion of available data. Mobile sensors expand the promise of what Erik Brynjolfsson, a professor of management at MIT, calls “nanodata,” or customized fine-grained data describing in detail the characteristics of individuals, transactions, and information flows.
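The “factor of 100 every decade” in the excerpt follows directly from a doubling every 1.5 years; a short sketch makes the compounding explicit (the function name is mine, for illustration):

```python
# Power needed for a fixed computation falls by half every 1.5 years, so
# the cumulative efficiency improvement over t years is 2**(t / 1.5).
def efficiency_gain(years, doubling_time=1.5):
    return 2 ** (years / doubling_time)

print(round(efficiency_gain(10), 1))  # ~101.6: "a factor of 100 every decade"
```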

Video of my talk at the VERGE conference on "The computing trend that will change everything"

I gave a talk on the implications of the trends in the energy efficiency of computing at the VERGE conference in Arlington, VA on March 15, 2012.  I just received the video and have embedded it below.  It’s what the VERGE folks call a “one great idea” talk, and preparing for this event made me really synthesize my thinking (because I had to boil it down to 10 minutes or less).  Looking back there are a couple of things I’d change, but it’s not a bad summary of my current thinking on the topic.

The key factors causing coal plant retirements

Dave Roberts has posted a cogent summary of the key implications of the new EPA rule related to new coal plants.  These points follow:

1. This rule applies to new power plants, not existing or currently permitted plants, meaning it won’t have any real effect until well after the election.
2. The rule marks (but did not cause) the end of new coal plants in the U.S.
3. This rule is the easier one. The tougher one, which will apply to existing plants, will come later, probably after the election.
4. The ongoing wave of coal-plant retirements has little to do with EPA rules.
5. EPA rules are not job killers or economic burdens.
6. Some day the natural gas bubble will pop and prices will return to earth.

I want to focus on #4 (and on a terrific paper by Susan Tierney on that topic), because it gives interesting insight into the factors that cause utilities to retire power plants.

The first thing to understand (as I pointed out in Cold Cash, Cool Climate) is

About 15% of existing US coal plants (about 50 GW out of 300 GW total) are old, inefficient, polluting plants that were grandfathered under the Clean Air Act, so they have few or no pollution controls.[1] More than half of US coal plants are 35 years of age or older.[2] The total social cost of running many of these plants is higher than the cost of alternative ways of supplying that electricity (even without counting the damages from greenhouse gas emissions),[3] so they represent an obsolete capital stock from society’s perspective.

These older plants are comparatively inefficient, even though they have few or no pollution controls, so it’s not surprising that they are the ones being retired in the face of the economic forces outlined by Tierney.  That report points to increasing coal prices, decreasing natural gas prices, and declining electricity demand as the main factors thus far in encouraging the retirement of existing coal plants.

While these factors have played a dominant role so far, it’s likely that the EPA mercury and air toxic rules were the final “nail in the coffin” for some of these plants, and these rules will encourage additional plants to retire in the years ahead.  This is because the cost of retrofitting these plants to meet the new standards is large enough to make the utilities think twice about retrofitting, especially when so many lower emission power generation alternatives are available.

What these rules are doing is starting to bring the societal costs of coal-fired generation to bear on utility decisionmaking.  Until recently, these old coal plants were getting a free ride, belching out pollution and not paying for the true costs of the electricity they generate.  That’s now changing, and our society will be better for it.

So if you hear someone complaining that these regulations cost too much, always ask “Cost to whom?”  What they mean is that the regulations will cost utilities money, and that’s often (but not always) true.  That’s not the same thing as saying that the regulations are not worth it from society’s perspective, and that’s really the metric we need to apply.  And the numbers bear this out: Isaac Shapiro of the Economic Policy Institute tallied the costs and benefits of the Obama administration’s three big proposed EPA rules (not including this latest one on new power plants) and found benefit-cost ratios ranging from 6:1 to 15:1, and total net benefits for the US (after counting costs) of $60 to almost $200 billion per year.  Sounds like an awfully good deal to me.


[1] Celebi, Metin, Frank C. Graves, Gunjan Bathla, and Lucas Bressan. 2010. Potential Coal Plant Retirements Under Emerging Environmental Regulations. The Brattle Group, Inc.  December 8. [http://www.brattle.com/documents/uploadlibrary/upload898.pdf]

[2] See Figure 5-6 in Lovins, Amory B., Mathias Bell, Lionel Bony, Albert Chan, Stephen Doig, Nathan J. Glasgow, Lena Hansen, Virginia Lacy, Eric Maurer, Jesse Morris, James Newcomb, Greg Rucks, and Caroline Traube. 2011. Reinventing Fire:  Bold Business Solutions for the New Energy Era. White River Junction, VT: Chelsea Green Publishing, p. 175.

[3] For details, see Muller, Nicholas Z., Robert Mendelsohn, and William Nordhaus. 2011. “Environmental Accounting for Pollution in the United States Economy."  American Economic Review vol. 101, no. 5. August. pp. 1649–1675, and Epstein, Paul R., Jonathan J. Buonocore, Kevin Eckerle, Michael Hendryx, Benjamin M. Stout III, Richard Heinberg, Richard W. Clapp, Beverly May, Nancy L. Reinhart, Melissa M. Ahern, Samir K. Doshi, and Leslie Glustrom. 2011. "Full cost accounting for the life cycle of coal."  Annals of the New York Academy of Sciences.  vol. 1219, no. 1. February 17. pp. 73-98. [http://dx.doi.org/10.1111/j.1749-6632.2010.05890.x].

EPA to promulgate greenhouse gas emissions standards for power plants!

This is big news.  The Washington Post is reporting that the Obama Administration is about to propose regulations on new power plants that would limit greenhouse gas emissions per kWh.  Juliet Eilperin writes

The Environmental Protection Agency will issue the first limits on greenhouse gas emissions from new power plants as early as Tuesday, according to several people briefed on the proposal. The move could end the construction of conventional coal-fired facilities in the United States.
The proposed rule — years in the making and approved by the White House after months of review — will require any new power plant to emit no more than 1,000 pounds of carbon dioxide per megawatt [sic–should be megawatt-hour] of electricity produced. The average U.S. natural gas plant, which emits 800 to 850 pounds of CO2 per megawatt[-hour], meets that standard; coal plants emit an average of 1,768 pounds of carbon dioxide per megawatt[-hour].
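As a quick check on the numbers in the excerpt, here’s a minimal sketch comparing each plant type’s average emission rate against the proposed 1,000 lb CO2/MWh limit (the function name is mine, for illustration):

```python
# Proposed EPA standard for new power plants: no more than 1,000 lbs of
# CO2 emitted per megawatt-hour of electricity produced.
LIMIT_LBS_PER_MWH = 1000

def meets_standard(lbs_co2_per_mwh):
    return lbs_co2_per_mwh <= LIMIT_LBS_PER_MWH

print(meets_standard(850))   # average U.S. natural gas plant: True
print(meets_standard(1768))  # average coal plant: False
```

The average gas plant clears the bar with room to spare, while the average coal plant exceeds it by roughly 75%, which is why the rule effectively rules out new conventional coal without carbon capture.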

The article also describes how the older dirtier plants are already under pressure from the air toxic rules as well as cheap natural gas.  Many existing plants will be retired because they are no longer economic to run, given these new regulations.  And as recent economic analysis has shown, coal fired generation is actually costing us far more than alternatives when you count the costs of pollution to society (which until these new rules came into force weren’t being paid by the operators of coal plants, they were being paid mostly by old people and children hurt by coal related pollution).

Emission control rules like these pay for themselves many times over from society’s perspective, so all the gloom and doom raised by the pro-pollution forces about their dire economic consequences is just nonsense. We are already paying far more for electricity as a society than we need to, and by phasing out many coal power plants we’ll be REDUCING the cost of electricity to society.  These regulations will be good for the economy AND good for the environment.  For details on the unpaid external (pollution) costs of coal, see the references below.

Muller, Nicholas Z., Robert Mendelsohn, and William Nordhaus. 2011. “Environmental Accounting for Pollution in the United States Economy.” American Economic Review vol. 101, no. 5. August. pp. 1649–1675.  

Epstein, Paul R., Jonathan J. Buonocore, Kevin Eckerle, Michael Hendryx, Benjamin M. Stout III, Richard Heinberg, Richard W. Clapp, Beverly May, Nancy L. Reinhart, Melissa M. Ahern, Samir K. Doshi, and Leslie Glustrom. 2011. “Full cost accounting for the life cycle of coal.” Annals of the New York Academy of Sciences. vol. 1219, no. 1. February 17. pp. 73-98. [http://dx.doi.org/10.1111/j.1749-6632.2010.05890.x]

A brilliant graph summarizing the communications quandary about climate change

Michael Tobis (with help from Stephen Ban) posted a wonderful schematic graph back in January 2012 contrasting the state of media coverage about climate with what informed opinion actually says.  It’s not intended to be quantitatively accurate, but I think it conveys things pretty well, based on my experience in following this issue over the past couple of decades.  Here’s the graph:

Joe Romm, over at Climate Progress, has used this graph effectively to discuss the issue of false balance in media reporting.  If you talk about what the literature says about the path we’re now on (5 degrees C warming by 2100, as I discuss in Cold Cash, Cool Climate) you’re labeled by the mainstream media folks as an “alarmist”.  But what if the truth is really alarming?  I’m sure dinosaurs who raised the issue of a possible meteor strike were labeled alarmists also (I’m being facetious, folks), but that didn’t help most of those critters to survive that particular disaster when it happened.

Reality doesn’t care about what we think is plausible, and the more society ignores reality as best scientists understand it, the more likely it is that we’ll misjudge, with potentially catastrophic consequences for humanity and the earth’s natural systems.

I need your help for a guerrilla campaign to get Cold Cash, Cool Climate on the New Earth Archive book list for 2012

The New Earth Archive opened voting for the most “powerful and influential books on topics like climate change, sociology, economics, politics, technology, philosophy, and many, many other areas.”  They are asking for “help in choosing the 25 that have the power to inspire college readers to change the world,” based on the criteria listed below.

I need your help to get Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs on this list, since it is so new they didn’t list it on their default choices.  That means you’ll need to add it under “your personal recommendation” at the bottom right.  You’ll need to vote for between 10 and 15 books (they won’t let you submit fewer than 10), and you’ll need to supply your name and email address.

I suggest you paste in the following text:  Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs, by Jonathan Koomey

Many thanks for your help!

Jon

The criteria to be used to judge the books are below:

“The books that comprise this archive should educate and empower students to work towards improving the world around them, and should inspire open-mindedness and acceptance of new, more beneficial ideas and lifestyles. For this goal to be successful, the books that populate this collection should be:

1. Books that college students will actively want to read: they have to be informative and passionate enough to capture and then hold students’ attention.

2. Powerful enough to change the reader’s mind about these issues, encourage new perspectives, and promote acceptance of alternate or controversial ideas.

3. Provocative and motivational: they have to genuinely drive readers to want to change the world.

4. Inspiring and positive, and offer solutions that encourage pursuit of holistic and truly fulfilling lifestyles.

5. Recent (within the past 10-15 years) and still relevant to our world’s rapidly changing conditions.

6. Big ideas with a decades-long, worldwide scope: topics that span the environment, economics, people, politics, and revelations in understanding human nature.

Rocky Mountain Institute (RMI) just posted a short summary of Cold Cash, Cool Climate

Rocky Mountain Institute (RMI), where I am affiliated as a Senior Fellow, just posted my short summary of key arguments from Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs. For the full post, go here.
