This is a huge deal: Exxon agrees to evaluate the "stranded asset" risk associated with climate action


The financial markets are starting to realize that the “booked” reserves of the fossil fuel companies are based on a fallacy:  these estimates assume that those fossil fuel resources can be burned while still maintaining a stable climate.  As I and others have written for years, we must keep a significant fraction (about three quarters) of known fossil resources in the ground to have any hope of stabilizing global temperatures at or near 2 C above preindustrial times.
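To make the arithmetic behind that fraction concrete, here’s a minimal back-of-the-envelope sketch.  The reserve and budget figures below are round numbers I’ve chosen for illustration, not published estimates:

```python
# Back-of-the-envelope carbon budget arithmetic (illustrative round numbers,
# not published estimates).
reserves_gtco2 = 2800.0  # assumed CO2 embodied in known fossil resources (GtCO2)
budget_gtco2 = 700.0     # assumed remaining CO2 budget for a ~2 C limit (GtCO2)

burnable_fraction = budget_gtco2 / reserves_gtco2
stranded_fraction = 1.0 - burnable_fraction

print(f"Burnable fraction of known reserves: {burnable_fraction:.0%}")    # 25%
print(f"Fraction that must stay in the ground: {stranded_fraction:.0%}")  # 75%
```

With these round numbers, about three quarters of the booked reserves can never be monetized under a 2 C limit, which is exactly the stranded asset risk the new disclosures are meant to surface.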

Yesterday Exxon Mobil made a major announcement, summarized by the New York Times as follows:

Energy companies have been under increasing pressure from shareholder activists in recent years to warn investors of the risks that stricter limits on carbon emissions would place on their business.
On Thursday, a shareholder group said that it had won its biggest prize yet, when Exxon Mobil became the first oil and gas producer to agree to publish that information by the end of the month.
In return, the shareholders, led by the wealth management firm Arjuna Capital, which focuses on sustainability, and the advocacy group As You Sow, said they had agreed to withdraw a resolution on the issue at Exxon Mobil’s annual meeting.

There has been some excellent coverage in the mainstream media on this development, the key articles of which I list below.

Wall Street Journal

New York Times

Reuters

MarketWatch

The Guardian

The Dallas Business Journal

Ceres.org, which was heavily involved in obtaining Exxon’s agreement for these disclosures, deserves great credit for helping make this happen.

This development is so important because once markets realize there’s an arbitrage opportunity, they relentlessly chip away at it until it is eliminated.  And the stranded fossil asset arbitrage opportunity is one that’s worth many trillions of dollars.  So the pressure will continue to build, and the disclosures will soon focus attention on this asset risk in a way that simply hasn’t happened before.  That attention will become a flood very rapidly.  It’s the beginning of the end of the fossil fuel economy, but the big players just don’t realize it yet (or if they do, they’re not admitting it).

As I wrote in my recent article titled “Moving Beyond Benefit-Cost Analysis of Climate Change” in the peer reviewed journal Environmental Research Letters:

Meeting the 2 Celsius warming limit implies that a significant fraction of proved fossil fuel reserves simply can’t be burned [4], or we’ll need to figure out a way to sequester carbon in a safe way (which is not currently feasible on the scales needed, though it has been proved in some applications).  This line of argument has achieved recent prominence through the writings of Bill McKibben [23] and Al Gore [24], but it was first put forth in 1989 in Krause et al. [9], and it’s a direct result of the “working forward toward a warming limit” method.
This conclusion is ominous for those now fighting to build more emissions intensive infrastructure.  There’s a real business risk to them, because once the world finally accepts that rapid reductions of emissions are required (which must happen soon if we’re to have any chance of stabilizing the climate), those investors will lose their money.  When markets turn, they do so with terrifying speed, and this time will be no exception.
Refs cited
4.         Koomey, Jonathan G. 2012. Cold Cash, Cool Climate:  Science-Based Advice for Ecological Entrepreneurs. Burlingame, CA: Analytics Press.
9.         Krause, Florentin, Wilfred Bach, and Jon Koomey. 1989. From Warming Fate to Warming Limit:  Benchmarks to a Global Climate Convention. El Cerrito, CA: International Project for Sustainable Energy Paths.
23.       McKibben, Bill. 2012. “Global Warming’s Terrifying New Math.” In Rolling Stone Magazine. July 19.
24.       Gore, Al, and David Blood. 2013. “The Coming Carbon Asset Bubble.” The Wall Street Journal (online).   October 29.

Back in the late 1980s when I first started working on this issue, my colleagues and I knew we would eventually see a turning point in this battle to stabilize the climate, but we’ve all been shocked by how long it’s taken. We’ve known the basic outlines of the problem since then, and the warnings have only gotten more dire.  Let’s hope this is the turning point we need to finally move towards rapid emissions reductions.  There’s simply no more time to waste.

Yale Environment 360 on energy harvesting

Cheryl Katz at Yale Environment 360 wrote a nice article on energy harvesting in which I’m quoted.  Here are the two intro paragraphs:

Computers feasting on their own exhaust heat. Super-efficient solar panels snaring lost thermal energy and recycling it into electricity. Personal electronics powered by stray microwaves or vibration-capturing clothing. Cellphones charged with a user’s footsteps. These and more innovations may be possible with free, green energy that is now going to waste.

Ubiquitous sources like radio waves, vibration and pressure created by moving objects, heat radiating from machines and even our bodies — all have the potential to produce usable electric power. Until recently, ambient energy was largely squandered because of a lack of ways to efficiently exploit it. Now, advances in materials and engineering are providing tools to harvest this abundant resource and transform it into cheap, clean electricity.

Photo credit:  Lawrence Berkeley National Laboratory

The Bacteriophage Power Generator is made of biopolymer layers that produce electricity when squeezed.

Energy harvesting is one aspect of the technological revolution that’s driving the creation and adoption of battery-powered information technology.  We explore this and other trends in computing efficiency, communications efficiency, low-power “sleep states”, and battery technology in our “Smart Everything” article in the Annual Review of Environment and Resources last fall.  To download a free copy of that article, go here.

Koomey, Jonathan G., H. Scott Matthews, and Eric Williams. 2013. “Smart Everything:  Will Intelligent Systems Reduce Resource Use?"  The Annual Review of Environment and Resources.  vol. 38, October. pp. 311-343.

My new blog post on the Corporate Eco Forum site: "Bringing Enterprise Computing into the 21st Century: A Management and Sustainability Challenge"

The Corporate Eco Forum (CEF) just posted a new essay of mine titled "Bringing Enterprise Computing into the 21st Century: A Management and Sustainability Challenge".

Here are two key intro paragraphs:

Most companies view IT as a cost center, but the most effective users of these new technologies understand that IT can be a cost-reducing profit center that also improves environmental performance. Reforming IT is imperative for organizations that want to innovate faster and compete more effectively, a fact that modern enterprises ignore at their peril.

The critical factor is senior management attention to restructuring enterprise IT, since addressing these issues is primarily an organizational challenge that is “above the pay grade” of those responsible for datacenter operations. Companies that tackle this challenge will reap rewards in increased revenues, lower costs, faster innovation, and greater business agility. It’s a win-win-win-win, but most upper managers ignore this potential, treating IT as a cost of doing business that they simply must pay.

I’ll be doing a webinar for CEF members on this topic tomorrow morning at 9am PT.

My talk on energy and the information economy at the Physics of Sustainable Energy conference on March 8th, 2014

I gave a talk on energy and the information economy at the Physics of Sustainable Energy Conference on March 8th, 2014.  The event was sponsored by the American Physical Society, which has run this conference series for at least three cycles.  It was a great event, and I got to see many friends and colleagues, including Joanna Lewis, Bob Budnitz, Dan Kammen, Art Rosenfeld, Amory Lovins, and many others.  See below for the presentation.

My keynote interview with Joel Makower of GreenBiz from the Fall 2013 VERGE conference in San Francisco

GreenBiz just posted video of my keynote interview with Joel Makower from last fall’s VERGE conference in San Francisco, which took place on October 16, 2013.  The key message:  Information technology is our “ace in the hole” when it comes to facing the climate challenge.

In August 2013, Heather Clancy of GreenBiz did a text interview with me that explored some of these issues in more detail.

Watch it:

More on journalism in a scientific age

Joel Achenbach at the Washington Post has posted today a must-read article about pseudo science and journalism.  The final two paragraphs echo the findings in my recent post titled “Separating Fact From Fiction”:

There’s nothing at stake here except the survival of credible journalism. For those who are trying to figure out a business model for journalism — and I desperately want these folks to be successful — let me suggest that the ultimate killer app is quality. Quality comes in many forms. In the news business, being fast — ideally first — is a form of quality. Packaging the material in a beautiful way visually is another virtue. But the ultimate virtue in this business is getting it right.
I know that in turning this item into a screed I run the risk of declaring myself an insufferable fogey, and you can see me sprouting mutton-chop sideburns and wearing a monocle. I know, I know: There is no future in being boring. But getting it right, in the long run, will pay off. News executives should not assume that there is a digital gimmick, or technique, or facility with visuals, or dexterity with software, that will mask a deficit in comprehension and expertise. The audience is smarter than that. The audience will reward accuracy and intelligence. At least that’s what I believe — perhaps as matter of faith more than anything else.

Amen to that. In support of this salutary admonition, one of the commenters on the article cited the old journalism maxim, “Get it first.  But first, get it right”.

I did find myself startled by the idea that getting facts straight is somehow “boring”.  Who thinks this?  Anyone?  If so, shame on them.

Achenbach also points to the excellent Knight Science + Journalism Tracker at MIT, which documents how the media deals with scientific topics.

WNYC Radio show New Tech City asks whether we should delete more of our digital history

I was interviewed for the WNYC radio show New Tech City last week, and the story appeared today.  It’s a fun piece, and worth listening to.

My one quibble is that the report explicitly creates false balance:  it contrasts my data- and analysis-based view of how much electricity information technology (IT) uses with the Greenpeace view that electricity used by IT is large and growing rapidly.  As Katie Fehrenbacher and I discussed previously, Greenpeace performed a valuable service by focusing attention on the carbon intensity of the electricity production feeding data centers, but its reports have contained numerical inaccuracies in the past.  Arguing (as the radio show does) that my views and the Greenpeace statements somehow result in a draw on the environmental front is as clear an example of false balance as I’ve been able to find.

In any case, the radio show raises some interesting questions, so have a listen.  (Note:  I removed the embedded audio from this blog post because it wasn’t playing properly.)

Separating Fact From Fiction: A Challenge For The Media

This article is an abridged, re-edited, and amended version of an article that recently appeared as Koomey, Jonathan. 2014. “Separating Fact From Fiction:  A Challenge For The Media.” In IEEE Consumer Electronics Magazine. January. pp. 9-11.

“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”
–Josh Billings

One of the less endearing features of the information age is the endless proliferation of attention-getting “factoids” that “just ain’t so.”  Take, for example, the amount of electricity associated with accessing the Internet through your smart phone.  A recent coal industry-funded study [1] claimed that the iPhone uses as much electricity as two refrigerators when you count the energy needed to make it, run it and power the “behind-the-wall” equipment to deliver data to the device.  Discussion of the original report (The Cloud Begins with Coal, hereafter CBC, written by Mark P. Mills) showed up on the Breakthrough Institute web site, Time Magazine Online, MSN News, the Huffington Post, MarketWatch, and Grist, among others (with most focusing on the comparison between a smart phone and one refrigerator [2]).

When I heard this claim, it took me back to the year 2000, when Mr. Mills and Peter Huber first made the claim [3] that the networking electricity for a wireless Palm VII exceeded the electricity for running a refrigerator. This claim, and the related ones about the total electricity used by computing and communications, turned out to be bunk, with the electricity used by a wireless Palm VII overestimated by a factor of 2000 [4, 5, 6, 7, 8, 9, 10, 11, 12, 13].  Not surprisingly, the new claim about smart phones and refrigerators also was a gross overestimate of the electricity associated with mobile phones [14, 15].

The danger, of course, is that policy makers and business planners will be misled by such claims, as they surely were last time, and make consequential mistakes [7].  The more important story, however, revolves around the credibility of the source and how the media treat him.  When scientists make claims that are repeatedly shown in the peer-reviewed literature to be incorrect, they are expected to retract their results and admit their errors.  That’s how science is supposed to work. If they refuse to accept these results and continue to make the same misstatements without presenting empirical evidence to support them, they are effectively ostracized from the scientific community and aren’t taken seriously again.

Professional reputation is precious and perishable, and those who violate the scientific code of conduct face intellectual exile. It’s hard to regain your scientific reputation once it’s ruined, but the media world is different.  Like Mr. Mills, many people make incorrect statements but continue to get media attention even after their claims have been soundly refuted in the technical literature (Bjorn Lomborg, of Skeptical Environmentalist fame, is another archetypal example). It shouldn’t be that way, of course, but the media continue to fall into this trap.

Why does this keep happening? I can think of at least four reasons.

1) The Newness Filter:  Our current fast-paced media world prioritizes “newness” of information, with typical news cycles becoming ever shorter.  This bias towards publishing the most recent information as quickly as possible makes it difficult to get attention paid to in-depth refutations, which often take time to produce and detailed articles to present (by the time they are finished, the issue they address is already “old news”).  In addition, there’s a structural bias that works in favor of people creating incorrect factoids:  It’s much easier and quicker to fabricate factoids that look new, novel, and newsworthy than to debunk them (which often requires significant time, technical training, and careful peer review).
2) The Profit Motive:  The pressure on news media to make profits is often antithetical to careful reporting on technical issues.  Traditional newspapers have been in a death spiral for years, as the cash cow of classified advertising has been displaced by vastly more efficient online marketplaces, but most other media sources have been under pressure as well.  There are few dedicated science reporters nowadays, because training someone on those complex issues isn’t cheap, and media organizations cutting staff left and right just can’t afford that anymore.
Financial constraints also result in constant pressure to increase the number of readers or viewers (in large part to attract well-heeled advertisers), which means that news organizations become beholden (or at least more sympathetic) to the status-quo interests who have the money to advertise.   While this may not be a concern for some policy issues, any societal problem that threatens major status quo interests (like climate change) can’t fail to get short shrift when news organizations are so dependent on advertising from those very interests.
3) The Romance with Contrarians:  The environmentalist turned skeptic always gets a lot of attention. It’s an old and simple story, and one that in many journalists’ eyes confers credibility on the source.  But it’s also an easy storyline to fabricate, and it’s important for anyone reading work by self-proclaimed environmental contrarians to view their credentials and their claims with added skepticism.
4) The Quest for “Balance”:  The safest approach for journalists reporting on contentious topics is to stick to what different people say about the issue, writing “He said this, she said that.”  But what if there is really a right answer, and people making false and misleading arguments are in the pocket of powerful vested interests?  This has happened many times before, with cigarettes, lead, asbestos, and most recently on the climate issue [16].  The public needs someone (presumably journalists) to report when someone makes claims at odds with the current scientific consensus, so citizens can make an informed judgment.  Without such information, news stories convey false equivalence between claims that are technically accurate and those that aren’t.  That omission creates a bias towards preserving the status quo, one not based on evidence but on a cultural presupposition by journalists that “there are always two sides of a story”.

So most media outlets just don’t report sensibly about technical topics.  What can they do to fix this problem?

First, news organizations shouldn’t report on technical issues, no matter how delectable the factoid, unless the reporter and editor really understand the topic and have talked with the relevant experts.

Second, it’s important for journalists to understand that on many technical issues, there really are right and wrong answers, and that the safe and comfortable path (to create “balance”, also known by many outsiders as “he said, she said journalism”) is a one-way ticket to really messing up the story.  It may be appropriate for some political debates to report both sides in this way, but the public needs the media to sound the alarm when serial obfuscators are trying to pull the wool over their collective eyes.

Third, new business models for journalism that remove the profit motive may result in more accurate reporting on complicated issues, and these new approaches should be an active area of experimentation by those with the means to do so.

Finally, in this complex technological world, we need more journalists who are trained in technical disciplines.  How can we hope to address difficult issues like climate change when many journalists just aren’t able to evaluate technical claims with reliability?

The initial attention paid to Mr. Mills’ retread claims about iPhones and refrigerator electricity use provides a cautionary tale for a society struggling to deal with complex issues like climate change in the 21st century.  To face this challenge, we need to alter the way media reporting is conducted on technical issues. Here’s hoping we figure out how to make the necessary changes, and fast.  On climate, biotechnology, and many other technical topics, there’s simply no more time to waste.

About the author

Jonathan Koomey is a Research Fellow at the Steyer-Taylor Center for Energy Policy and Finance at Stanford University.  He’s the author of Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs, Turning Numbers into Knowledge:  Mastering the Art of Problem Solving, and more than 200 other books, articles, and technical reports. For a fun summary of the iPhones vs. fridges smackdown, see Adriene Hill’s radio story on Marketplace.

References

1.         Mills, Mark P. 2013. The Cloud Begins with Coal:  Big Data, Big Networks, Big Infrastructure, and Big Power–An Overview of the Electricity Used by the Global Digital Ecosystem. Digital Power Group.  August. [http://www.tech-pundit.com]

2.         Koomey, Jonathan. 2013. The Electricity Used by iPhones and Refrigerators, Take Two, August 25, 2013

3.         Huber, Peter, and Mark P. Mills. 2000. “Got a Computer?  More Power to You.” Wall Street Journal.  New York, NY.  September 7. p. A26.

4.         Kawamoto, Kaoru, Jonathan Koomey, Bruce Nordman, Richard E. Brown, Maryann Piette, Michael Ting, and Alan Meier. 2002. “Electricity Used by Office Equipment and Network Equipment in the U.S."  Energy–The International Journal (also LBNL-45917).  vol. 27, no. 3. March. pp. 255-269.

5.         Koomey, Jonathan G. 2000. Rebuttal to Testimony on ‘Kyoto and the Internet: The Energy Implications of the Digital Economy’. Berkeley, CA: Lawrence Berkeley National Laboratory. LBNL-46509.  August.

6.         Koomey, Jonathan. 2003. “Sorry, Wrong Number:  Separating Fact from Fiction in the Information Age.” In IEEE Spectrum. June. pp. 11-12.

7.         Koomey, Jonathan. 2008. Turning Numbers into Knowledge:  Mastering the Art of Problem Solving. 2nd ed. Oakland, CA: Analytics Press. [http://www.analyticspress.com]

8.         Koomey, Jonathan, Kaoru Kawamoto, Bruce Nordman, Mary Ann Piette, and Richard E. Brown. 1999. Initial comments on ‘The Internet Begins with Coal’. Berkeley, CA: Lawrence Berkeley National Laboratory. LBNL-44698.  December 9.

9.         Koomey, Jonathan, Chris Calwell, Skip Laitner, Jane Thornton, Richard E. Brown, Joe Eto, Carrie Webber, and Cathy Cullicott. 2002. “Sorry, wrong number:  The use and misuse of numerical facts in analysis and media reporting of energy issues.”  In Annual Review of Energy and the Environment 2002. Edited by R. H. Socolow, D. Anderson and J. Harte. Palo Alto, CA: Annual Reviews, Inc. (also LBNL-50499). pp. 119-158.

10.       Koomey, Jonathan, Huimin Chong, Woonsien Loh, Bruce Nordman, and Michele Blazek. 2004. “Network electricity use associated with wireless personal digital assistants.”  The ASCE Journal of Infrastructure Systems (also LBNL-54105).  vol. 10, no. 3. September. pp. 131-137.

11.       Romm, Joe, Arthur Rosenfeld, and Susan Herrmann. 1999. The Internet Economy and Global Warming. Washington, DC: Center for Energy & Climate Solutions.

12.       Romm, Joe. 2002. “The Internet and the new energy economy.”  Resources, Conservation, and Recycling.  vol. 36, no. 3. October. pp. 197-210.

13.       Roth, Kurt, Fred Goldstein, and Jonathan Kleinman. 2002. Energy Consumption by Office and Telecommunications Equipment in Commercial Buildings–Volume I:  Energy Consumption Baseline. Washington, DC: Prepared by Arthur D. Little for the U.S. Department of Energy. A.D. Little Reference no. 72895-00.  January.

14.       Koomey, Jonathan. 2013. Wild Claims about Electricity Used by Computers that Just Won’t Die (But Should), August 19, 2013

15.       Koomey, Jonathan. 2013. Does Your iPhone Use As Much Electricity As A New Refrigerator? Not Even Close, Climate Progress, 2013

16.       Oreskes, Naomi, and Eric M. Conway. 2010. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. New York, NY: Bloomsbury Press.

Our panel at the Open Compute Summit: Bringing integrated design, mass production and learning by doing to the data center industry

Last week I moderated a panel at the Open Compute Summit V in San Jose that focused on bringing integrated design, mass production and learning by doing to the data center industry.  The great irony is that information technology (IT) has had amazing success at transforming other industries, reducing costs and increasing the speed of innovation, but most enterprise IT is still provisioned using archaic rules of thumb and decades-old institutional arrangements.  The panel explored potential solutions to that problem.  My fellow panelists were Jim Stogdill of O'Reilly, Kushagra Vaid of Microsoft, and Sherman Ikemoto of Future Facilities.

Another example of why strong environmental regulations are needed

The chemical spill that fouled West Virginia’s water supply provides yet another example (as if any more were needed) showing why unregulated markets will underinvest in environmental protection.

The Wall Street Journal has a summary here (subscription required):

Freedom Industries Inc., the company connected to a chemical spill that tainted the water supply in West Virginia, on Friday filed for Chapter 11 bankruptcy protection.
A bankruptcy petition signed by the company’s president, Gary Southern, estimates Freedom’s debts at $10 million or less, but the cost of disaster is likely to run much higher.
Thousands of gallons of an allegedly toxic chemical called crude MCHM contaminated the water supply for hundreds of thousands of the state’s residents for days, spawning lawsuits from businesses and people affected by the disaster.

If the company is found liable for damages from the spill, bankruptcy protection prevents the owners from being held fully accountable for damages to others.  The result is that society will end up paying for the costs, so profits have been privatized, and losses socialized.  In addition, companies that behave responsibly and institute best practices  will be penalized by higher costs relative to less responsible companies.

If there are regulations, then everyone in the marketplace has to pay the costs, so no one is disadvantaged.   There can be no clearer argument for why government regulation of environmental damages is needed.  Otherwise there will be a race to the bottom that results in society paying the price.

There are other important issues.  Consider this sentence that occurs later in the WSJ story:

As the chemical isn’t regulated, there are “no published standards” for acceptable levels of concentration in water, according to Freedom.

According to Wired Science, the chemical involved in the spill was

one of 62,000 industrial compounds grandfathered in with passage of the 1976 Toxic Substances Control Act. Basically that meant that no further testing was required; the assumption was that these chemicals, which apparently hadn’t killed anyone yet, were unlikely to do so.

A recent New Yorker story also lays out more details about the lack of oversight, writing

It was, apparently, no one’s job to regularly monitor Freedom Industries’ tanks along the Elk, even though state officials knew that hazardous chemicals were sitting near the West Virginia American Water intake.

This points to fundamental problems with current US regulation of chemicals.  First, many chemicals have been grandfathered, but we have no idea whether these chemicals are toxic or carcinogenic (because no one has an incentive to test them).  Second, we assume that chemicals are safe unless proven otherwise, instead of testing chemicals before we put them into widespread use.  This assumption leads to greater costs (because manufacturers have to pull products from the shelves if they are found to be problematic, as they did for BPA in children’s products), but it will also lead to harm that may go undiscovered for years or decades, since the onus is on people with concerns to prove harm, instead of making the chemical companies show safety before putting chemicals on sale.  Finally, the existing regulations on such chemicals are fragmented, incomplete, and often nonexistent, so substantial reforms are needed.

It’s time to change our fundamental assumptions about chemicals, so that all such materials must be tested before they are put into products or used for industrial processes.  It’s also time to work through that massive backlog of grandfathered chemicals and figure out which of them are safe and which aren’t, then follow up with appropriate regulations.  And it’s time to make sure all toxic substances are regulated effectively, and that none fall through the cracks.

These conclusions imply that we’ll need government action to fix these environmental problems, and that the “government is the problem” crowd will need to be beaten back, yet again.  As I wrote in Cold Cash, Cool Climate (as well as this blog post)

What we need is an honest discussion about what kind of government we want and what we want it to do for us.  Sometimes we’ll want more government, like when we find lead in children’s toys, salmonella in peanut butter, poison in medicines, an unsustainable health care system, or fraudulent assets and a lack of transparency in the financial world.  We know from experience that only government can fix those things. Sometimes we’ll want less government, like when old and conflicting regulations get in the way of starting innovative new companies. Only government can fix that too (although the private sector has some lessons to teach on that score). And sometimes we’ll want the same government, just delivered more efficiently (like the state of California has done with the Department of Motor Vehicles in recent years, the good results of which I’ve experienced firsthand).
When it comes to government, more is not better. Less is not better.  Only better is better. And better is what we as a society should strive for.

It is strange that more than fifty years after Rachel Carson’s Silent Spring, we still haven’t fundamentally reformed how we treat chemicals we put in the environment or in our bodies.  It’s long past time to fix that.

Addendum:  What I’ve suggested above amounts to changing our expectations about property rights, which is a topic I explored in detail in Cold Cash, Cool Climate and in this blog post.  Government defines property rights and those rights can (and should) evolve over time as the economy changes.  There’s nothing radical about making such a change, and doing so in a sensible way would allow society to move towards a more sustainable society with minimal disruption to the economy.  The end result would be a society where the costs of environmental damage are borne more completely by the creators of that damage, which is both efficient and just.

Addendum 2:  Climate Progress gives more examples of how companies can use bankruptcy to avoid accountability for environmental damages.

The new omnibus spending agreement has at least three provisions that are bad for the climate

The National Journal (among others) is reporting that

The $1 trillion federal spending bill that lawmakers unveiled Monday night would soften an Obama administration climate-change policy that greatly restricts U.S. support for coal-plant construction in developing nations…
The bill also targets a policy to phase out inefficient light bulbs in the U.S. that was contained in a bipartisan 2007 energy law but has since fallen out of favor with conservatives.

More details are at Wonkblog and Environment and Energy Daily (which also reports that language preventing the Obama Administration from changing the rules on mountaintop mining was inserted into the bill–that’s another blow against efforts to internalize the massive external costs associated with coal mining, extraction, and use).

The lighting standards have been a political football for a while, and the National Journal correctly notes that manufacturers are phasing in the new bulbs rapidly regardless (they strongly support the standards, as do efficiency and consumer groups).

As is clear to anyone who’s paying attention, we’re running out of time to act on the climate issue.   Stopping coal exports and the construction of new coal plants, both at home and abroad, should be at the top of our list.

The Administration’s effort to stop international support for coal plants is one of the few bright spots in climate policy over the past few years, and it’s disappointing that the people negotiating the spending bill sacrificed that effort, the possibility of stricter mining regulations, and the lighting standards.  These latest developments indicate that the political class (even the folks who supposedly care about the climate issue) really doesn’t understand the urgency of the problem.

Update:  Climate Progress has posted a comprehensive summary of the provisions in the omnibus bill affecting energy and environment.

Moving beyond benefit-cost analysis of climate change

My invited perspective article, “Moving beyond benefit-cost analysis of climate change”, was just posted by the open access on-line journal Environmental Research Letters.  Here’s the abstract and the introduction.

Abstract
The conventional benefit–cost approach to understanding the climate problem has serious limitations. Fortunately, an alternative way of thinking about the problem has arisen in recent decades, based on analyzing the cost effectiveness of achieving a normatively defined warming target. This approach yields important insights, showing that delaying action is costly, required emissions reductions are rapid, and most proved reserves of fossil fuels will need to stay in the ground if we’re to stabilize the climate. I call this method ‘working forward toward a goal’, and it is one that will see wide application in the years ahead.
Introduction
The recent article in ERL by Luderer et al [1] is exemplary in the clarity of its approach and the cogency of its recommendations, demonstrating that further dithering on the climate issue increases the costs of mitigation and makes it more difficult to achieve climate stabilization. Its findings are compelling in large part because it uses an approach to assessing the climate problem that diverges from the usual benefit–cost framing (which purports to characterize marginal costs and benefits of climate action far into the future, as, for example, in Nordhaus [2]). That divergence is all to the good, because the benefit–cost approach, while it has been useful in many contexts, has serious limitations that call into question its utility for analyzing climate change [3–8].
This new way of thinking, which I call 'working forward toward a goal’, involves assessing the cost effectiveness of different paths for meeting a normatively determined target. It has its origins in the realization that stabilizing the climate at a certain temperature (e.g., a warming limit of 2 Celsius degrees above pre-industrial times) implies a particular emissions budget, which represents the total cumulative greenhouse gas emissions compatible with that temperature goal. This approach had its first fully developed incarnation in 1989 in Krause et al [9] (which was subsequently republished in 1992 [10]). It was developed further in Caldeira et al [11] and Meinshausen et al [12], and has recently served as the basis for the International Energy Agency’s analysis of climate options for several years running [13–15].
Such an approach has many advantages. It encapsulates our knowledge from the latest climate models on how cumulative emissions affect global temperatures, placing the focus squarely on how to stabilize those temperatures. It puts the most important value judgment up-front, embodied in the normatively determined warming limit, instead of burying key value judgments in economic model parameters or in ostensibly scientifically chosen concepts such as the discount rate. It gives clear guidance for the rate of emissions reductions required to meet the chosen warming limit, thus allowing us to determine if we’re 'on track’ for meeting the ultimate goal, and allowing us to adjust course if we’re not hitting those near-term targets. It also allows us to estimate the costs of delaying action or excluding certain mitigation options, and provides an analytical basis for discussions about equitably allocating the emissions budget. Finally, instead of pretending that we can calculate an 'optimal’ technology path based on guesses at mitigation and damage cost curves decades hence, it relegates economic analysis to the important but less grandiose role of comparing the cost effectiveness of currently available options for meeting near-term emissions goals.
'Working forward toward a goal’ is a more business-oriented framing of the climate problem [4]. It mirrors the way companies face big strategic challenges, because they know that accurately forecasting the future of economic and social systems is impossible [16, 17], so they set a goal and figure out what they’d have to do to meet it, then adjust course as developments dictate. To do so, they implement many different options, evaluate continuously, and do more of what works and less of what doesn’t. Such an approach, which the National Research Council [18] dubs 'iterative risk management’, recognizes the limitations of economic models and frees us from the mostly self-imposed conceptual constraints that make it hard to envision a future much different from the world as it exists today [4].

Read more…

This article addresses the fundamental problem in the conventional benefit-cost analysis framing for the climate issue, that it’s really impossible to calculate benefits and costs decades hence.  In addition, it presents the historical roots of the emerging consensus about how to analyze this problem, showing that the basic storyline hasn’t changed much since we published the first comprehensive analysis of a 2 C warming limit in 1989 (warning:  file is more than 100 MB).  For more on this basic line of argument, see my most recent book, Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs.
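To see the “working forward toward a goal” logic in miniature, here’s a toy calculation.  It assumes illustrative round numbers and a constant exponential decline in emissions; the real analyses use full energy-economic models, so treat this only as a sketch of the method’s structure:

```python
# Toy "working forward toward a goal" calculation (illustrative numbers only).
# A normatively chosen warming limit implies a cumulative emissions budget;
# we then solve for the emissions path that respects it.
current_emissions = 35.0  # GtCO2/year, assumed round number
budget = 700.0            # GtCO2 remaining under the warming limit, assumed

# For E(t) = E0 * exp(-r * t), cumulative emissions from now onward total E0 / r,
# so staying within the budget requires r = E0 / budget.
required_rate = current_emissions / budget
print(f"Required decline rate starting now: {required_rate:.1%}/year")

# Delaying a decade at constant emissions uses up 350 GtCO2 of the budget,
# roughly doubling the required rate -- delay is costly, as the article argues.
delayed_rate = current_emissions / (budget - 10 * current_emissions)
print(f"Required decline rate after a 10-year delay: {delayed_rate:.1%}/year")
```

With these assumed numbers the required decline rate starts at 5% per year and doubles to 10% per year after a decade of delay, which is the “delaying action is costly” insight from the abstract in quantitative form.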

The full reference is

Koomey, Jonathan. 2013. “Moving Beyond Benefit-Cost Analysis of Climate Change."  Environmental Research Letters.  vol. 8, no. 041005. December 2. [http://iopscience.iop.org/1748-9326/8/4/041005/]

Cold Cash, Cool Climate a finalist in the 2013 USA Best Book Awards

Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs was recently honored as a finalist in the Science: General category of the 2013 USA Best Book Awards.  To see the list of 2013 winners, go here.

In May of this year, the book was awarded an honorable mention in the 2013 Eric Hoffer Awards, in the e-book nonfiction category.  It was also a finalist in the ‘Business: Entrepreneurship & Small Business’ category of the 2013 International Book Awards.


Our journal article titled "Smart Everything" is now available for download

The Annual Review of Environment and Resources has just released its Volume 38, which contains our latest article, titled “Smart Everything:  Will Intelligent Systems Reduce Resource Use?” (CLICK ON THE TITLE LINK TO DOWNLOAD A FREE COPY!).  The article provides a broad look at the progress of distributed mobile computing and communications in transforming economic activity and resource use.  It’s a very high quality journal (impact factor = 4.968), and I’m honored to have been commissioned to write this article in collaboration with my colleagues Scott Matthews of Carnegie Mellon University and Eric Williams of the Rochester Institute of Technology.  Please send comments and suggestions for interesting new applications of information technology, as I’m compiling these examples for some upcoming research.

Here’s the abstract:

Until recently, the main environmental concerns associated with information and communication technologies (ICTs) have been their use-phase electricity consumption and the chemicals associated with their manufacture, and the environmental effects of these technologies on other parts of the economy have largely been ignored. With the advent of mobile computing, communication, and sensing devices, these indirect effects have the potential to be much more important than the impacts from the use and manufacturing phases of this equipment. This article summarizes the trends that have propelled modern technological societies into the ultralow-power design space and explores the implications of these trends for the direct and indirect environmental impacts associated with these new technologies. It reviews the literature on environmental effects of information technology (also with an emphasis on low-power systems) and suggests areas for further research.

Koomey, Jonathan G., H. Scott Matthews, and Eric Williams. 2013. “Smart Everything:  Will Intelligent Systems Reduce Resource Use?"  The Annual Review of Environment and Resources.  vol. 38, October. pp. 311-343. [To access the article’s page on the AREE site, go to http://www.annualreviews.org/doi/abs/10.1146/annurev-environ-021512-110549]

The journal allows readers to download one copy of the article for their own personal use by clicking on the URL that’s embedded in the title of the article referenced above.  Here’s the legalese:

I am pleased to provide you complimentary one-time access to my Annual Reviews article as a PDF file. This file is for your own personal use. Any further/multiple distribution, publication, or commercial usage of this copyrighted material requires submission of a permission request addressed to the Copyright Clearance Center (http://www.copyright.com/)

The value of one watt of savings from more efficient IT equipment in a data center

In a white paper I wrote for Samsung last fall (Koomey 2012), I analyze the economics of purchasing green DRAM for servers.  Assessing the economics of efficiency improvements in IT equipment requires knowledge not just of energy prices and energy savings, but also of the avoided infrastructure costs associated with lower power computing.

For new facilities, this is a real avoided cost, but for existing facilities that are capacity constrained it’s an opportunity cost–electricity that powers unnecessarily electricity-intensive DRAM chips could have been used instead to power another server that generates useful work.  It’s only in facilities that are not capacity constrained (which are the vast minority, based on anecdotal evidence) where this opportunity cost is not relevant.  There are large variations in those infrastructure costs depending on reliability requirements and type of data center, but it’s still important to understand the rough orders of magnitude for analyzing the economics of improving the energy efficiency of computing equipment in those facilities.

I summarize the results from these calculations in the following graph.  The key result is that avoided infrastructure savings represent more than half of the economic savings associated with reducing computing electricity use in the data center (under plausible assumptions for the relevant parameters in different types of data centers).  These economic benefits really matter, and analyses that don’t count them will mislead by substantially underestimating the benefits from efficiency improvements in computing equipment.

What’s one watt of electricity savings worth in the data center?


Notes:  Infrastructure capital savings apply to new construction or existing facilities that are power/cooling constrained.  Those savings total $8.6M/MW for cloud facilities and $15M/MW for others, from the Uptime Institute.  PUE = 1.1, 1.5, and 1.8 for Cloud, New, and Existing data centers, respectively.  Electricity price = $0.039/kWh for cloud facilities and $0.066/kWh for new/existing data centers. All costs in 2012 dollars.

Avoided infrastructure costs for cloud computing case reflect the lower bound of the 2nd quartile from Stanley and Schafer 2012.   For typical data centers, avoided infrastructure costs reflect the median value from Stanley and Schafer.  Avoided infrastructure costs only apply in existing facilities when they are power and/or cooling constrained.
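The notes above contain enough parameters to reproduce the rough shape of this calculation.  Here’s a minimal sketch; the ten-year analysis period and the zero discount rate are simplifying assumptions I’ve made for illustration, not the exact method in the white paper:

```python
# Rough value of saving one watt of IT load in a data center (sketch).
# Parameters come from the notes above; the 10-year horizon and zero
# discounting are simplifying assumptions for this illustration.
HOURS_PER_YEAR = 8766  # average hours per year, including leap years
YEARS = 10             # assumed analysis period

cases = {
    # name: (infrastructure $/W of IT load, PUE, electricity price $/kWh)
    "Cloud":    (8.6,  1.1, 0.039),
    "New":      (15.0, 1.5, 0.066),
    "Existing": (15.0, 1.8, 0.066),
}

for name, (infra_per_watt, pue, price) in cases.items():
    # One watt saved at the IT equipment avoids PUE watts at the utility meter.
    kwh_saved = pue * HOURS_PER_YEAR * YEARS / 1000.0
    electricity_value = kwh_saved * price       # $ per watt over the period
    total = infra_per_watt + electricity_value  # note $8.6M/MW equals $8.6/W
    print(f"{name:8s}: ${infra_per_watt:5.2f} infrastructure + "
          f"${electricity_value:5.2f} electricity = ${total:5.2f} per watt saved")
```

Even with these simplifications, the infrastructure term exceeds half of the total in every case, which matches the key result shown in the graph.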

Here are some more details on the Uptime estimates of infrastructure costs, summarized in the green DRAM report.

The cost of data center infrastructure is commonly summarized in millions of dollars per MW of IT load (which is equivalent to dollars per watt).  According to data compiled by 451 Research (Stanley and Schafer 2012), conventional data centers range in cost from about $5M to $29M/MW (2012 dollars), with 50% of the data centers falling between $8.6M and $18.8M/MW.  The median of their sample is at $15M/MW.
After conversations with the first author of the 451 Research report (John Stanley), I decided to use the bottom of the 2nd quartile ($8.6M/MW) as the avoided cost for Cloud Computing installations.   Stanley was concerned that data centers with costs at the lowest end of the range ($5M/MW) might not be comparable in reliability or other important characteristics with the high performance data centers that make up the vast majority of the 2nd and 3rd quartiles of the distribution.   I also decided to use the 451 Group’s median numbers ($15M/MW) as the value for Typical Existing and Recent Practice Facilities because “in-house” data center facilities are the ones most heavily represented in that report’s data sample.

So if you read or create analyses of the economic benefits of improving the efficiency of computing equipment, make sure they correctly account for the avoided infrastructure costs, which are substantial even in the most efficient data centers in the world.

References

Koomey, Jonathan G. 2012. The Economics of Green DRAM in Servers. Burlingame, CA: Analytics Press.  November 2. [http://www.mediafire.com/view/uj8j4ibos8cd9j3/Full_report_for_econ_of_green_RAM-v7.pdf]

Stanley, John, and Jason Schafer. 2012. The Economics of Prefabricated Modular Datacenters. San Francisco, CA: The 451 Group/451 Research.  May 11. [https://www.451research.com/report-long?icid=2266]


Koomey researches, writes, and lectures about climate solutions, critical thinking skills, and the environmental effects of information technology.

Partial Client List

  • AMD
  • Dupont
  • eBay
  • Global Business Network
  • Hewlett Packard
  • IBM
  • Intel
  • Microsoft
  • Procter & Gamble
  • Rocky Mountain Institute
  • Samsung
  • Sony
  • Sun Microsystems
  • The Uptime Institute