This 1991 Shell documentary on climate still holds up well

The oil industry has known about the climate problem for a long time, which makes its resistance to emissions reductions somewhat problematic (that’s an understatement).  This 1991 video from Shell is exemplary in how it treats climate science, and it stands up really well.  That speaks to just how well established climate science really is.

We knew the outlines of the climate problem in the late 1980s, when we wrote the first comprehensive treatment of a 2°C warming limit (warning: 100+ MB file!). The intervening decades have filled out the picture, but the basic story remains the same.

I highly recommend this 1991 video, especially for those who are skeptical about climate science.  Shell lays out the case for climate action really well, although the intervening quarter century of technological progress has opened up new options for reducing emissions that could only be imagined in 1991.

References

Krause, Florentin, Wilfred Bach, and Jon Koomey. 1989. From Warming Fate to Warming Limit:  Benchmarks to a Global Climate Convention. El Cerrito, CA: International Project for Sustainable Energy Paths.

The 1989 book was republished in 1992 as Krause, Florentin, Wilfred Bach, and Jonathan G. Koomey. 1992. Energy Policy in the Greenhouse. New York, NY: John Wiley and Sons.

In Memoriam:  Arthur H. Rosenfeld, Eminent physicist, inspirational researcher, energy efficiency maven


Photo: Chris Calwell (left) and Jonathan Koomey (right) in 2010, presenting Art Rosenfeld with recognition of a new unit coined in his honor in a peer-reviewed article in Environmental Research Letters.  One Rosenfeld equals savings of 3 billion kWh/year (at the meter) and reductions of 3 million metric tons of carbon dioxide per year, equivalent to avoiding the need for a typical 500 MW coal plant.

______________________________________________________________________

Arthur Hinton Rosenfeld passed away peacefully early today, January 27, 2017.  He was 90 years old.

Over the course of his career he inspired thousands of students, post-docs, and other researchers to make the world a better (and more efficient) place, and motivated policy-makers to adopt these ideas with a combination of personal charm and convincing analysis. His quick wit, enthusiasm, and unrivaled personal energy made him a beloved figure in the world of energy efficiency policy and technology.

Even when expressing controversial ideas, he did it in a disarming and often whimsical way, without putting his ego into the debate. He communicated a sense of wonder and innocence, all the while recognizing the importance of getting the numbers right. He unerringly identified the right questions to ask about the right topics, and had the persistence to take research results all the way to advocacy that had real societal impact. And he did it with a friendly and collegial charm that is reflected in the fact that his students referred to him as “Art” rather than the expected “Professor Rosenfeld”.

Born in Alabama on June 22, 1926, Art spent his childhood years in Egypt, where his father was a consultant to the Egyptian sugarcane industry.  He graduated with a B.S. in physics at age 18, enlisted in the Navy towards the end of the war, and afterwards enrolled in the Physics Department of the University of Chicago, where Enrico Fermi accepted him as his last graduate student.

Art married Roselyn Bernheim in 1955. They had three children, Margaret, Anne, and Art junior (Chip).

After receiving his Ph.D. in Physics in 1954, Rosenfeld joined the physics faculty at the University of California at Berkeley, where he worked in (and, from 1969 to 1974, led) the particle physics group (“Group A”) of subsequent Nobel Prize winner Luis Alvarez at Lawrence Berkeley National Laboratory (LBNL).

The oil embargo of 1973 galvanized Art, and he began asking endless questions. Why were Bay Area offices all brightly lit at 2 AM when nobody was there? Why were California home-heating bills comparable to those in Minnesota?  Why were utilities giving away free 200-watt electric light bulbs? And why were the then-popular Eichler Homes using electric resistance heating with no roof insulation?  For what activities, and in what devices, was the US consuming energy? And what were the physics-based limits for how little energy these activities really needed?

These and other questions led Art and several of his colleagues to frame the energy problem as “How to accomplish society’s goals most efficiently and cheaply” rather than “How to supply enough energy.”  This reframing was revolutionary in an era when most people thought energy consumption and economic growth always increased in lockstep.

Following a yearlong “sabbatical” from particle physics, Professor Rosenfeld decided to continue working on the efficient use of energy, mainly in buildings.  He eventually founded the Center for Building Science at LBNL, which he led until 1994. Art attracted a cadre of talented, creative, and energetic people to LBNL in the 1970s and early 1980s, and these leaders helped Art build a world-class center for energy and environment studies.   The center also inspired a small army of students at UC Berkeley to focus on energy efficiency, and these researchers helped build the energy efficiency industry once they left the university.

Art’s contributions to the fledgling knowledge base of building science were seminal, and he is widely considered the father of energy efficiency. The Center for Building Science developed a broad range of energy efficiency technologies, including electronic ballasts for fluorescent lighting—a key component of compact fluorescent lamps (CFLs)—and a transparent coating for window glass that blocks heat from either escaping (winter) or entering (summer). He was personally responsible for developing the DOE-2 series of computer programs for building energy analysis and design, which has been the gold standard in the field for more than 25 years.

Art’s work quickly took him into the policy arena. In 1975, utilities had selected sites and requested permits for 17 GW of power plants to come online by 1987.  But long before 1987, all but 3 GW had been quietly forgotten. An even more extravagant report by Ronald Doctor of RAND in Santa Monica had projected the need for 150 GW of new power plants for California by 2000, which would have put one GW of power plants every 3 miles along the coast between San Diego and San Francisco. Art worked with legislators, regulators, and the then-new California Energy Commission to implement much less expensive efficiency policies that made those plants superfluous. California’s peak demand has been held to about 60 GW today, so in retrospect we have avoided at least $75 billion in wasted investment.

Art was a co-founder of the American Council for an Energy-Efficient Economy (ACEEE) and of the University of California’s California Institute for Energy and Environment (CIEE). He was the author or co-author of over 400 refereed publications or book chapters.

During the Clinton administration Art served from 1994 through 1999 as Senior Advisor to the U.S. Department of Energy’s Assistant Secretary for Energy Efficiency and Renewable Energy. He also served as Commissioner at the California Energy Commission (CEC), after California Governor Gray Davis appointed him in 2000. He was reappointed in 2005 by Governor Arnold Schwarzenegger.

In 2010 he returned to LBNL and was elected to the National Academy of Engineering. In that same year he was appointed Distinguished Scientist Emeritus at LBNL. Until his death he devoted his attention to an international campaign for the adoption of white roofs and “cool colored” surfaces to reduce heat islands and mitigate global warming.

His many awards and honors include the Szilard Award for Physics in the Public Interest (1986), the U.S. Department of Energy’s Carnot Award for Energy Efficiency (1993), the University of California’s Berkeley Citation (2001), the Global Energy Prize from President Medvedev of Russia (2011), the National Medal of Technology and Innovation from President Obama (2013), and the Tang Prize for Sustainable Development (2016).

When friends asked him what he does for relaxation, Art used to say “relaxing makes me nervous”.  He did enjoy going jogging every weekend, particularly with his children.

Of all his prizes he was most proud of the Enrico Fermi Award in 2006, the oldest and one of the most prestigious science and technology awards given by the U.S. government and named for his mentor. Dr. Rosenfeld received the Fermi Award from Energy Secretary Samuel W. Bodman on behalf of President George W. Bush, “for a lifetime of achievement ranging from pioneering scientific discoveries in experimental nuclear and particle physics to innovations in science, technology, and public policy for energy conservation that continue to benefit humanity.” This award recognizes scientists of international stature for a lifetime of exceptional achievement in the development, use, control, or production of energy.

Professor John Holdren, director of the White House Office of Science and Technology Policy under President Obama, says, “Art Rosenfeld had an enormous impact on U.S. energy policy, starting in the early 1970s, with his insights and compelling quantitative analyses pointing to the potential of increased end-use efficiency as the cheapest, cleanest, surest response to the nation’s energy challenges.”

Dr. Rosenfeld is survived by daughters Dr. Margaret Rosenfeld and Dr. Anne Hansen, two granddaughters and four grandsons, as well as the entire energy efficiency community.

Acknowledgement

This article was prepared by Art Rosenfeld’s former graduate students and longtime friends and admirers, Ashok Gadgil, David B. Goldstein, and Jonathan Koomey.

Additional Information

To learn more about Art Rosenfeld’s life and career, go here.

Any gifts in Art’s memory are to be made to the Global Cool Cities Alliance. (www.globalcoolcities.org)

New study from LBNL and NREL analyzing costs and benefits of renewable portfolio standards


Last month, LBNL and NREL released an important study analyzing the costs and benefits of renewable portfolio standards (RPS).  The study is a comprehensive look not just at the costs of implementing renewable energy to meet those standards but also at the benefits that accrue from doing so.

Those benefits include reduced greenhouse gas emissions, reduced criteria pollutant emissions (which cause human health impacts), reduced wholesale power prices, reduced natural gas prices, and reduced water use.

The figure below summarizes the benefits and costs, as well as other impacts that are not monetized.

Some key findings:

•   Existing RPS policies are roughly neutral on electricity system costs, while being more likely than not to reduce electricity prices.

•   Even at the high end, whatever costs might accrue from existing RPS policies are offset tenfold by the benefits of reduced natural gas prices, criteria pollutant emissions, and greenhouse gas emissions.

•   In the high renewable energy case, costs in even the highest electric system cost case are offset more than sixfold by reduced natural gas prices, criteria pollutant emissions, and greenhouse gas emissions.

Many serial disinformers (like Robert Bryce and Bjorn Lomborg) claim that renewable energy is more expensive than conventional fossil electricity generation, but as this study shows, that claim is no longer true even when just considering direct system costs for existing RPS policies.

When we include the benefits associated with reduced pollution and natural gas prices (as we should when assessing costs from the societal perspective), renewable energy is a clear winner.  Society is far better off implementing more renewable electricity generation, and with costs of utility solar, building-sector solar, and wind dropping dramatically in recent years, that conclusion will only strengthen over time.

Fossil fuels are only cheap when you don’t count all the costs.  When you do your sums correctly, renewables are far cheaper from society’s perspective, and in many cases cheaper in direct cost terms.  Fossil fuel fired electricity generation is living on borrowed time.

Read more here.

Our new report for the Risky Business project, “From Risk to Return”, out today!


For the past two years I’ve been working with a distinguished team of analysts associated with the Risky Business project to analyze possible pathways to substantially reducing greenhouse gas emissions in the US.  Our new report, From Risk to Return: Investing in a Clean Energy Economy, just came out today.

Here’s the first part of the executive summary:

In our 2014 inaugural report, “Risky Business: The Economic Risks of Climate Change in the United States,” we found that the economic risks from unmitigated climate change to American businesses and long-term investors are large and unacceptable. Subsequent scientific data and analysis have reinforced and strengthened that conclusion. As a result, we, the Co-Chairs and Risk Committee of the Risky Business Project, are united in recognizing the need to respond to the risk climate change poses to the American economy.

Now we turn to the obvious next question: how to respond to those risks. Seriously addressing climate change requires reducing greenhouse gas emissions by at least 80 percent by 2050 in the U.S. and across all major economies. We find that this goal is technically and economically achievable using commercial or near-commercial technology. Most important, we find that meeting the goal does not require an energy miracle or unprecedented spending.

The transition to a cleaner energy economy rests on three pillars: moving from fossil fuels to electricity wherever possible, generating electricity with low or zero carbon emissions, and using energy much more efficiently. This means building new sources of zero- and low-carbon energy, including wind, solar, and nuclear; electrifying vehicles, heating systems, and many other products and processes; and investing in making buildings, appliances, and manufacturing more energy efficient.

Meeting these targets requires a large-scale shift away from ongoing spending on fossil fuels and toward up-front capital investments in clean energy technologies. Many of those, such as wind and solar, have little or no fuel cost once built. Given an appropriate policy framework, we expect these investments to be made largely by the private sector and consumers, and to yield significant returns. Because of the large capital investments and the long-term savings in fuel costs, this shift presents significant opportunities for many American investors and businesses. Notably, shifting the U.S. to a low-carbon, clean energy system presents not just long term benefits but also immediate, near-term opportunities, particularly for those actors best positioned to capitalize on these trends.

Since I started analyzing greenhouse gas mitigation options in the late 1980s, the default assumption has been to take business-as-usual trends for consumption by fuel and not change them much as we searched for emissions reduction options.  Options for reducing emissions in the electricity sector have always been cheaper and more plentiful than those for industry or transportation, and that continues to be true.

The Deep Decarbonization analysis from E3 (the analysis framework on which we built From Risk to Return) was one of the first to show that the ease of reducing emissions from the electricity sector creates an opportunity: if society engages in large-scale electrification of most end uses in the economy while also improving efficiency and decarbonizing the electric grid, much larger emissions reductions become possible.  That’s the framework that led to the findings of today’s report.

Download the new report here.

Our latest on energy efficiency of computing over time, now out in Electronic Design

My colleague Sam Naffziger (AMD) and I just published our latest article titled “Energy efficiency of computing: What’s next?” in the magazine Electronic Design.  Here’s the abstract, which didn’t make it into the actual online article:  

Today’s computing systems operate at peak output a tiny fraction of the year, so peak output energy efficiency (which has slowed since the turn of the millennium) is not the most relevant efficiency metric for such devices. The more important question is whether computing efficiency in idle and standby modes (which are more representative of “typical use”) can be improved more rapidly than can peak output efficiency.  This article demonstrates that in the past eight years, the answer to that question has been a resounding yes, and we expect those more rapid efficiency improvements for computers in typical use to continue for at least the next few years.

Our original work (2011) on efficiency trends showed the energy efficiency of computing had doubled every 1.6 years since the beginning of the computer age:

Koomey, Jonathan G., Stephen Berard, Marla Sanchez, and Henry Wong. 2011. “Implications of Historical Trends in The Electrical Efficiency of Computing.”  IEEE Annals of the History of Computing.  vol. 33, no. 3. July-September. pp. 46-54. [http://doi.ieeecomputersociety.org/10.1109/MAHC.2010.28]

In that work, I didn’t examine the post-2000 period in detail.  When I re-analyzed the 2011 data, I found that improvements in peak output efficiency had slowed after 2000, with a doubling time of 2.6 years.  That result makes sense, because Dennard scaling ended around 2000.  Figure 1 in our new article shows the effect of that change.

[Figure 1]
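To make the difference between the 1.6-year and 2.6-year doubling times concrete, here is a minimal sketch (my own illustration, not from the article) that converts a doubling time into an annual improvement rate and a cumulative gain over a decade:

```python
# A minimal sketch (illustration only, not from the article): converting
# an efficiency doubling time into an annual improvement rate and a
# cumulative gain over a decade.

def annual_improvement(doubling_time_years: float) -> float:
    """Annual multiplicative improvement implied by a doubling time."""
    return 2 ** (1 / doubling_time_years)

def cumulative_gain(doubling_time_years: float, years: float) -> float:
    """Total efficiency improvement over a span of years."""
    return 2 ** (years / doubling_time_years)

for label, t in [("Pre-2000 trend (1.6-year doubling)", 1.6),
                 ("Post-2000 trend (2.6-year doubling)", 2.6)]:
    rate = 100 * (annual_improvement(t) - 1)
    print(f"{label}: {rate:.0f}% per year, "
          f"{cumulative_gain(t, 10):.0f}x over a decade")
```

Run as-is, the sketch shows roughly a 54% annual improvement (about 76x per decade) for the pre-2000 trend versus roughly 31% per year (about 14x per decade) after 2000.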

When AMD approached me with new data, I leapt at the chance to see what trends were implied in those data.  As Figure 1 shows, their trend in peak output efficiency from 2008 to 2016 lies almost exactly on the trend I found in our 2011 data.

The key insight of the new article is that there are different measures of efficiency, and that a focus on peak output efficiency is not as appropriate for many types of computing devices (whose energy use is dominated by long periods of idle, standby, and sleep).  We show that a focus on what we call “typical use” efficiency reveals more rapid improvements than are evident in peak-output efficiency in the 2008 to 2016 period, as shown in Figure 2.

[Figure 2]
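To illustrate what “typical use” efficiency means, here is a small sketch with hypothetical duty-cycle and power numbers (my own assumptions, not data from the article): annual energy is dominated by the hours a device spends in idle and sleep, which is why improvements in those modes matter so much.

```python
# Illustrative sketch with assumed (hypothetical) numbers, not data from
# the article: "typical use" energy weights each power mode by the time a
# device actually spends in it.

HOURS_PER_YEAR = 8760

# Hypothetical duty cycle and power draw for a desktop-class machine.
modes = {
    # mode:   (fraction of year, watts)
    "active": (0.10, 65.0),
    "idle":   (0.50, 30.0),
    "sleep":  (0.40, 2.0),
}

annual_kwh = sum(frac * watts * HOURS_PER_YEAR / 1000
                 for frac, watts in modes.values())
print(f"Typical-use consumption: {annual_kwh:.0f} kWh/year")

# With this duty cycle, idle and sleep account for most of the energy,
# so cutting idle/standby power improves typical-use efficiency far more
# than raising peak-output efficiency does.
```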

This new article is an expanded look at the data we first put forth in IEEE Spectrum last year:

Koomey, Jonathan, and Samuel Naffziger. 2015. “Efficiency’s brief reprieve:  Moore’s Law slowdown hits performance more than energy efficiency.” In IEEE Spectrum. April. [http://spectrum.ieee.org/computing/hardware/moores-law-might-be-slowing-down-but-not-energy-efficiency]

I summarize our latest work on computing efficiency here:  http://www.analyticspress.com/computingefficiency.html

If you’d like a copy of the original 2011 article or the Electronic Design article with the related appendices, please email me.  The full reference is:

Koomey, Jonathan, and Samuel Naffziger. 2016. “Energy efficiency of computing:  What’s next?” In Electronic Design. November 28. [http://electronicdesign.com/microprocessors/energy-efficiency-computing-what-s-next]

My review of Mann and Toles, The Madhouse Effect: How Climate Change Denial is Threatening our Planet, Destroying our Politics, and Driving Us Crazy

The sterling reputations of Michael E. Mann and Tom Toles precede them, and in The Madhouse Effect they do not disappoint.  I confess that I read all the cartoons first (I’m a big Tom Toles fan).  Then I dug into the text and found it equally enjoyable.

The writing is brilliantly clear and concise.  The science is unfailingly accurate.  And the cartoons add an immediacy, accessibility, and passion to the book that “normal” books about the climate problem usually lack.

There are many excellent treatments of climate science for lay people, but most talk about climate science without explaining first what science is and what we can reasonably expect of it.  Chapter 1, titled “Science:  How it Works”, takes on that challenge (see also Chapter 4 in my 2008 book, Turning Numbers into Knowledge: Mastering the Art of Problem Solving, titled “Peer Review and Scientific Discovery”).

Chapter 1 is critically important to understanding what I see as the key purpose of this book.  The campaign of denial and deceit against climate science is an attack on rational thinking and scientific inquiry more generally, and this book is a counterattack against that effort for a non-technical audience.

Others have made this case, notably Naomi Oreskes in Merchants of Doubt, and Mann and Toles echo and support Oreskes’ arguments, with additional context and color from Professor Mann’s experience as a practicing climate scientist who has faced down the deniers on more than one occasion (even prevailing in court).

Mann and Toles also don’t shrink from naming names, and this is one of the most important contributions of the work, particularly for practicing journalists.  The rogues’ gallery of deniers and delayers is a who’s who of people whom journalists shouldn’t cite on this topic (or probably any topic).  Their credibility is shot in the scientific community, and they should be treated as the cranks and crackpots that they are.

This assessment sounds harsh, but most of these bad actors have been at this game for decades, and their strategy is one that they’ve used many times before.  Here’s how I summarized it in Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs:

The supporters of the deniers follow a particular strategy, one that was well honed by the corporate responses to various public health and environmental issues, as documented by Naomi Oreskes and others.  They make excuses that parallel the high-level talking points summarized at Skeptical Science:

It’s not a problem.

If it is a problem, we didn’t cause it.

Even if we caused it, fixing the problem would be too expensive and cost too many jobs.

These are exactly the same points industries used in fighting government action on cigarettes, asbestos, seat belts, air bags, lead in paint and gasoline, catalytic converters, ozone depletion, acid rain, and any number of other related issues, and we need to start treating it as a deliberate strategy instead of just a legitimate line of argument to be analyzed and assessed in isolation.  That doesn’t mean industry will never raise real issues about whether and how to regulate a particular environmental problem, just that we should be more than a little skeptical whenever we hear this self-serving way of framing issues. It is especially important for members of the news media to understand this tactic, because they often unwittingly serve as megaphones for industry arguments of this form.  If they realized that this strategy is a deliberate one, they might be a bit more careful in how they characterize these stories.

Those of us who’ve been studying climate science and solutions for decades have grown weary of the deniers being treated as serious contributors to the debate.  This book makes a strong case for voting them off the island.

The only minor criticism I’d raise of The Madhouse Effect is that the treatment of the economic case for rapid climate mitigation is a bit less strong than I’d prefer, but Professor Mann isn’t an economist or a technologist, so this isn’t really surprising.  Perhaps for his next popular treatment he’ll bring in a collaborator to take a more detailed crack at that aspect of the problem!

This is just a quibble, however. It is rare to find a book on a complex topic that is so clearly written, compelling, and (dare I say it?) fun.  I loved it, and you will, too.

References

Koomey, Jonathan. 2008. Turning Numbers into Knowledge:  Mastering the Art of Problem Solving. 2nd ed. Oakland, CA: Analytics Press.

Koomey, Jonathan G. 2012. Cold Cash, Cool Climate:  Science-Based Advice for Ecological Entrepreneurs. Burlingame, CA: Analytics Press.

Mann, Michael E., and Tom Toles. 2016. The Madhouse Effect: How Climate Change Denial is Threatening our Planet, Destroying our Politics, and Driving Us Crazy. New York, NY: Columbia University Press.

Oreskes, Naomi, and Erik M. Conway. 2010. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. New York, NY: Bloomsbury Press.

Our new Oil Climate Index 2.0 is live!

We just expanded and updated the Oil-Climate Index (OCI) from the first version that we released in 2015.

Project collaborators at the Carnegie Endowment for International Peace, Stanford University, and the University of Calgary have now collected enough open-source data to model the climate impacts of 75 global oils—25 percent of current production. Our results can be found on the new OCI 2.0 web tool at OCI.CarnegieEndowment.org. The OCI’s new look and functionality include a global oil map, oil field boundaries, flaring data, carbon tax calculator, in-depth comparison tools, information on related oils, and more.

This release features a new OCI publication, “Getting Smart About Oil in a Warming World.” And you can view demonstration videos that pose critical questions about oil-climate responsibilities and strategies. Stay tuned for a forthcoming report that highlights promising supply chain innovations in the oil sector. These and all related publications, events, and media are (or will be, for future pubs) archived on Carnegie’s OCI webpage.

We look forward to introducing the OCI 2.0 and its many energy and climate applications to you.

Reference

Koomey, Jonathan, Deborah Gordon, Adam Brandt, and Joule Bergerson. 2016. Getting Smart About Oil in a Warming World. Washington, DC: Carnegie Endowment for International Peace.  October 5. [http://carnegieendowment.org/2016/10/04/getting-smart-about-oil-in-warming-world-pub-64784]

My online class, Modernizing Enterprise Data Centers for Fun and Profit, starts again next Monday (September 26th)

CERN data center
Photo credit: By Hugovanmeijeren (Own work) [GFDL or CC-BY-SA-3.0-2.5-2.0-1.0], via Wikimedia Commons

I’ve been struggling for years to convince executives in large enterprises to fix the incentive, reporting, and other structural problems in data centers.  The folks in the data center know that there are issues (like having separate budgets for IT and facilities), but fixing those problems is “above their pay grade”.  That’s why we’ve been studying the clever things eBay has done to change its organization to take maximal advantage of IT, as summarized in this case study from 2013:

Schuetz, Nicole, Anna Kovaleva, and Jonathan Koomey. 2013. eBay: A Case Study of Organizational Change Underlying Technical Infrastructure Optimization. Stanford, CA: Steyer-Taylor Center for Energy Policy and Finance, Stanford University.  September 26.

That’s also why I’ve worked with Heatspring to develop the following online course, the latest version of which starts September 26th and goes through November 6th, 2016:

Modernizing enterprise data centers for fun and profit

I wrote an article for the September 2015 issue of DCD focus with the same name, which describes the rationale for the class.

Here’s the course description:

This is a unique opportunity to spend six weeks learning from Jonathan Koomey, a Research Fellow at the Steyer-Taylor Center for Energy Policy and Finance at Stanford University, and one of the foremost international experts on data center energy use, efficiency, organization, and management.

This course provides a road map for managers, directors, and senior directors in Technology Business Management (TBM), drawing upon real-world experiences from industry-leading companies like eBay and Google. The course is designed to help transform enterprise IT into a cost-reducing profit center by mapping the costs and performance of IT in terms of business KPIs.

Executives in this course will gain access to templates and best practices used by leaders in the data center industry. You’ll use these templates to complete a Capstone Project, in which you will propose management changes for your organization to help increase business agility, reduce costs, and move its internal IT organization from being a cost center to a cost-reducing profit center.

I’m excited about this class, but we need more signups. Please spread the word!

Sign up, or find out more…

Also see the related super-short course for upper management:   Data Center Essentials for Executives:  A Beginner’s Guide

New online class:  Data center essentials for executives–a beginner’s guide

Photo Credit: University of Hertfordshire, licensed under a Creative Commons Attribution-Share Alike 3.0 unported license.

Many of you know that I’ve been teaching an online class about data center transformation for a couple of years now.  That class, now titled Modernizing Enterprise Data Centers for Fun and Profit, is targeted at Director- and Senior Director-level executives who work for VPs and C-level executives.  It goes into great detail about how to transform organizations to take full advantage of the power of information technology, and is scheduled to be given again between September 26th and November 6th, 2016.

I’ve now developed an introductory class (in collaboration with Heatspring) specifically targeted at VP- and C-level executives who want to transform their data centers into cost-reducing profit centers.  It’s called Data Center Essentials for Executives–A Beginner’s Guide.  For a modest investment of time (about 1.5 hours in total), this short course offers a high-level summary of steps every company can take to improve the business performance of its IT organization.

Students can sign up at any time and take the class whenever it is convenient. I encourage those who sign up to reach out to me via email with specific questions.

Go here to sign up, or email me for more details!

Why “deep dive” journalism is in rapid decline

Mother Jones has a terrific piece describing the economics of doing big stories like the influential muckraking piece they did on private prisons.  That story led to dramatic results:

This June, we published a big story—Shane Bauer’s account of his four-month stint as a guard in a private prison. That’s “big,” as in XXL: 35,000 words long, or 5 to 10 times the length of a typical feature, plus charts, graphs, and companion pieces, not to mention six videos and a radio documentary.

It was also big in impact. More than a million people read it, defying everything we’re told about the attention span of online audiences; tens of thousands shared it on social media. The Washington Post, CNN, and NPR’s Weekend Edition picked it up. Montel Williams went on a Twitter tear that ended with him nominating Shane for a Pulitzer Prize (though that’s not quite how it works). People got in touch to tell us about their loved ones’ time in prison or their own experience working as guards. Lawmakers and regulators reached out. (UPDATE: And on August 18, the Justice Department announced that it will no longer contract with private prisons, which currently hold thousands of federal inmates—a massive policy shift.)

In the wake of our investigation, lots of people offered thoughts similar to this, from New Yorker TV critic Emily Nussbaum:

Incidentally, that Shane Bauer Mother Jones undercover investigation is literally why journalism exists and why we have to pay for it.

That’s a great sentiment, and we agree! But it also takes us to a deeper story about journalism and today’s media landscape. It starts with this: The most important ingredient in investigative reporting is not brilliance, writing flair, or deep familiarity with the subject (though those all help). It’s something much simpler—time.

And of course, time is money!  Here’s the key takeaway:

Conservatively, our prison story cost roughly $350,000. The banner ads that appeared in it brought in $5,000, give or take.

And this is the quandary in which the media find themselves.  The world is getting more complicated, and the need for “deep dive” factual journalism is greater than ever, but the cash cow of classified ads, which funded such activities in the past, is all but gone, and the media world is under increasing financial pressure.  That’s why we probably need alternative business models for investigative media in our increasingly complex technological age.

Facts and Fiction in Energy Transitions – podcast posted today!

I had a wide-ranging conversation with Chris Nelder that he recorded for his Energy Transitions podcast series, posted today.  We covered a lot of ground, as the description reveals:

Should we tweak our markets to keep nuclear plants alive, or forget about markets and pay for them another way… and do we really need them at all to keep the grid functioning? Is nuclear power really declining because of overzealous environmentalists, or are there other reasons? Is it possible to balance a grid with a high amount of variable renewables and no traditional baseload plants? Is cost-benefit analysis the right way to approach energy transition? How much “decoupling” can we do between the economy and energy consumption, and how can we correctly measure it? Why are we so bad at forecasting energy and economic growth, and how can we do it better? How will energy transition affect the economy?

Good thing I came up for air in between topics!  I enjoyed chatting with Chris–I always learn something from him.  And I can chalk up this opportunity to Twitter, because I met him through our Twitter interactions.

To listen, go here.

2015 State of the Climate:  Hot, Hot, Hot!


The Bulletin of the American Meteorological Society just released the 2015 State of the Climate report. Download it here.

NOAA gives a nice summary, which I condense to its key points below.

1. Greenhouse gases were the highest on record.
2. Global surface temperature was the highest on record.
3. Sea surface temperature was the highest on record.
4. Global upper ocean heat content was the highest on record.
5. Global sea level rose to a new record high in 2015.
6. Tropical cyclones were well above average, overall.
7. The Arctic continued to warm; sea ice extent remained low.

If anyone still has any doubt that the earth is warming and humans are responsible, they should read this document!

Surprise!: US data center electricity use has been growing slowly for years


Lawrence Berkeley National Laboratory, in collaboration with experts at Stanford (me), Carnegie Mellon, and Northwestern, today released our latest analysis of electricity used by data centers in the US.  Surprisingly, electricity use in data centers has been roughly flat since the financial crisis, with little growth projected to 2020, even though delivery of computing services has been increasing rapidly.  As I’ve argued for years, the level of inefficiency in enterprise data center facilities leaves lots of room for improvement, and the market is finally getting that message.

Here are the first couple of paragraphs of the executive summary:

This report estimates historical data center electricity consumption back to 2000, relying on previous studies and historical shipment data, and forecasts consumption out to 2020 based on new trends and the most recent data available. Figure ES-1 provides an estimate of total U.S. data center electricity use (servers, storage, network equipment, and infrastructure) from 2000-2020. In 2014, data centers in the U.S. consumed an estimated 70 billion kWh, representing about 1.8% of total U.S. electricity consumption. Current study results show data center electricity consumption increased by about 4% from 2010-2014, a large shift from the 24% increase estimated from 2005-2010 and the nearly 90% increase estimated from 2000-2005. Energy use is expected to continue slightly increasing in the near future, increasing 4% from 2014-2020, the same rate as the past five years. Based on current trend estimates, U.S. data centers are projected to consume approximately 73 billion kWh in 2020.

Many factors contribute to the overall energy trends found in this report, though the most conspicuous change may be the reduced growth in the number of servers operating in data centers. While shipments of new servers into data centers continue to grow every year, the growth rate has diminished over the past 15 years. From 2000-2005, server shipments increased by 15% each year resulting in a near doubling of servers operating in data centers. From 2005-2010, the annual shipment increase fell to 5%, partially driven by a conspicuous drop in 2009 shipments (most likely from the economic recession), as well as from the emergence of server virtualization across that 5-year period. The annual growth in server shipments further dropped after 2010 to 3% and that growth rate is now expected to continue through 2020. This 3% annual growth rate coincides with the rise in very large “hyperscale” data centers and an increased popularity of moving previously localized data center activity to colocation or cloud facilities. In fact, nearly all server shipment growth since 2010 occurred in servers destined for large hyperscale data centers, where servers are often configured for maximum productivity and operated at high utilization rates, resulting in fewer servers needed in the hyperscale data centers than would be required to provide the same services in traditional, smaller, data centers.
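As a rough back-of-the-envelope check (my own sketch, not part of the report), the cumulative growth figures quoted above translate into compound annual growth rates as follows:

```python
# Rough sketch (not from the report): converting the cumulative growth
# figures quoted above into compound annual growth rates.

def cagr(total_growth: float, years: int) -> float:
    """Compound annual growth rate implied by total growth over a period."""
    return (1 + total_growth) ** (1 / years) - 1

periods = {
    "2000-2005": (0.90, 5),  # ~90% total increase
    "2005-2010": (0.24, 5),  # ~24% total increase
    "2010-2014": (0.04, 4),  # ~4% total increase
    "2014-2020": (0.04, 6),  # ~4% projected increase
}

for label, (growth, years) in periods.items():
    print(f"{label}: {100 * cagr(growth, years):.1f}% per year")
```

The annualized rates fall from roughly 14% per year in 2000-2005 to about 4% per year in 2005-2010 and to about 1% per year after 2010, which is what “roughly flat” looks like in compound terms.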

Here’s the full reference:

Shehabi, Arman, Sarah Smith, Dale A. Sartor, Richard E. Brown, Magnus Herrlin, Jonathan G. Koomey, Eric R. Masanet, Nathaniel Horner, Inês Lima Azevedo, and William Lintner. 2016. United States Data Center Energy Usage Report. Berkeley, CA: Lawrence Berkeley National Laboratory. LBNL-1005775.  June 27. [http://eta.lbl.gov/publications/united-states-data-center-energy-usag]

Is stabilizing the climate impossible?

Your climate thought for today, via Goodreads:

“Impossible is just a big word thrown around by small men who find it easier to live in the world they’ve been given than to explore the power they have to change it. Impossible is not a fact. It’s an opinion. Impossible is not a declaration. It’s a dare. Impossible is potential. Impossible is temporary. Impossible is nothing.”

― Muhammad Ali

My webinar for DOE + EPA today: Why we can’t accurately forecast the future


Graph of energy and electricity/GDP indices taken from Hirsh and Koomey 2015 (subscription required).  The graph illustrates structural changes in the relationship between energy and economic activity that have confounded modelers in the past and will no doubt do so in the future.

Today I gave a webinar for EPA and DOE staff titled “Past performance is no guide to future returns:  Why we can’t accurately forecast the future”.  I first gave a version of this talk at the Energy and Resources Group at UC Berkeley on September 28, 2011.  It built upon our Climatic Change article (with Irene Scher) titled “Is accurate forecasting of economic systems possible?” (subscription required), and grew into Chapter 4 of Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs, titled “Why we can’t accurately forecast the future” and my open access ERL article titled “Moving beyond benefit-cost analysis for climate change”.

Here’s the talk description:

This webinar explores why (with few exceptions) models of economic systems do not yield accurate predictions about the future. Predictions can be accurate when systems have consistent structure (geographically and temporally) and when there are no surprises, but neither of these conditions holds for virtually all economic systems. Physical systems can exhibit structural constancy, so predictions based on physical sciences can be accurate (barring surprises). The webinar also explores implications of this irreducible uncertainty, introduces ways to cope with it, and discusses responsible use of economic modeling tools in the face of such modeling limitations. The talk explores these issues using examples of forecasts of US primary energy use, oil prices, electricity demand, and the costs of nuclear power.

You can download the slides here.

Addendum, June 1, 2016: Some of the participants in the workshop wanted to understand why economic modelers have a hard time accepting the thesis of my talk.  I pointed them to a revealing November 30, 2014 NYT blog post by Paul Krugman about the sociology of economics.  The Krugman post refers to a study that will be fascinating for anyone interested in how the economics community operates.


Koomey researches, writes, and lectures about climate solutions, critical thinking skills, and the environmental effects of information technology.

Partial Client List

  • AMD
  • Dupont
  • eBay
  • Global Business Network
  • Hewlett Packard
  • IBM
  • Intel
  • Microsoft
  • Procter & Gamble
  • Rocky Mountain Institute
  • Samsung
  • Sony
  • Sun Microsystems
  • The Uptime Institute