Maverick NYC mechanical systems designer Henry Gifford has long been a critic of LEED, arguing that it encourages the wrong things, and doesn't go far enough to ensure that certified buildings really save energy or provide good air quality. I have great respect for Gifford and the work he does to design and commission low-energy buildings with great ventilation on very tight budgets. Unlike too many practicing engineers, he knows exactly how much energy his buildings are using. Gifford is also a thorn in the side of many policymakers, because he has little patience for initiatives and programs that don't live up to his ideals.
Recently he's been distributing a paper attacking a study of actual energy use in LEED buildings. The study in Gifford's sights is from New Buildings Institute and USGBC, Energy Performance of LEED for New Construction Buildings. It analyzed actual energy usage in buildings that were certified based on predicted energy use.
The study compared actual to predicted energy use, and compared both to national average energy use in existing buildings as reported in the U.S. Department of Energy's Commercial Buildings Energy Consumption Survey (CBECS). USGBC and NBI reported on many interesting findings from that study, some of which were summarized in the December 2007 issue of EBN.
[Graphic from the NBI study]
Gifford's paper is especially critical of the primary finding that LEED buildings were shown to be, on average, 25% to 30% more efficient than the national average. He provides an alternate analysis of the data that concludes that the LEED buildings are, on average, 29% less efficient than average U.S. buildings.
The differences between Gifford's analysis and those of USGBC and NBI are based on two areas of disagreement:
1) First, the LEED buildings are compared to the CBECS data set of all existing buildings, regardless of year of construction. Gifford argues that they should have been compared only to new buildings. The 2006 CBECS summary shows that buildings built between 2000 and 2003 use, on average, about 10% less energy than the complete data set for all existing buildings.
NBI's Mark Frankel disagrees, noting that some of the LEED buildings are actually renovations of older buildings, so it may not be fair to compare them to new buildings. Further, he notes that CBECS generally groups its buildings by decade, and those three years don't represent enough of a trend to rely on. Historically, he points out, when CBECS published data for just a few years it looked better, only to worsen when the full decade's data were compiled. And the trend for full decades or more since 1920 shows that new buildings use just as much energy as old ones.
2) Gifford's second adjustment is to use the mean of the LEED data set instead of the median used by NBI. (The LEED mean was not published, but NBI provided it to Gifford upon his request.) Depending on who you choose to believe, NBI used the median because it made the LEED data look better (Gifford's contention), or because it was statistically the more meaningful approach (more on this below).
Interestingly, the distinction between mean and median isn't all that significant if you omit the "high energy use" building types (labs and data centers, primarily) that constitute 13% of the LEED data set. Omitting these makes some sense, because the CBECS data has a negligible number of such high energy using buildings. But if you include those buildings, the difference between mean and median is huge:
[Table: comparison of mean and median energy use without the high-energy building types]
The CBECS numbers are means, so, Gifford argues, the LEED data should be analyzed based on means. (Actually, the CBECS numbers are averaged on a per square foot basis, meaning that larger buildings count for more. The LEED means are simple averages.)
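That distinction matters more than it might seem. A minimal sketch, with made-up floor areas and energy intensities, of how a floor-area-weighted average (the CBECS approach) can diverge from a simple per-building average (the LEED means):

```python
# Hypothetical buildings: (floor area in sq ft, energy-use intensity in kBtu/sq ft).
# The numbers are invented purely to illustrate the weighting effect.
buildings = [
    (50_000, 120.0),  # one large, energy-intensive building
    (10_000, 60.0),   # two small, efficient buildings
    (10_000, 60.0),
]

# Simple average of per-building intensities (how the LEED means were computed)
simple_mean = sum(eui for _, eui in buildings) / len(buildings)

# Floor-area-weighted average (how CBECS aggregates), where big buildings count more
weighted_mean = sum(area * eui for area, eui in buildings) / sum(
    area for area, _ in buildings
)

print(f"simple mean:   {simple_mean:.1f} kBtu/sq ft")    # 80.0
print(f"weighted mean: {weighted_mean:.1f} kBtu/sq ft")  # 102.9
```

With one large inefficient building and two small efficient ones, the weighted figure lands well above the simple average, so the two methods are not directly comparable.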
By including all buildings in the LEED data set, comparing based on mean instead of median, and comparing them to the CBECS 2000-2003 mean, Gifford shows that the LEED buildings' energy use exceeds the CBECS baseline by 29% (105 divided by 81.6). On the other hand, median is often "a better indication of central tendency" than mean when the data is skewed (which the LEED data is). That's the same reason the authors give in their report for making that choice.
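The effect of skew is easy to demonstrate. A small sketch with hypothetical energy-use intensities, showing how a couple of high-energy buildings (a lab and a data center, say) drag the mean far above the median:

```python
import statistics

# Made-up EUIs (kBtu/sq ft) for ordinary buildings in a data set
typical = [45, 55, 60, 65, 70, 75, 80, 85, 90]
# A couple of high-energy outliers, like labs or data centers
high_energy = [400, 550]

# Without the outliers, mean and median nearly agree
print(statistics.mean(typical), statistics.median(typical))

# With them, the mean jumps dramatically while the median barely moves
skewed = typical + high_energy
print(statistics.mean(skewed), statistics.median(skewed))
```

This is the sense in which the median resists distortion from a handful of extreme values, which is the standard argument for preferring it with skewed data.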
Also, the NBI study was peer reviewed by researchers from EPA, Pacific Northwest National Lab, and UC Berkeley, and none of them objected to this comparison. USGBC claims that other researchers who have since done further analysis using the data corroborate their approach as well. The NBI study used the median value rather than the mean, and compared it to the CBECS average for all existing buildings, to show that the LEED buildings use 24% less energy (69 divided by 91). I think that they could just as easily have used the mean excluding the high-energy buildings (68) and gotten nearly the same result.
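Using the figures quoted above, the two headline percentages work out as follows:

```python
# EUIs in kBtu/sq ft, as reported in the article
leed_mean, leed_median = 105, 69
cbecs_all, cbecs_2000_03 = 91, 81.6

# Gifford's framing: LEED mean (all buildings) vs. the CBECS 2000-2003 mean
gifford = (leed_mean / cbecs_2000_03 - 1) * 100
print(f"Gifford: {gifford:+.0f}%")  # about +29%

# NBI's framing: LEED median vs. the CBECS all-buildings average
nbi = (leed_median / cbecs_all - 1) * 100
print(f"NBI: {nbi:+.0f}%")          # about -24%
```

Same underlying data set; the choice of statistic and baseline flips the conclusion from 29% worse to 24% better.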
They did go much further, comparing building types in the LEED set with comparable buildings in the CBECS set, and found that the LEED buildings outperformed the CBECS buildings in every category except labs. (There is no category for labs in CBECS, but by any measure LEED labs aren't performing very well.) In the case of offices, the most common building type in both data sets, the median LEED buildings use 33% less than the CBECS average. Even without the labs and data centers the LEED buildings may be unfairly handicapped, because CBECS includes a lot of warehouses and vacant buildings, which use relatively little energy. But NBI chose not to adjust for that difference.
Gifford raises some other questions about the study, most notably the suggestion that the buildings for which actual data was provided likely performed better than those whose owners couldn't, or chose not to, provide data. Given that 552 projects were contacted but data from only 121 was included, this skepticism appears justified.
Frankel responds that at least some of those who supplied data had no idea how good or bad it was. (In one extreme case he contacted the owner right away to alert them to an energy hemorrhage.) He also notes that half of the 552 wanted to provide data, but some were rejected for various technical reasons, such as not having a full 12 months of data, or being located outside the U.S. Finally, they used statistical methods to test for this bias, but that's going over my head again.
In the end, I'm not entirely convinced on this one. Self-selection may have skewed the LEED results, at least a little. NBI's own responses to Gifford's challenges are posted here. Gifford doesn't raise the problem of first-year weirdness, although he does mention later in the paper that actual data should only be collected from year two of occupancy and beyond.
First-year data is often abnormally high, because systems haven't been fine-tuned. But it can also be low, if the building wasn't fully occupied for the entire year. I don't know how many of the 121 buildings in the study provided year-one data.
After attacking the NBI study on some good and some not-so-good grounds, Gifford gets back to addressing the core problem of predicted versus actual energy performance. On this front, he suggests that LEED plaques should be removable, and that someone should actually remove them if a building fails to live up to its promised performance.
That idea came up at early LEED meetings I attended, but was eventually abandoned as impractical. Gifford has an intriguing fall-back suggestion — rather than reward points based on predicted energy use, he suggests that mechanical system peak capacity would be a better indicator of performance. He doesn't propose how the baseline for that metric should be determined, however.
It's too bad that Gifford concentrated so much on attacking the study, because it's a distraction from the more important points he makes about how LEED is being misused. The good news is that LEED insiders share many of those same concerns, and are working on them. Everyone agrees that it's the actual performance, not the prediction, that really matters, and that more has to be done to improve that actual performance.