Rapid Building Energy Modeler: The Key to Fast Energy Efficiency

 Posted by Allan on April 21st, 2014

A multi-billion-dollar market exists for reducing the energy use of existing buildings, provided researchers can substantially reduce the cost and time required to assess building energy performance, recommend efficiency measures, and identify problems in building operations.

This is the goal of RAPMOD, the Rapid Building Energy Modeler, a collaborative project involving the University of California, Berkeley; Lawrence Berkeley National Laboratory (Berkeley Lab); and the engineering firm Baumann Consulting. RAPMOD, funded by an innovation grant from the Advanced Research Projects Agency-Energy (ARPA-E), is designed to tackle this problem head on.

The technology, worn as a backpack, is designed to scan a building’s interior, using several types of sensors, as its wearer walks through the building. RAPMOD generates a visual map of the building that can be input into energy simulation models and used to develop an understanding of the building’s energy performance, leading to a list of recommendations for improving its efficiency.

RAPMOD is based on technology developed by Avideh Zakhor, Qualcomm Professor of Electrical Engineering in UC Berkeley’s Department of Electrical Engineering and Computer Sciences, and her students. Zakhor has been developing technology to produce indoor three-dimensional models since 2007 under the sponsorship of the Army Research Office (ARO) and the Air Force Office of Scientific Research (AFOSR). She developed the first fully automated fast outdoor mapping system in 2005, which was licensed by Google in 2007 to help produce its 3D Google Earth product.

Since then, Zakhor’s research group has been advancing the technology for use in indoor 3D mapping. In 2012, they teamed with a group of researchers led by Philip Haves, Leader of the Simulation Research Group in the Environmental Energy Technologies Division of Berkeley Lab, and with the engineering firm Baumann Consulting, to adapt the technology to generate energy models of buildings quickly and inexpensively.

“It’s possible to reduce the energy consumption of existing buildings significantly,” says Haves, “through retrofitting – replacing old equipment with more energy-efficient technology – and through ‘retro-commissioning,’ the process of improving the routine operation of buildings by making equipment function properly.”

Prior research suggests there is a potential to reduce whole-building energy consumption in the U.S. by 16 percent through retro-commissioning, which uses ‘low-cost or no-cost’ measures. This corresponds to an energy-savings potential of $30 billion by the year 2030 and annual greenhouse gas emissions reductions of about 340 million tons of CO2 per year. Retrofitting has significant cost but can result in energy savings of 20 to 50 percent. Energy modeling is required for the detailed analysis needed to achieve deep savings cost-effectively and is also helpful in maximizing and verifying the savings from retro-commissioning.

“The problem,” says Haves, “is that retrofit projects often ‘cream-skim’, saving about 10 percent while ignoring potential for deeper savings. We aim to reduce the cost, and improve the accuracy, of energy modeling to reduce the cost of identifying retrofit measures that will produce deep savings.”

Creating building energy models is expensive, time-consuming, and skill-intensive. Many existing buildings have incomplete, outdated, or no design documentation, requiring specialists to go into the building and laboriously take the measurements needed to create the model in software. The primary goal of the RAPMOD project is to reduce the cost of preparing an energy model for use in retrofit analysis and in model-based retro-commissioning. The same model can also be used in performance monitoring during routine operation to detect equipment faults and other operational problems.

There are also non-energy applications of the technology. It could be used to create maps of building interiors for emergency first responders, and Architecture, Engineering and Construction (AEC) companies could use the system to generate maps of the interior building structure and services (such as HVAC ducts, and gas, power, and water lines) during construction. Such maps would help building managers keep their buildings in good repair and running well during the building’s life. Game designers and the real estate industry could also make use of interior mapping in their work.

Realizing that Zakhor’s 3D modeling technology offered a faster way of gathering the data needed for these models, Haves invited Zakhor to explore a collaboration between her lab and Berkeley Lab’s Simulation Research Group. Baumann Consulting was brought in to advise on industry practices and costs and to conduct testing and demonstration. A prototype version of the RAPMOD system was shown at ARPA-E’s Technology Innovation Conference in late February 2014 and, a day later, demonstrated for members of Congress at a showing on Capitol Hill.

 

How It Works

RAPMOD is fitted with several different sensors, including a LiDAR, which measures distances to building surfaces using a laser; a visible-light camera; and an infrared sensor. The camera and LiDAR generate a photorealistic three-dimensional model of the building’s interior as the user walks through hallways, into rooms, and up and down staircases.
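The geometric core of that mapping step can be shown in a simplified sketch: each LiDAR beam returns a range at a known bearing, and combining that with the wearer’s estimated position and heading yields points in building coordinates. This toy planar version assumes the pose is already known; in the real system, estimating the wearer’s pose while mapping is the hard part, and the function below is purely illustrative.

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, pose):
    """Convert one planar LiDAR scan (range per beam) into world-frame
    (x, y) points, given the scanner's pose (x, y, heading in radians)."""
    px, py, heading = pose
    points = []
    for i, r in enumerate(ranges):
        if r is None or math.isinf(r):
            continue  # this beam produced no return
        theta = heading + angle_min + i * angle_increment
        points.append((px + r * math.cos(theta), py + r * math.sin(theta)))
    return points

# A scanner at the origin facing +x, with beams at 0 and 90 degrees:
pts = scan_to_points([2.0, 3.0], 0.0, math.pi / 2, (0.0, 0.0, 0.0))
```

Stacking many such scans, registered against the evolving pose estimate, is what builds up the point cloud that the photorealistic model is textured onto.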

The infrared sensor measures the thermal properties of windows and detects thermal defects, such as gaps in wall insulation or moisture leaks. It also measures the heat emitted by lighting systems, other equipment, and building occupants, giving the model the information required to calculate the energy needed to heat and cool the building.

A major advantage of RAPMOD is that it doesn’t need to be operated by high-cost energy experts. Technicians will be able to do the building walkthrough, and the measured data will upload automatically for processing and import into the energy modeling software. All this substantially drives down the cost of producing the model.

One of the major tasks in the research has been to integrate the infrared sensor into the equipment, and to determine how much it can reveal about the thermal characteristics of the building materials—the insulation in the walls, and the windows’ U-values, a measure of the rate at which they conduct heat (a lower U-value means the window retains heat better).
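The payoff of knowing a U-value is that steady-state conduction losses follow directly from it: heat flow equals U-value times area times the indoor–outdoor temperature difference. A minimal sketch, with illustrative (not measured) U-values:

```python
def conductive_heat_loss(u_value, area_m2, t_inside, t_outside):
    """Steady-state conductive heat flow through a surface, in watts.
    U-value is in W/(m^2*K); lower U-values mean less heat escapes."""
    return u_value * area_m2 * (t_inside - t_outside)

# Single-pane glass (U ~ 5.8) vs. a good double-pane unit (U ~ 1.8),
# for 10 m^2 of window at 21 C inside and 1 C outside:
single = conductive_heat_loss(5.8, 10, 21, 1)   # 1160 W
double = conductive_heat_loss(1.8, 10, 21, 1)   # 360 W
```

An energy model multiplies terms like these by hours of weather data, which is why an accurate in-situ U-value estimate matters so much to the simulation.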

A first version of the RAPMOD system that maps building geometry is expected to be ready for field testing in the summer of 2014. A version that measures window properties and characterizes internal heat gains is expected to be ready for field testing and demonstration by the end of 2014.

The research team is now seeking partners among architecture, engineering and construction firms, consulting engineering firms, facility managers, energy service companies, and others to help with testing and demonstrating the technology in existing facilities.

For more information, contact: Philip Haves, PHaves@lbl.gov

 

Berkeley Lab Launches Building Energy Performance Research Project at New FLEXLAB Testing Facility

 Posted by Allan on April 14th, 2014

April 14—FLEXLAB™, the Facility for Low Energy eXperiments in Buildings, run by Lawrence Berkeley National Laboratory’s Environmental Energy Technologies Division (EETD), has partnered with construction firm Webcor to test building energy performance. The testing will allow Webcor’s engineers to predict and improve the energy performance of a new building being constructed for the biotech company Genentech, a member of the Roche Group. A building mockup for Genentech will be studied at different orientations, specific to the actual construction site.

The research will take place in FLEXLAB’s rotating testbed, a unit that rotates 270 degrees to allow the study of how building energy use and environmental parameters change in a variety of orientations relative to the sun. FLEXLAB’s newly completed outdoor facility comprises four testbeds of two cells each, which can be outfitted in almost any combination of building envelope materials, windows and shade structures, lights, heating and cooling equipment, and controls to test new technology in real conditions. Two other testbeds, one for lighting and controls testing and one for collaborative building design, have opened in an existing building at Berkeley Lab.

Berkeley Lab and Webcor will build a section of the building in the rotating testbed, collect data, and use these measurements to develop an accurate energy simulation model of the building.  This practice allows the design team to better understand and predict the actual performance of the building, both for energy and comfort. The lab is the only facility in the U.S. that provides side-by-side, outdoor testing of fully integrated building systems (envelope, lighting, HVAC) in a fully reconfigurable space.

The U.S. Department of Energy’s David Danielson, Assistant Secretary for Energy Efficiency and Renewable Energy, was on hand in Berkeley to tour the facility, meet with Webcor executives, Lab Director Paul Alivisatos and Berkeley Lab researchers, and view the start of the installation.

“The Energy Department’s FLEXLAB is an exciting contribution that will help industry test comfortable, low-energy-use building technologies,” said Danielson. “By advancing technologies that reduce our energy use in built environments, this project also brings us closer to meeting the ambitious goals of the President’s Climate Action Plan and keeps America on the path to a clean energy future.”

“We have to agree with other experts that this building, this facility, could be the most important building in the country. The DOE, Berkeley Lab and its team have handed us a uniquely powerful tool and now it is the architecture and engineering community’s opportunity to put it to use,” says Phil Williams, Webcor’s Vice President for Building Systems and Sustainability.

Webcor will ensure that the building operates according to specification when construction is finished. Operations and maintenance staff and occupants will provide input early in the design on how the integrated systems will work in operation, as well as on functionality and comfort. This allows the team to identify potential performance and cost issues early and address them, lowering the risk to delivered performance and the potential for costly change orders during construction.

“FLEXLAB will bring industry, DOE national lab scientists, manufacturers and investors together all working hand in hand on cutting-edge energy efficiency technologies and solutions,” says Berkeley Lab Director Paul Alivisatos. “These partners can use the results from their FLEXLAB demonstrations to help encourage design, construction and operation of high-performance buildings.”

Mocking up and testing advanced designs will allow the buildings industry to build a case for new advanced technology, and scale up the opportunities for construction and operations cost reduction. Ultimately, this benefits consumers, by leading to better designed and operated, more comfortable buildings with lower energy costs.

For more information about FLEXLAB, see http://flexlab.lbl.gov/.

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.

 

Intergovernmental Panel on Climate Change’s Working Group III, on Mitigation of Climate Change, Issues Its Report Summary

 Posted by Allan on April 14th, 2014

The Intergovernmental Panel on Climate Change’s Working Group III, addressing the mitigation of climate change, has issued its Fifth Assessment report’s executive summary. The report updates policymakers on the technical and socio-economic aspects of climate change, including technologies and policies that can reduce impacts.

According to its website, “The IPCC Working Group III assesses all relevant options for mitigating climate change through limiting or preventing greenhouse gas emissions and enhancing activities that remove them from the atmosphere.”

A number of researchers in the Environmental Energy Technologies Division (EETD) of Lawrence Berkeley National Laboratory (Berkeley Lab) have lent their expertise to the writing, editing and review of this report:

Lead Authors:
Ryan Wiser, Chapter 7, Energy Systems
Ashok Gadgil, Chapter 9, Buildings
Lynn Price, Chapter 10, Industry

Contributing Authors:
Chapter 7, Energy Systems: Naim Darghouth, Ben Hoen, Peter Larsen, Andrew Mills.

Chapter 10, Industry: Stephane de la Rue du Can, Ali Hasanbeigi.

Review Editor:
Jayant Sathaye, Chapter 4, Sustainable Development and Equity

Department of Energy Reviewer

Tim Xu, Chapter 9, Buildings, U.S. government lead reviewer (Xu reviewed the document for the Department of Energy, not as an IPCC reviewer)

James McMahon, former head of EETD’s Energy Analysis Department, now an independent consultant, also participated as a Lead Author in Chapter 12, Human Settlements, Infrastructure and Spatial Planning.

Download the WGIII Fifth Assessment report here.

Lynn Price LKPrice@lbl.gov 510-486-6519

Jayant Sathaye JASathaye@lbl.gov 510-486-6294

Ryan Wiser RHWiser@lbl.gov 510-486-5474

EETD’s Vehicle-to-Grid Simulator Will Help Plug-In Electric Vehicles Become an Electric Grid Resource

 Posted by Allan on April 10th, 2014

Plug-in electric vehicles (PEVs) are here, and more are coming. In mid-2013, a buyer took delivery of the 100,000th PEV sold in the United States, and the number is growing. One study forecasts that more than a million plug-in hybrid electric vehicles (PHEVs) will be sold in California, New York, Washington, and Florida alone between 2013 and 2022. Electric vehicles (EVs) are also growing in range and sales. As the cost of battery packs comes down, the number of car shoppers willing to consider buying EVs will go up.

The growth of the PEV fleet means that an unplanned but potentially valuable energy storage resource is also growing—the battery packs of these vehicles. When PEVs are plugged in, they represent an opportunity to better manage the electricity grid. For instance, PEVs can be used to avoid potential shortages of electricity during peak times, provide extra storage capacity when the grid is generating more than it needs to satisfy demand, and encourage the growth of renewable energy by providing a buffer to balance out the intermittency of wind and solar generation.
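One of these grid services, shifting charging into off-peak hours, can be illustrated with a toy “valley-filling” scheduler that repeatedly places small slices of charging energy into whichever hour currently has the lowest combined load. The function and its parameters are hypothetical illustrations, not part of any Berkeley Lab tool, and it assumes the requested energy fits within the charger limits.

```python
def valley_fill(base_load, ev_energy_kwh, charger_kw):
    """Schedule EV charging (hour resolution) into the hours where the
    baseline grid load is lowest, respecting the charger's power limit.
    Returns the charging energy placed in each hour."""
    charge = [0.0] * len(base_load)
    remaining = ev_energy_kwh
    step = 0.5  # size of each energy slice, in kWh
    while remaining > 1e-9:
        # Pick the hour with the lowest combined load that still has headroom.
        h = min(range(len(base_load)),
                key=lambda i: base_load[i] + charge[i]
                if charge[i] + step <= charger_kw else float("inf"))
        amount = min(step, remaining)
        charge[h] += amount
        remaining -= amount
    return charge

# Baseline load of 5, 3, 3, 5 kW across four hours; 4 kWh of EV charging
# ends up split across the two low-load hours:
schedule = valley_fill([5, 3, 3, 5], ev_energy_kwh=4.0, charger_kw=3.0)
```

The same greedy idea, run in reverse (discharging into the highest-load hours), sketches how a PEV fleet could shave peaks rather than fill valleys.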

Providing these vehicle-to-grid (V2G) services can also help the automotive market. Revenue from using a PEV to provide energy services to the grid could offset some of the vehicle’s higher purchase price. The resulting incentive to buy PEVs replaces high-emissions vehicles with low- or zero-emissions ones, shifting air pollution away from population centers, helping meet increasingly stringent emissions regulations, and assisting car manufacturers in meeting future CAFE (corporate average fuel economy) standards.

The electricity grid must precisely match supply to demand from second to second, drawing on a variety of sources ranging from baseload and peaking power plants to intermittent wind and solar power. Making PEVs part of that balancing act therefore requires proven technologies that can predict the grid availability of a collection of independently operated vehicles.

How can electricity grid managers, government authorities, power markets, entrepreneurs, and other stakeholders harness the resource offered by the growing fleet of PEVs? A project at the Environmental Energy Technologies Division of Lawrence Berkeley National Laboratory (Berkeley Lab) may help to answer that question.

“Many are saying that energy storage is important to the electricity grid, because it can be a buffer for the grid by providing power when it needs to smooth out sudden increases in demand or shortfalls in supply, or by storing power when an excess is available on the grid. Energy storage is high-value if it is able to respond quickly. The battery packs in electric vehicles are potentially very quick, compared to conventional sources today,” says Samveg Saxena, a researcher in EETD’s Grid Integration Group. When plugged into the grid, these vehicles could be a significant resource.

But many uncertainties make using PEVs difficult. At any given time, some PEVs are parked and charging from different states of depletion while others are in use on the roads, so the capacity of the PEV fleet is always changing. There are also open questions about how storing and sending power to and from the electricity grid affects battery degradation and lifetime, and about whether enough PEVs can be tapped exactly when they are needed to meet grid demand.

Beyond this, says Saxena, “automotive and electricity utility stakeholders have not historically had to deal with these challenges together. Automotive battery manufacturers have to make sure that these grid services won’t degrade the batteries. Electric grid operators have to make sure that PEVs can function effectively as a grid resource.”

To study these issues, Saxena, EETD researcher Jason MacDonald and UC Berkeley/EETD Professor Scott Moura have been developing a simulation platform called the Vehicle-to-Grid Simulator, or V2G-Sim.

“V2G-Sim’s purpose is to be a simulation platform that couples sub-modules that address these concerns in a systematic way. It will help us understand the challenges of vehicle-to-grid services as well as provide a platform for thinking through solutions, and simulating the effect of those solutions on the grid quantitatively,” he explains.

The team’s goal for V2G-Sim is to provide a platform for electric grid system operators, utilities, policy-makers, battery and PEV manufacturers, researchers, and the business community. Each stakeholder can evaluate, from its own perspective, the use of PEVs for energy services to the electric grid (one example is what the utility community calls ancillary services).

V2G-Sim models the usage of individual vehicles, including second-by-second energy use while driving or charging, and aggregates large numbers of simulated vehicles to produce grid-scale predictions of impacts and opportunities from vehicle-grid integration. The results are time-based models of vehicle behaviors as well as a spatial simulation of their location. Using the National Household Travel Survey, the development team has created profiles of vehicles approximating real-life situations. For example, it could emulate a car that charges overnight, leaves for work at 7:30 a.m., parks, runs an errand at lunchtime, and then drives home at 5:30 p.m. and plugs in to recharge. This is one of many scenarios modeled with statistical variations derived from real-world commuting data.
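A drastically simplified, hour-resolution version of such a vehicle profile might look like the sketch below. The function and its numbers are illustrative assumptions, not V2G-Sim code; the real platform works at one-second resolution with detailed powertrain models.

```python
def simulate_day(itinerary, battery_kwh, start_soc=1.0):
    """Track a PEV's state of charge (SOC, 0..1) through a day's itinerary.
    Each entry is (hours, mode, rate): mode 'drive' consumes rate kWh/h,
    'charge' delivers rate kW, and 'parked' leaves the SOC unchanged."""
    soc = start_soc
    trace = [soc]
    for hours, mode, rate in itinerary:
        if mode == "drive":
            soc -= hours * rate / battery_kwh
        elif mode == "charge":
            soc = min(1.0, soc + hours * rate / battery_kwh)
        trace.append(round(soc, 4))
    return trace

# A commute profile like the one described above: drive to work,
# park all day, drive home, then plug in for the evening.
day = simulate_day(
    [(0.5, "drive", 8.0),    # 30-minute morning commute at 8 kWh/h
     (8.5, "parked", 0.0),
     (0.5, "drive", 8.0),
     (6.0, "charge", 3.3)],  # level-2 charging at 3.3 kW
    battery_kwh=24.0)
```

Summing thousands of such traces, each perturbed with statistical variation, is what turns individual itineraries into a fleet-level picture of grid availability.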

The preliminary version of V2G-Sim that the EETD team has created incorporates modules that address different aspects of the problem. Powertrain modules calculate the vehicles’ states of charge and energy use second-by-second. Battery electrochemistry modules calculate the electricity inputs and outputs and changes to their internal chemistry. Battery degradation models integrated into V2G-Sim estimate the impact of battery use on its life—how many years it can last when being used for driving only versus driving plus grid services.
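As a rough illustration of why the degradation question matters, even a toy throughput-based fade model shows how added grid-services cycling shortens pack life. All parameters here are illustrative placeholders, not measured chemistry values or V2G-Sim results.

```python
def years_to_end_of_life(daily_throughput_kwh,
                         fade_per_kwh=0.0000055, eol_fraction=0.8):
    """Very rough capacity-fade model: each kWh cycled through the pack
    removes a fixed fraction of capacity, and the pack reaches end of
    life at 80% of its original capacity. The fade rate is a placeholder."""
    allowed_fade = 1.0 - eol_fraction            # tolerate 20% capacity loss
    lifetime_kwh = allowed_fade / fade_per_kwh   # total allowed throughput
    return lifetime_kwh / (daily_throughput_kwh * 365)

# Driving only (~6 kWh/day cycled) vs. driving plus grid services (~14 kWh/day):
drive_only = years_to_end_of_life(6.0)
with_v2g = years_to_end_of_life(14.0)
```

Real degradation depends on temperature, depth of discharge, and current rates as well as throughput, which is exactly why V2G-Sim couples electrochemistry-based degradation models rather than a linear rule like this one.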

Figure 1 shows the results of a test case with V2G-Sim—the grid demand for electricity from 1,000 PEVs, second-by-second over 24 hours starting at midnight. Figure 2, from the same test, shows the activity profiles of 12 individual vehicles—their states of charge as a function of time. Sometimes they are plugged in and charging; other times, they are unavailable. Figure 3 shows an example of the spatial resolution from V2G-Sim. In this example, spatial charging is resolved for 659 PEVs at home and work locations in the San Francisco Bay Area, but V2G-Sim enables much finer spatial resolution of vehicle charging, for example by neighborhood.

 


Figure 1. V2G-Sim test case results—grid demand for electricity from 1,000 PEVs over 24 hours.

 


 

Figure 2. V2G-Sim activity profile for 12 vehicles.

 


 

Figure 3. Spatial charging for 659 PEVs at home and work locations in the San Francisco Bay Area.

“In the near term,” says Saxena, “our vision is to release ‘V2G-Sim Analysis’ as a research tool to improve the cross-disciplinary understanding of how V2G services could perform, the impact that vehicle-grid integration will have on individual vehicles and on the grid, and how grid infrastructure can be planned for more PEVs.” V2G-Sim Analysis will provide a valuable research tool for many parties to quantitatively understand the challenges of vehicle-grid integration, including grid operators, utilities, policy makers, battery manufacturers and the business community. The team plans to follow up with another version of the platform, ‘V2G-Sim Operations’, to enable real-time operation of a grid that uses many PEVs as a resource. “The approach we’re taking,” he says, “is to develop a tool that will have a broad impact, and to ramp up the real-time use of PEVs to provide rapid energy response services to the grid.”

In this vision, battery manufacturers might use V2G-Sim Analysis to link to their electrochemical models of battery technologies to quantify battery degradation and devise ways of making long life-cycle batteries that effectively provide both vehicle propulsion and electric grid services. Advanced battery technology researchers at Berkeley Lab and elsewhere are already interested in using V2G-Sim in their studies of the electrochemistry of battery degradation. PEV manufacturers could link the platform to their own vehicle design platforms to adapt powertrain design for electric grid integration.

Grid managers could eventually use the Operations version of the platform to coordinate PEV resources in real-time for grid services such as smoothing the electricity supply curve from the intermittence of renewables such as wind and solar power. This, in turn, might help encourage greater use and integration of renewable power sources on the grid, because its managers have a greater ability to compensate for power fluctuations from these sources.

Utilities and business entrepreneurs could use V2G-Sim Analysis to develop and understand the impact of managed charging control algorithms for PEVs so that they can provide optimal service both as a vehicle and as an energy service to the grid.

Entrepreneurs interested in creating a business by harnessing the fleet of PEVs of a region with a large PEV stock could use V2G-Sim to make informed decisions about how much regulation capacity to bid into the grid’s markets. Information such as the composition of the PEV fleet, the state of charge and availability of energy at any time of the day or night, the locational availability of power throughout the region—where PEVs are connected to the grid—all influence these decisions.

The electricity regulatory community can use the same type of information to integrate the regulation of V2G services into the current regulatory framework of the grid.

“Our short-term goal,” says Saxena, “is to release V2G-Sim Analysis to the research community, automotive and battery manufacturers and grid stakeholders. In the long term, we want to further validate the model so that system operators and the regulatory community can begin to use PEVs in realtime as part of the dynamic electricity system.”

For more information, contact: Samveg Saxena, SSaxena@lbl.gov

This research was funded by Berkeley Lab’s Lab-Directed R&D program.

Program Administrator Cost of Saved Energy for Utility Customer-Funded Energy Efficiency Programs in the United States

 Posted by Allan on March 20th, 2014

 

As more states and utilities increasingly turn to energy efficiency programs to manage demand for electricity and natural gas, it is important to understand how much saving energy costs. By examining regulatory reports on efficiency programs in 31 states, Lawrence Berkeley National Laboratory (Berkeley Lab) researchers have determined the cost of saving energy through efficiency programs funded by utility customers in the period 2009-2011. A report published today presents those costs at the national and regional level for all sectors and the most prevalent program types.

The study was written by Megan A. Billingsley, Ian M. Hoffman, Elizabeth Stuart, Steven R. Schiller, Charles A. Goldman, and Kristina LaCommare of the Environmental Energy Technologies Division of Berkeley Lab.

Berkeley Lab researchers gathered a total of more than 4,000 program-years’ worth of costs and energy savings data, as reported by 107 program administrators on 1,700 individual efficiency programs. Administrators of those programs typically use different names for their programs, and practices vary among states for reporting program cost and savings information. The researchers developed a standard approach to handling the data and classifying the programs. The result is the most comprehensive and detailed program database reported to date, offering a rich portrait of national and regional efficiency program investments and covering more than 60 different types of efficiency programs. In the report, calculations of the cost of saved energy (CSE) are based upon gross energy savings and the costs borne by the program administrator. Cost contributions by program participants are infrequently reported by program administrators, and thus the reported results are not the “all-in” cost, known in the industry as the total resource cost.

The report provides a levelized CSE by targeted sector (residential, commercial, industrial and low-income customers) and program type, as defined by technology, action or delivery approach. For each market sector and program type, the report identifies a range of costs as reported by energy efficiency program administrators, including the median CSE value and a savings-weighted average CSE.
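A levelized cost of saved energy of this kind is conventionally computed by annualizing the program cost over the measure lifetime with a capital recovery factor and dividing by annual savings. The sketch below illustrates the arithmetic; the 6% discount rate and the example numbers are illustrative assumptions, not the report’s inputs.

```python
def levelized_cse(program_cost, annual_savings_kwh, lifetime_years,
                  discount_rate=0.06):
    """Levelized program-administrator cost of saved energy, in $/kWh.
    The capital recovery factor (CRF) spreads a present cost over n years
    at discount rate r; dividing by annual savings levelizes the cost."""
    r, n = discount_rate, lifetime_years
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
    return program_cost * crf / annual_savings_kwh

# A hypothetical $1M program saving 5 GWh/year over a 10-year measure life:
cse = levelized_cse(1_000_000, 5_000_000, 10)  # about $0.027/kWh
```

Note that this uses only the program administrator’s cost, matching the report’s scope; adding participant costs would give the higher “all-in” total resource cost.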

Among the key national and regional findings:

  • The U.S. weighted-average electricity CSE was slightly more than two cents per kilowatt-hour. While this levelized CSE is somewhat lower than values reported by other studies, it should be noted that this study contains the largest sample of program administrators to date. Furthermore, nearly 40% of the electric program administrators in the LBNL database have offered programs for less than four years and so may be early in accessing energy savings in their respective state economies or may be targeting the least costly savings opportunities first.
  • Residential electricity efficiency programs had the lowest average levelized CSE at $0.018/kWh. Lighting rebate programs accounted for at least 44% of total residential lifetime savings with a savings-weighted average levelized CSE of $0.007/kWh.
  • Commercial, industrial and agricultural efficiency programs had an average levelized CSE of $0.021/kWh.
  • Efficiency programs in the Midwest had the lowest average levelized CSE ($0.014/kWh) while programs in Northeast states had a higher average CSE value ($0.033/kWh). The average CSE for programs in the West and South were $0.023/kWh and $0.028/kWh, respectively. Note that only four states in the South are included in this report.
  • Natural gas efficiency programs had a savings-weighted average levelized CSE of $0.38 per therm, with significant differences between the commercial/industrial and residential sectors (average values of $0.17 vs. $0.56 per therm respectively).
  • Annual regulatory reporting on the costs and savings of efficiency programs is inconsistent in quality and thoroughness. Not surprisingly, program administrators in different states often use varying definitions of savings and program costs. Market sectors and program types are not characterized in a standard fashion. Many program administrators do not provide the basic data needed to calculate a levelized cost of saved energy at the program level.

The report asserts that there is a direct connection between the maturation of energy efficiency as a resource and the need for consistent, high-quality data and periodic reporting of efficiency program costs and impacts. The report urges state regulators and program administrators to consider annually reporting certain essential data at a portfolio level and more comprehensive reporting of program-level data (e.g., lifetime energy savings, participant costs). Policymakers and system planners have shown increasing interest in integrating energy efficiency as a resource, and the value of transparent and complete reporting of program metrics is a foundation for increasing their confidence in this resource.

Full text of the report, “The Program Administrator Cost of Saved Energy for Utility Customer-Funded Energy Efficiency Programs,” and a summary presentation may be downloaded here.

A webinar on the findings will be presented 2 p.m. Eastern/11 a.m. Pacific, Wednesday, April 2. Register here.

Other publications from the Electricity Markets and Policy Group can be found here.

This research was funded by the U.S. Department of Energy’s Office of Electricity Delivery and Energy Reliability.

OpenADR Standard Published as an International Publicly Available Specification

 Posted by Allan on March 18th, 2014

In February, Open Automated Demand Response (OpenADR) achieved another milestone toward becoming an international standard when the International Electrotechnical Commission (IEC), a renowned standards development organization, released a profile of OpenADR 2.0 as a Publicly Available Specification (PAS). This action recognizes OpenADR as a standard that will enable our electricity systems to be more responsive and smarter about operating under numerous economic, environmental, and security constraints. OpenADR 2.0 is already a national standard in the United States, as the result of Smart Grid standards interoperability activities coordinated by the National Institute of Standards and Technology (NIST) and the Smart Grid Interoperability Panel (SGIP).

Demand Response Research Center (DRRC) researchers at Lawrence Berkeley National Laboratory (Berkeley Lab) originally conceived of and developed the OpenADR specification in 2002 to support automated demand response and dynamic-pricing electricity programs. Since then it has been further developed by the Organization for the Advancement of Structured Information Standards (OASIS) and has become a national standard that is widely supported by Smart Grid stakeholders and vendors. It is an integral element of Smart Grid activities worldwide. The OpenADR Alliance, a nonprofit organization with more than 100 members, is now responsible for its adoption and oversees testing and certification of OpenADR 2.0 implementations.

“We’re very pleased by this action,” said Girish Ghatikar, Deputy Leader of Berkeley Lab’s Grid Integration group and Vice-Chairman of the OpenADR Alliance. “Its acceptance by the IEC demonstrates that vendors, standards organizations, and users alike recognize OpenADR’s broad usefulness in enabling electricity service providers and customers to participate in demand response transactions.”

A primary OpenADR focus has always been on empowering customers with choices to manage their energy use and save money. Regulators and policymakers are recognizing the importance of standards such as OpenADR in overcoming market adoption barriers and scaling deployment across national and global Smart Grids. Through appliance standards, building codes, and design specifications, OpenADR can enable a fleet of buildings, equipment, and appliances to participate in demand-side management programs that will deliver energy-cost savings, grid transactions, and environmental benefits. Such developments are fundamental to helping national and international market actors realize the benefits associated with the Smart Grid more swiftly and in a manner that ensures greater security, interoperability, and reduced cost to society. For example, California’s Title 24 building code now requires that standards-based messaging protocols such as OpenADR be included as a part of building energy controls. As more states and countries adopt the OpenADR standard, its use in the Smart Grid will expand, offering customers additional choices in how they use energy.

Development of OpenADR is supported by the California Energy Commission’s Public Interest Energy Research Program. The OpenADR Publicly Available Specification is IEC PAS 62746-10-1.

More information

Girish Ghatikar, (510) 486-6768, GGhatikar@lbl.gov

http://www.openadr.org/specification

EETD and CalCharge are Partners in new California Energy Commission-funded Northern California Advanced Vehicles Center

 Posted by Allan on March 12th, 2014

The California Energy Commission has funded a Northern California Alternative Transportation Fuel and Advanced Vehicle Technology Center (North CAT) to be based at the University of California, Berkeley. The Environmental Energy Technologies Division of the Lawrence Berkeley National Laboratory is a partner, as is CalCharge, the public-private partnership between Berkeley Lab and the California Clean Energy Fund working to accelerate the development, commercialization, and adoption of new energy storage technologies. The consortium will receive more than $3 million to establish the North CAT Center.

“North CAT’s goal is to provide the up-to-date information and assistance that vehicle fleet managers, city planners, policymakers, and first responders need to quickly and easily understand the costs, benefits, and implementation issues of the latest vehicle technologies,” said Tim Lipman, co-director of UC Berkeley’s Transportation Sustainability Research Center (TSRC).

Anand Gopal, a researcher in EETD’s International Energy Studies Group, will lead Berkeley Lab’s effort in the Center, joined by Samveg Saxena of EETD’s Grid Integration Group and Venkat Srinivasan, Head of EETD’s Energy Storage and Distributed Resources Department.

“Our role will be to lead the center’s efforts in technology related aspects of battery electric vehicles and their integration into the electricity grid,” says Gopal. “We’ll bring EETD’s considerable strengths in advanced energy storage technology development, vehicles-to-grid research, international transportation, and policy and market analysis to make the Center a success. We are also looking forward to expanding our activities with CalCharge in this Center.”

Read the UC Berkeley press release here: http://its.berkeley.edu/news/ITS/20140307

CalCharge: http://www.calcharge.org/

International Energy Studies Group: http://ies.lbl.gov/

Energy Analysis and Environmental Impacts Department: http://eaei.lbl.gov

Grid Integration Group: http://gig.lbl.gov/

Energy Storage and Distributed Resources Department: http://esdr.lbl.gov/

 

Annex 66 Seeks to Standardize Studies of Occupant Behavior

 Posted by Allan on March 3rd, 2014

It is well documented that prodigious amounts of energy and money have been saved by energy-efficient building technologies. California alone has saved billions of dollars, prevented the emission of tons of pollutants, and avoided building additional power plants thanks to its efficiency efforts. Still, the effectiveness of these technologies depends on building occupants not only using them, but using them properly. How much of an effect occupant behavior has on energy savings is uncertain, but most researchers agree that it is significant. More than 20 groups worldwide study building occupant behavior, but their experimental designs and modeling methodologies differ, and many studies lack detailed quantitative analyses.

In recognition of this problem, the IEA (International Energy Agency) Energy in Buildings and Communities (EBC) Programme approved the Annex 66 project at its Executive Committee Meeting in 2013. The project will establish a standard occupant behavior definition platform and a quantitative simulation methodology for modeling occupant behavior in buildings. This standardization will make studies easier to compare and give researchers a deeper understanding of the effects of occupant behavior on energy use and the indoor environment.

The Annex 66 project has five subtasks:

  A. occupant movement and presence models
  B. occupant action models in residential buildings
  C. occupant action models in commercial buildings
  D. integration of occupant behavior definition and models with current building energy modeling programs
  E. applications in building design and operations

Environmental Energy Technologies Division researcher Tianzhen Hong and Da Yan from Tsinghua University, China, are the operating agents for Annex 66. They manage the work of the various participants and supervise the production, financing, and availability of Annex 66 products. Hong is also the leader of Subtask D, and will integrate occupant behavior definitions and models with current building energy modeling programs.

At this writing, 56 organizations in 23 countries have expressed strong interest in participating in the project, which began in November 2013. The preparation phase will continue until October 2014, when the two-year working phase will begin. Project results will be reported in 2017.

“To achieve significant carbon reductions and mitigate global climate change, it’s necessary to have a deep understanding of occupant energy behavior in buildings,” says Hong. “Being able to model and quantify how behavior impacts the use of building technologies and building energy performance using scientific methods is crucial to the design and operation of low-energy buildings.”

More Information

Tianzhen Hong, 510-486-7082, thong@lbl.gov

Annex 66 website: http://www.annex66.org/

Simple and Elegant Building Energy Modeling for All—A Technology Transfer Tale

 Posted by Allan on February 20th, 2014

A building owner changes the building’s thermostat setting, allowing the indoor temperature to increase a couple of degrees for an entire afternoon. But how much energy was actually saved? To know the answer for sure, the energy actually used must be compared with the energy that would have been used had the change not been made…but how does the building owner find that out?

The answer is provided by a “baseline energy model,” a statistical formula that, based on the analysis of previous energy use, takes into account the time of day, the day of the week, and the outdoor air temperature to predict the building’s energy consumption as a function of time if the building were operated normally.

When Environmental Energy Technologies Division (EETD) researcher Phillip Price began working with building energy data a few years ago, he discovered that the standard baseline energy models are very simple. “And that makes sense,” says Price, “because usually the only useful explanatory variable you have is outdoor air temperature. If your only variables are time and temperature, you may not get much benefit from a complicated model.” EETD is a Division of the Lawrence Berkeley National Laboratory (Berkeley Lab).

But he also discovered that the standard approaches have some flaws, so he made improvements. He developed a model that produces more accurate predictions in most buildings, but isn’t much more complicated than previous models. He and graduate student Johanna Mathieu published the model in 2011.

Price has worked with statistical models for 20 years at Berkeley Lab. In the early 1990s he became one of the foremost authorities on the spatial and statistical distribution of indoor radon, and co-developed state-of-the-art algorithms for mapping airborne pollutant concentrations using optical remote sensing data.

He turned his hand to statistically modeling electric load data several years ago. While developing his model, Price noted changes happening in the industry that dramatically increased the amount of energy use data available.

“Until not too long ago, the only energy use data people had were monthly use data,” Price said. “Now that we have ‘smart meters,’ we’ve started getting more data and that has opened up many possibilities. The idea was to look at what people were doing and see if we could do better—and that effort led to this model.”

Looking at energy use in much smaller increments of time created additional modeling needs. Price wanted to give building designers and operators a simple way to measure the outcome of energy efficiency measures, looking separately at the effect of temperature under different building occupancy and use scenarios, while also accounting for the many different pieces of equipment and technology in use.

“What you want in a decent model is something that treats temperature differently when the building is operating and when it is not,” Price said. “As an example, suppose we have a night and a day that are 75 degrees all the way through. At night, we wouldn’t be conditioning the building; during the day, we would be cooling it. The dependence on temperature is also something that changes,” he said.

“Also, even when the heating and cooling systems are operating, there is usually some range of temperature when you don’t have to cool or heat. Below that range you need to heat, above that range you need to cool. If the outdoor temperature gets warm enough, at some point the cooling system is working flat out and you can’t use more, so energy use stays the same. So rather than a straight-line dependence on temperature, it can be a curvy line,” he said.
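The behavior Price describes, separate time-of-week effects plus a temperature response with a dead band and a saturation point, can be captured with a simple regression. The sketch below is illustrative only: the change-point temperatures, variable names, and least-squares formulation are assumptions for demonstration, not Price’s published model.

```python
import numpy as np

# Assumed change points (deg F) marking the heating, dead-band,
# cooling, and saturation regimes; illustrative values only.
CHANGE_POINTS = (55.0, 65.0, 75.0)

def design(hour_of_week, temp, change_points=CHANGE_POINTS):
    """Regression matrix: one indicator per hour of the week plus a
    continuous piecewise-linear ("curvy line") temperature response."""
    tow = np.eye(168)[hour_of_week]            # 168 = hours in a week
    lows = (-100.0,) + tuple(change_points)    # segment lower bounds
    highs = tuple(change_points) + (200.0,)    # segment upper bounds
    segments = [np.clip(temp - lo, 0.0, hi - lo)
                for lo, hi in zip(lows, highs)]
    return np.hstack([tow, np.column_stack(segments)])

def fit_baseline(hour_of_week, temp, load):
    """Least-squares fit of metered load (kW) against time and temperature."""
    coef, *_ = np.linalg.lstsq(design(hour_of_week, temp), load, rcond=None)
    return coef

def predict_baseline(coef, hour_of_week, temp):
    """Predicted business-as-usual load for new conditions."""
    return design(hour_of_week, temp) @ coef
```

Fitting the model to a pre-measure training period and subtracting the metered load from the predicted baseline then yields an estimate of the savings from an efficiency measure.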

The model has been helpful to a number of industry organizations that have used his methods as a basis for business functions and tools.

Tom Arnold, co-founder and CEO of Gridium, a Bay Area company that helps commercial customers make sense of the flood of energy data provided by smart meters, ran across a paper Price had written about the model early in the company’s history.

“His model was a kernel that catalyzed our work,” Arnold said. “I think this is quite common; you get a little bit of government-funded research and it’s digested by the private sector,” he said.

Arnold said that he needed a model that applied time-series statistics to the energy use issues his customers face, and Price’s model was a simple way to begin. Now Arnold’s company analyzes data from 110 million square feet of buildings, using cloud computing to run tens of thousands of modeling scenarios across many servers and provide weekly analyses of building energy performance for customers.

A Berkeley company called QuEST also used the model as a foundation for their work. QuEST works in commercial utility programs to identify energy efficiency measures and verify energy savings.

“I had a research project from the California Energy Commission to develop a measurement and verification tool, so we needed an energy model,” said David Jump, principal at QuEST. “We talked with LBNL, Phil, and some others. They had already developed this modeling code for other projects, and we went with this one because we wanted to model energy use at intervals as frequent as 15 minutes over the period of a day,” he said.

Jump said they started with Pacific Gas & Electric’s desktop tool, Universal Translator, which is fairly well known in the energy efficiency industry, and then added measurement and verification capability based on Price’s model.

“I introduced our tool at the last ASHRAE meeting, and everyone wanted a copy,” Jump said. “Here is a tool that has soup-to-nuts measurement and verification capabilities, all open source. And it basically eliminates the need for highly skilled, high-priced people to do modeling work,” he said.

So far, companies that have been using Price’s approach have had to write their own computer code based on his publications, but that’s about to change: an improved version of the model is soon to be released as part of an open source software distribution.

Price continues to work on baseline energy models, and he is excited about a new approach he is developing.

“The simple model works fine for predicting energy consumption in the near future, but it’s not ideal for other applications such as detecting sudden changes in a building’s energy consumption pattern,” said Price. “The new approach is very well suited to that kind of application, but you pay a price in increased complexity. I think the simple model is going to be around for a while,” he said.

http://eetd.lbl.gov/news/article/57523/simple-and-elegant-building-ene

Researchers build nonflammable lithium ion battery

 Posted by Allan on February 10th, 2014

The following press release was published today on the University of North Carolina’s news website, and a paper reporting the results has been published in the Proceedings of the National Academy of Sciences. Read the full text at the link below. Nitash Balsara, faculty senior scientist at Lawrence Berkeley National Laboratory and professor of chemical and biomolecular engineering at the University of California, Berkeley, and his team are working with the UNC team to study the new electrolyte described in this story.

 

In studying a material that prevents marine life from sticking to the bottom of ships, researchers led by chemist Joseph DeSimone at the University of North Carolina at Chapel Hill have identified a surprising replacement for the only inherently flammable component of today’s lithium-ion batteries: the electrolyte.

The work, to be published in the Feb. 10 issue of the Proceedings of the National Academy of Sciences, not only paves the way for developing a new generation lithium-ion battery that doesn’t spontaneously combust at high temperatures, but also has the potential to renew consumer confidence in a technology that has attracted significant concern—namely, after recent lithium battery fires in Boeing 787 Dreamliners and Tesla Model S vehicles.

Read the rest at the link below: https://uncnews.unc.edu/2014/02/10/researchers-build-nonflammable-lithium-ion-battery/

Efficiency Entrepreneurs Approaching Max Tech

 Posted by Allan on February 10th, 2014

Smarter power plugs and appliances that can turn electronics on and off according to a homeowner’s behavior rather than a fixed schedule. A water heater that delivers water to the different fixtures in the house at temperatures customized for the particular water use. A smarter-than-ever thermostat that knows how to set different temperatures for different rooms in a house. High-pressure water jets that can cut magnetic metal to construct high-precision, high-efficiency motors at lower cost. These are just some of the energy efficiency technologies and prototypes that are being developed and tested as part of the 2013-2014 Max Tech and Beyond Design Competition. The annual competition, which is run by Lawrence Berkeley National Laboratory (Berkeley Lab) with funding from the Department of Energy’s (DOE) Building Technologies Office, challenges college students to design ultra-low-energy-use appliances and supports the education of the next generation of U.S. clean energy engineers.

Twelve teams from colleges and universities across the country are currently competing to build and test the most energy-saving prototype. Teams include a faculty lead and at least three students (undergraduate, graduate, or a mix). According to Berkeley Lab’s Robert Van Buskirk, who mentors the teams: “As a Department of Energy National Research Laboratory, Berkeley Lab helps the university teams understand the importance of their work as an element of a national technology innovation strategy. We do this by helping them understand the techno-economic yardsticks by which their prototypes may be judged.” Berkeley Lab staff discuss with each team their understanding of the “next best alternative” to their technology ideas and how they could beat the competition: by devising a more energy-efficient technology, a more cost-effective one, one that brings greater value to consumers, or some combination of all three.

“The Max Tech and Beyond Project also facilitates the market entry of successful prototypes developed by the teams through its Bridge to Market Program, a collaboration with the UC Davis Entrepreneurship Academy,” says LBNL’s Karina Garbesi, Principal Investigator of the Max Tech and Beyond Project. Last year three of the competition’s successful teams—from the University of Maryland (UMD), Ohio State University (OSU), and the University of Nevada, Las Vegas (UNLV)—attended the academy. The academy helps faculty and students understand the business startup process and provides a network to support successful business development.

Zeeshan Mohammad, a UNLV team member in 2012-2013 who attended the UC Davis Entrepreneurship Academy last September, decided to re-enter the competition this year both because of his belief in the competition and because of how much fun the first year was. “I enjoyed working with a group of people who were as dedicated as myself to succeeding and working hard,” says Mohammad. He believes that his work on such a practical, real-life project has boosted his career prospects, and he hopes to see the voltage controller he helped prototype installed in a commercial appliance.

The Max Tech Project is also raising the profile of successful teams and prototypes by showcasing them at DOE’s Solar Decathlon Expo. Two Max Tech teams exhibited in October 2013. The UMD team demonstrated their ultra-efficient two-stage heat pump clothes dryer, which won them first place in last year’s Max Tech Competition. The runner-up team from OSU presented their hybrid air/water conditioner, which marries a heat pump to deliver air conditioning with a component that recovers waste heat from the air conditioning cycle to heat water.

DOE is looking forward to this year’s teams presenting the results of their prototype development and testing in the spring, with the winners announced in late summer, after final reports are submitted. The competition is already fierce—much fiercer than the Broncos-Seahawks rivalry—so stay tuned! In the meantime, student teams from colleges and universities across the country are strongly encouraged to gear up for the Max Tech Competition for the 2014-2015 academic year by responding to the upcoming request for proposals, which will be available later in February. For more information about the Max Tech Competition, email maxtech@dante.lbl.gov.

Paper Explores Plasmonic Energy Conversion for Photovoltaics and Photocatalysis

 Posted by Allan on February 10th, 2014

While costs for some solar photovoltaics (PV) have dropped sharply over the past few years, the search for PV modules that are both low-cost and highly efficient continues. An important focus of that quest is on developing lower-cost, more-efficient methods of attaining electron-hole separation. Plasmonic energy conversion, which generates “hot” (highly energetic) electrons in plasmonic nanostructures through the electromagnetic decay of surface plasmons, is one promising solution. It offers potentially high conversion efficiencies (greater than 22 percent) while keeping fabrication costs low, given the right materials, architectures, and fabrication methods.

César Clavero, an Environmental Energy Technologies Division researcher at Lawrence Berkeley National Laboratory, recently surveyed the research on those topics, to determine the current state of the technology. The resulting paper, “Plasmon-induced hot electron generation in nanoparticle/metal oxide interfaces for photovoltaic and photocatalytic devices,” was published in the 30 January 2014 issue of Nature Photonics.

The review, which covered the fundamentals of hot-electron generation, injection, and regeneration in plasmonic nanostructures, found that two key factors promote high conversion efficiencies: fast hot-electron injection before recombination and optimum carrier regeneration. The research also suggests that by combining multiple metals and conducting oxides, devices could generate electricity from a broader portion of the solar spectrum, thereby increasing electricity production.

Clavero found that material, size, and shape of the plasmonic nanostructures are the most important design factors affecting the localized surface plasmon resonance (LSPR) electron-generation processes. The literature also suggests that using plasmonic energy conversion could solve the problem of efficiency decreases at higher temperatures, which affects conventional PV cells, because the efficiency with plasmonic structures actually increases with temperature.

While titanium dioxide has been the most-used semiconductor for plasmonic energy conversion, Clavero suggests that the valence bands of zinc oxide, cerium oxide, and silver bromide could also make them efficient electron acceptors. Further studies will need to determine the most efficient semiconductor material.

This work was supported by the Department of Energy’s Office of Energy Efficiency and Renewable Energy, Building Technologies Office.

A Q&A with Cesar Clavero on Plasmonic Energy Conversion

In his review article published in Nature Photonics, Cesar Clavero, a researcher in the Environmental Energy Technologies Division, examines plasmonic energy conversion, a phenomenon that has only been known about for a few years. Clavero examines the speculation that plasmonic energy conversion could be harnessed in a new generation of photovoltaic materials that could be far more efficient at converting solar energy into electricity than what’s currently in the marketplace.

What is plasmonic energy conversion?

In plasmonic energy conversion, light from the sun, in the form of photons, is trapped in plasmonic nanostructures on the surface of a specially designed thin film. Photons of certain wavelengths form “surface plasmons” within these nanostructures.

Some of the time the light is simply re-emitted as photons and radiated back to space. At other times, however, in a non-radiative process, the energy captured in the surface plasmons can be transferred to “hot electrons” and injected into a semiconductor to form an electric current. Only within the past decade or so have researchers thought this process could be harnessed as a more efficient way of generating electricity from solar energy.

Various research teams have observed this process taking place in particles of silver or gold deposited in tube nanostructures of titanium dioxide; however, the use of other materials, such as conducting oxides, would extend the range of applicability of this technology.

What’s the difference between this process and how an electric current is generated in conventional photovoltaic panels on the market today?

In a conventional PV panel, photons in the sunlight that have high enough energy are absorbed by electrons in the semiconductor film that forms the photovoltaic panel. The process forms an “electron-hole pair.” The electrons become mobile, resulting in the electric current, and the positively charged “holes” in the lattice of the semiconductor material maintain the overall charge balance of the material. This process has a theoretical maximum energy conversion efficiency that cannot be exceeded by simple improvements to the material.
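The absorption condition behind that theoretical ceiling can be stated compactly. The roughly 33 percent single-junction figure noted below is the standard detailed-balance (Shockley–Queisser) value, not a number from this interview.

```latex
% A photon of frequency \nu excites an electron-hole pair only if its
% energy clears the semiconductor band gap E_g:
h\nu \geq E_g
% Sub-gap photons pass through unabsorbed, and the excess energy
% h\nu - E_g of above-gap photons is lost as heat; together these two
% losses cap a single-junction cell at roughly 33 percent efficiency.
```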

Why is plasmonic energy conversion promising as a method of achieving higher efficiency of energy capture than in conventional semiconductor-based PV materials?

The physics of the plasmonic energy conversion process is fundamentally different from that of the photoelectric effect that generates current in the conventional PV panels. In plasmonic energy conversion, the process takes place on nanostructures, at the nanoscale. A surface plasmon-based photovoltaic material would be much thinner—instead of micrometers thick, it would be nanometers thick. This opens the possibility of PV panels with coatings that are much thinner and therefore considerably less expensive to manufacture than today’s panels, yet much more efficient at trapping energy.

Also, in the review article in Nature Photonics, I suggest that a wide range of metal oxides could use plasmonic energy conversion to capture energy from a broader range of wavelengths of the solar spectrum than conventional PV devices currently capture. Capturing energy across the whole solar spectrum, visible and infrared alike, helps increase the efficiency of these devices.

What are the barriers to exploiting plasmonic energy conversion in solar photovoltaic devices?

The field is in its infancy and there is much we don’t know about what materials are best at generating hot electrons from the solar spectrum, how to build and optimize nanostructures for maximum efficiency, and so on. But there are great opportunities for Berkeley Lab to explore a groundbreaking new field that could lead to the fabrication of much more efficient, cheaper solar PV devices. This research direction has the potential to cause a great leap in the use of solar photovoltaic technology to generate electricity.

What do you think are the next steps to advance this field?

A great window of opportunity has opened in the field of plasmonic energy conversion. The use of new plasmonic materials such as semiconductors and conducting oxides, combined with new architectures such as multijunction plasmonic solar cells, will allow us to push the energy conversion limits further while keeping fabrication costs low. Also, fundamental studies shedding light on the hot-electron generation, injection, and regeneration processes will be key to advancing this field.

Mark Wilson

 

Consortium for Energy Efficiency Members Take FLEXLAB Tour, Attend Energy Management Workshop

 Posted by Allan on February 10th, 2014

Lawrence Berkeley National Laboratory (Berkeley Lab) recently teamed with the Consortium for Energy Efficiency (CEE) to host a workshop on commercial energy management, and tours of Berkeley Lab’s new Facility for Low Energy eXperiments in Buildings (FLEXLAB). Tour attendees learned about how FLEXLAB can test the performance of emerging building technologies. About 40 people attended.

The workshop’s purpose was to address how Berkeley Lab and CEE are working together to integrate energy management information systems (EMIS) with whole-building-focused energy management programs. CEE introduced its plans to pursue a specification for commercial energy management information systems, and the Sacramento Municipal Utility District (SMUD) and Pacific Gas and Electric (PG&E) conveyed experiences from their energy management programs. Berkeley Lab discussed its EMIS testing and research capabilities, as well as the need for third-party validation of key areas of EMIS performance. Berkeley Lab’s work on EMIS has been funded by PG&E in partnership with QuEST (Quantum Energy Services and Technologies), with continued support from the Department of Energy’s Building Technologies Office.

Berkeley Lab’s Jessica Granderson co-led an afternoon session on energy baselining. It focused on what program administrators need in order to leverage emerging tools and devices that promise to streamline the measurement and verification (M&V) process, and was followed by small-group discussions. The day’s final session concentrated on quantifying the performance of whole-building and system-level measurement and verification approaches. A pre-workshop survey of participants confirmed that engineering calculations are applied much more often than before-and-after energy measurement approaches as a means of estimating savings from different energy conservation measures.

Mark Wilson

New Spectroscopic Technique Reveals the Dynamics of Operating Battery Electrodes

 Posted by Allan on February 10th, 2014

The following story appears on the website of Berkeley Lab’s Advanced Light Source. Three of the scientists involved in this research are based in Berkeley Lab’s Environmental Energy Technologies Division.

Developing high-performance batteries relies on materials breakthroughs. During the past few years, various in situ characterization tools have been developed and have become indispensable for studying and eventually optimizing battery materials. However, soft x-ray spectroscopy, one of the most sensitive probes of electronic states, has been mainly limited to ex situ experiments in battery research.

Recent ALS work could change this trend. Researchers have developed a new technique based on soft x-ray spectroscopy that could help scientists better understand and improve the materials required for high-performance lithium-ion batteries. The technique measures something never seen before: the migration of ions and electrons in an integrated, operating battery electrode.

Over the past few years, scientists have developed several ways to study the changes in a working electrode. These include techniques based on hard x-rays, electron microscopy, neutron scattering, and nuclear magnetic resonance imaging. But most of these methods track structural changes. They don’t track electron and ion dynamics directly, which is very important in the push to understand and optimize battery performance.

 

Read the rest here: http://www-als.lbl.gov/index.php/holding/882

Center of Expertise for Energy Efficiency in Data Centers’ New Website

 Posted by Allan on February 10th, 2014

The Department of Energy-led Center of Expertise demonstrates national leadership in decreasing the energy use of data centers. The Center partners with key public and private stakeholders, supplying know-how, tools, best practices, analyses, and technology introductions to help Federal agencies implement policies and develop data center energy efficiency projects.

The Center of Expertise is located at the Lawrence Berkeley National Laboratory. Berkeley Lab has long been recognized by industry as a leading source of technical expertise on energy efficiency in data centers. The Center partners with public and private organizations to advance joint initiatives benefiting both sectors, so that the adoption and deployment of energy-efficient technologies and best practices becomes the “new normal” for data center facilities managers.

View their new website at this link: http://datacenters.lbl.gov/.

Using Existing IT to Determine Office Occupancy and Reduce Energy Use

 Posted by Allan on February 10th, 2014

Placed thoughtfully and in sufficient number, building occupancy sensors can provide data sufficient to reduce building energy use, and the potential savings are enormous. However, the time and expense involved in installing and maintaining dedicated occupancy sensors can hinder their widespread use. But what if those savings could be achieved more simply and cheaply, without having to install building sensors, and the data needed to manage a building’s energy could be harvested from existing activities, such as keyboard use or the presence of a mobile phone?

Those were the questions that Bruce Nordman and Ben Rosenblum of Lawrence Berkeley National Laboratory’s (Berkeley Lab) Environmental Energy Technologies Division (EETD) sought to answer with collaborators Ken Christensen and Ryan Melfi (University of South Florida) and Raul Viera (University of Puerto Rico).

The project focused on automated solutions that could be used to reduce or switch off electricity to specific equipment when it is not being used. Rather than relying on explicit occupancy sensing (such as passive infrared or ultrasonic motion sensors), the team used implicit occupancy sensing that drew data from occupants’ interactions with the building’s existing IT infrastructure. One such strategy is to monitor network addresses in Wi-Fi access points and routers and correlate those addresses with the occupancy of a floor, area, or room of a building. Delivery of services such as HVAC and lighting could then be adjusted accordingly.
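As a concrete illustration of the Wi-Fi strategy described above, the sketch below counts the device addresses seen by each access point as a proxy for zone occupancy. This is a hypothetical sketch, not the team's actual implementation: the access-point-to-zone map, the device addresses, and the setback rule are all invented for the example, and a real system would poll APs or routers for their association tables.

```python
# Hypothetical sketch of implicit occupancy sensing from Wi-Fi data.
# The AP-to-zone mapping and device addresses below are invented for
# illustration; real association data would come from the building's
# Wi-Fi infrastructure.

from collections import Counter

# Map each access point to the building zone it serves (assumed layout).
AP_ZONE = {"ap-3f-east": "floor3-east", "ap-3f-west": "floor3-west"}

def estimate_occupancy(associations):
    """Count devices associated with each AP as a proxy for zone occupancy.

    `associations` is a list of (ap_name, device_mac) pairs as might be
    read from access-point or router logs.
    """
    counts = Counter()
    for ap, mac in associations:
        zone = AP_ZONE.get(ap)
        if zone is not None:
            counts[zone] += 1
    return dict(counts)

def hvac_mode(zone_counts, zone):
    """Suggest an HVAC setback when a zone appears unoccupied."""
    return "setback" if zone_counts.get(zone, 0) == 0 else "normal"

obs = [("ap-3f-east", "aa:bb:cc:00:00:01"),
       ("ap-3f-east", "aa:bb:cc:00:00:02"),
       ("ap-3f-west", "aa:bb:cc:00:00:03")]
print(estimate_occupancy(obs))                    # {'floor3-east': 2, 'floor3-west': 1}
print(hvac_mode(estimate_occupancy(obs), "floor3-east"))  # normal
```

A device count is a coarse occupancy signal (one person may carry several devices, or none), which is why the paper pairs it with other implicit sources such as keyboard activity.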

The team evaluated the feasibility of this approach at the Engineering Building at the University of South Florida and Berkeley Lab’s Building 90, and demonstrated an application of implicit sensing that showed its potential to sense the occupancy of a user workspace and automatically control its plugged-in devices. Three types of approach were considered: those requiring only a data collection and processing point, those that also required additional software, and those that required additional software and hardware.

The study showed that no-cost, implicit occupancy sensors were already available within the existing buildings and demonstrated the feasibility of implicit occupancy sensing. Advantages of using implicit sensors included avoided costs for sensor installation and maintenance, availability of sensor readings over existing IT networks, and a degree of occupancy resolution (count, identity, and activity) not available from dedicated sensors.

Future work will focus on identifying the level of accuracy needed for optimal control of various equipment and on demonstrating direct use of implicit sensing data in building controls.

“Using Existing Network Infrastructure to Estimate Building Occupancy and Control Plugged-in Devices in User Workspaces.” Christensen, Ken, Ryan Melfi, Bruce Nordman, Ben Rosenblum, and Raul Viera. International Journal of Communication Networks and Distributed Systems. January 2014. Vol. 12, No. 1, pp. 4-29.

Mark Wilson

http://nordman.lbl.gov/

 

Energy Department Updates Home Energy Scoring Tool for Advancing Residential Energy Performance

 Posted by Allan on February 10th, 2014

From the U.S. Department of Energy:

As part of the Energy Department’s commitment to helping families across the United States save money by saving energy, the Department announced today its first major software update to the Home Energy Scoring Tool, developed by the Department’s Building Technologies Office and Lawrence Berkeley National Laboratory.

The Home Energy Score allows homebuyers to compare homes on an “apples to apples” basis and provides recommendations for energy efficiency improvements. In addition, homeowners and homebuyers receive a cost-saving estimate of how these improvements could reduce utility bills and improve a home’s score. This provides homebuyers with the opportunity to undertake energy investments when improvements are most likely to take place—at time of purchase or within the first year of owning a home.

Through the Home Energy Scoring Tool, more than 8,500 homes have been scored by the Energy Department’s growing network of more than 25 partners and 175 qualified assessors.

After more than a year of implementation and feedback from program partners, the Energy Department made significant improvements to the scoring tool’s calculation methodology and user interface. To inspire greater investments in energy efficiency, the tool also provides more detailed and cost-effective recommendations to help consumers further improve their home’s energy efficiency.

The updated tool is more sensitive to local climate—collecting data from more than 1,000 weather stations nationwide compared to the 250 stations used previously. The Energy Department, through LBNL, is working with third-party software companies to license the Home Energy Score application programming interface (API) to build apps and other online resources that exchange data with the tool.

http://energy.gov/eere/buildings/home-energy-score

http://apps1.eere.energy.gov/news/progress_alerts.cfm/news_id=21193

Identifying Energy Efficiency Opportunities for Small Data Centers

 Posted by Allan on February 10th, 2014

Glossy photos in magazines and on the web tend to portray server rooms as large spaces with gleaming, symmetrical rows of servers on temperature-controlled racks. In reality, however, 57 percent of U.S. servers are housed in small spaces such as server closets and localized data centers, in what are commonly referred to as small server rooms. Such spaces comprise 99.3 percent of all server spaces in the United States. In contrast to large, consolidated server operations, which pursue energy efficiency as a strategy to minimize their operational costs, these small, decentralized server operations typically do not, and because their configurations are site-specific, it is challenging to develop efficiency measures that can be applied widely.

Given the great potential for energy savings, researchers from Lawrence Berkeley National Laboratory (Berkeley Lab) investigated how IT equipment was deployed, powered, and operated in small server rooms, and developed strategies to improve energy efficiency. The team surveyed 30 small server rooms across eight San Francisco Bay Area institutions, ranging from high-tech companies to academic and health care institutions, local governments, and a small business.

“We took a half hour to walk through each space with the owner or operator and collect data on how the equipment was configured and run,” explains Environmental Energy Technologies Division (EETD) researcher Iris Cheung. “Our hosts also provided background information on the room, to give us a sense of how the current configuration came to be. This helped us to identify potential barriers to energy efficiency improvements.”

Common Attributes, Missed Opportunities

Some commonalities arose as server spaces were surveyed, the most prominent being that most were not originally intended to operate as server spaces, and therefore the efficiencies inherent in dedicated server spaces were not present. Many also suffered from principal-agent problems, meaning that the utility bill was paid by someone other than the server operators or owners. In addition, server energy use was often not submetered, so server operators received little or no feedback on their energy cost, and therefore had no incentive to pursue energy-efficiency improvements.

Also notable was that company business operations often took precedence over energy efficiency concerns. These priorities were reflected in limited IT budgets that left older, less-efficient equipment in place longer. And even though consolidating the servers could have greatly improved energy efficiency, the motivation to do so was low: the equipment was often not used regularly, owners could not see the potential cost savings, and/or they wanted to keep the equipment close at hand.

Not surprisingly, server cooling turned out to be an area that offered potential for great improvement. Many server room temperature set points were lower than necessary, so energy was wasted by overcooling. Some rooms used dedicated cooling systems that ran 24/7, even in temperate climates, while others set the building’s cooling lower than it otherwise would be to ensure that the ambient air was cool enough to cool the servers. Few had separate hot and cool areas to minimize hot/cold air mixing and improve cooling efficiency, as is often the case in large server spaces, and none took advantage of cooler outside air to reduce the amount of mechanical cooling required.

A Closer Look Reveals Significant Inefficiencies

Once the surveys were complete, the team selected four sites (one at Berkeley Lab, one at the City of Walnut Creek, and two at Stanford University) for detailed assessments. “We chose these spaces because they broadly represented other small server room configurations and had high potential for efficiency improvements, good site access, and operator interest.”

The goal of assessing those four sites was to examine the IT infrastructure and systems in more detail. Using data-logging power meters on the circuits that supplied power to the equipment, the team measured IT, cooling, and other power-consuming equipment in each space, to determine actual power consumption and efficiency opportunities. To calculate power usage effectiveness (PUE), the researchers measured total server room power use, including lighting, power distribution, and uninterruptible power supply (UPS) losses wherever possible, and estimated power consumption or losses where direct measurement was not possible (see table).

“We found that PUE values ranged from 1.5 to 2.1,” says Cheung. “So in the upper range, the server room’s total power usage was about twice the amount of power used by its IT equipment.”

PUE Breakdown for the Four Sites

Server Room       Stanford University   Stanford University   LBNL       City of
                  333 Bon Air Siding    Alumni Center         90-2094    Walnut Creek
Cooling, kW       8.5 [1]               5.5 [2]               3.3 [1]    14.9 [1]
Lighting, kW      0.8 [2]               0.1 [2]               0.1 [2]    0.1 [2]
UPS loss, kW      1.8 [2]               1.7 [2]               0.1 [2]    1.3 [1]
Total load, kW    21.3                  17.2                  10.4       31.3
PUE               2.1                   1.8                   1.5        2.1

[1] Directly measured
[2] Assumed or estimated
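PUE is the ratio of total facility power to IT equipment power. The helper below is a simple sketch using the table's figures, with IT load inferred as total load minus the non-IT components; it approximately reproduces the reported values, up to rounding of the published component loads.

```python
def pue(total_kw, cooling_kw, lighting_kw, ups_loss_kw):
    """Power usage effectiveness: total facility power / IT equipment power.

    IT power is inferred here as total load minus the non-IT components
    reported in the table (cooling, lighting, and UPS losses).
    """
    it_kw = total_kw - (cooling_kw + lighting_kw + ups_loss_kw)
    return total_kw / it_kw

# City of Walnut Creek row: 31.3 kW total, 14.9 cooling, 0.1 lighting, 1.3 UPS loss
print(round(pue(31.3, 14.9, 0.1, 1.3), 1))  # 2.1
# LBNL 90-2094 row
print(round(pue(10.4, 3.3, 0.1, 0.1), 1))   # 1.5
```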

Identifying Energy Efficiency Strategies

Because businesses and institutions using small server rooms often have limited resources, the project first focused on low- or no-cost measures for improving their server room’s energy efficiency, which included raising cooling set points and better airflow management. More involved but still cost-effective measures included server consolidation and virtualization, and dedicated cooling with economizers.

Cheung explains: “We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers rather than server rooms. Otherwise, backup power requirements and IT and cooling efficiency should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation and the implementation of energy efficiency measures in small server rooms.”

Spreading the Word

To reap the significant energy and cost-saving potential of small server rooms, it’s necessary to communicate the energy-efficiency benefits (and how to achieve them) widely across the sector. So the team also presented its findings to server room operators, data center energy-efficiency professionals, industry organizations, utilities, and product vendors, with specific efficiency measures that could be applied to other server spaces. Collaborating with Stanford University and the Natural Resources Defense Council, Berkeley Lab also developed a simple and a more detailed version of a fact sheet that summarizes energy-saving solutions for small server room owners and operators (see below). In addition, they conducted workshops at data center conferences.

“Much of the inefficiency in small data centers can be traced to a lack of education about energy-efficient equipment and operation among server owners and operators,” says Cheung. “By training operators in energy-efficient IT, cooling, and power distribution, and by facilitating increased energy-efficiency visibility, server room energy efficiencies could improve significantly. We hope to build on this work by evaluating more of these spaces, identifying better, cost-effective tools to track server utilization, and by developing case studies that operators can use to increase the energy efficiency of their server rooms.”

Cheung, H. Y. Iris, Steve E. Greenberg, Roozbeh Mahdavi, Richard Brown, and William Tschudi. 2013. Energy Efficiency in Small Server Rooms. California Energy Commission.

Mark Wilson

 

No Evidence of Residential Property Impacts Near Wind Turbines According to Third Berkeley Lab Study

 Posted by Allan on February 10th, 2014

Researchers at Lawrence Berkeley National Laboratory (Berkeley Lab), along with the University of Connecticut, analyzed more than 122,000 home sales near 26 wind facilities (including more than 1,500 sales within a mile of operating turbines) in densely populated Massachusetts, and found no impacts on nearby home property values.

“This is the third of three major studies we have conducted on this topic [the first was published in 2009, and the second last August], and in all studies [using three different datasets] we find no statistical evidence that operating wind turbines have had any measurable impact on home sales prices,” says Ben Hoen, a co-author of the new report.

Hoen is a researcher in the Environmental Energy Technologies Division of Berkeley Lab.

One of the unique contributions of this most recent study is that impacts from turbines as well as a suite of other environmental amenities and disamenities were investigated. The study found strong evidence that highways, major roads, electricity transmission lines, open space and beaches impact property values, but no similar evidence was uncovered for turbines.

“When we find our model so accurately predicts impacts from other amenities and disamenities, we are considerably more confident of our findings for turbines,” says lead author Carol Atkinson-Palombo, Assistant Professor in the Department of Geography of the University of Connecticut.

This study, the most comprehensive to date in terms of number of transactions, builds on the two previous U.S.-wide Berkeley Lab studies as well as a number of other academic and published U.S. studies, which also generally find no measurable impacts near operating turbines.

“Although there have been claims of significant property value impacts near operating wind turbines that regularly surface in the press or in local communities, strong evidence to support those claims has consistently failed to materialize in all of the major U.S. studies conducted thus far,” says Hoen.

The research was supported by the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy and by the Massachusetts Clean Energy Center.

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.

Created by the Green Jobs Act of 2008, the Massachusetts Clean Energy Center (MassCEC) is dedicated to accelerating the success of clean energy technologies, companies and projects in the Commonwealth—while creating high-quality jobs and long-term economic growth for the people of Massachusetts. Since its inception in 2009, MassCEC has helped clean energy companies grow, supported municipal clean energy projects and invested in residential and commercial renewable energy installations creating a robust marketplace for innovative clean technology companies and service providers.

Additional Information:

Download the new 2014 UConn / LBNL report “Relationship between Wind Turbines and Residential Property Values in Massachusetts”

To register for a related webinar on the new 2014 LBNL / UConn report, at 12:30 PM Eastern Time on January 22nd, 2014, go here: Webinar

Download the 2013 LBNL report “A Spatial Hedonic Analysis of the Effects of Wind Energy Facilities on Surrounding Property Values in the United States”

Download the 2009 LBNL report “The Impact of Wind Power Projects on Residential Property Values in the United States: A Multi-Site Hedonic Analysis”

For more information on the report, contact Ben Hoen (bhoen@lbl.gov, 845-758-1896) or Carol Atkinson-Palombo (carol.atkinson-palombo@uconn.edu, 860-486-3023).

Buildings, Energy, Greenhouse Gas, Industrial and Policy Modeling and Simulation Tools Available from Energy Analysis and Environmental Impacts Department

 Posted by Allan on February 10th, 2014

Tools and models to find the best way to save energy and reduce greenhouse gas emissions in cities and industries, to follow the transport of pollutants through the environment, and to calculate the cost of power interruptions are among those available on a new Lawrence Berkeley National Laboratory (Berkeley Lab) web site.

The site brings together models and simulation tools developed by the Energy Analysis and Environmental Impacts (EAEI) Department of the Lab’s Environmental Energy Technologies Division.

“Our hope is that the site will facilitate greater technical awareness of the many analytical tools we have developed over the years, potentially leading to new opportunities for cooperation among stakeholders and sponsors,” said Charles H. Goldman, Leader of the Energy Analysis and Environmental Impacts Department.

The site lists tools according to research area (technology, environment, economics) and the relevant energy sector (buildings, industry, power, transportation, cross-sector).

A search tool in the left-hand margin of the page allows users to search for relevant tools by research area, sector, and the type of user who might be interested in the tool: industry practitioners, academic institutions, policy makers, state regulators, and utilities. By checking off the boxes under area, sector, and user type (or research group within the Department), the user can create a customized list of tools geared to his or her own interests.

The search tool helps users go directly to the tool with the capabilities they need, rather than search through a variety of pages.
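The faceted search described above amounts to tag-based filtering: each tool carries area, sector, and user-type tags, and a tool is shown only when it matches every facet the user has checked. A minimal sketch of that logic follows; the tool records and tag values are invented for illustration, not taken from the actual site.

```python
# Minimal sketch of the faceted filtering the site's search tool performs.
# The tool names and tag values below are invented for illustration.

TOOLS = [
    {"name": "ToolA", "area": "economics", "sector": "buildings",
     "users": {"policy makers", "state regulators"}},
    {"name": "ToolB", "area": "environment", "sector": "industry",
     "users": {"industry practitioners"}},
]

def filter_tools(tools, area=None, sector=None, user=None):
    """Return names of tools matching every checked facet (None = any)."""
    result = []
    for t in tools:
        if area and t["area"] != area:
            continue
        if sector and t["sector"] != sector:
            continue
        if user and user not in t["users"]:
            continue
        result.append(t["name"])
    return result

print(filter_tools(TOOLS, sector="buildings"))  # ['ToolA']
print(filter_tools(TOOLS))                      # ['ToolA', 'ToolB']
```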

The variety of tools available is broad and reflects the work conducted by dozens of researchers in the EAEI Department over many years of effort. Included among the 40 available tools are a tool for analyzing distributed energy resources, a utility tariff analysis tool, a tool for analyzing energy-efficient retrofit alternatives for commercial buildings, and tools for analyzing the energy efficiency gains and greenhouse gas reductions of various types of measures in a variety of industries, including pulp and paper, steel, and textiles.

http://eaei.lbl.gov/tools