Special Feature: 5 Questions for Grid Integration Group’s Sila Kiliccote

 Posted by Allan on December 9th, 2013

Standardizing Grid Signals for Integrated Demand-Side Management

Sila Kiliccote is Acting Leader of the Grid Integration Group at Lawrence Berkeley National Laboratory (Berkeley Lab). She has been a part of the automated demand response team developing an automated communication infrastructure, integrating it with building control systems and working with stakeholders to standardize the information model. Her areas of interest include characterization of buildings and demand reduction, demand responsive lighting systems, building systems integration and feedback for demand-side management. She has an MS in Building Science from Carnegie Mellon University and a BS in Electrical Engineering from the University of New Hampshire.

Her research areas include the smart grid, buildings-to-grid integration, and automated demand response. She is also Deputy Director of the California Energy Commission-funded Demand Response Research Center, located at Berkeley Lab.

 

Q: How would you describe what you do?

A: People want to operate their buildings at the lowest energy costs they can. What we’ve done is to develop a standard communication format for delivering electricity price signals to buildings to facilitate automated energy decision making.

Our research facilitates the delivery of price signals and reliability signals to buildings so that the building control systems can make decisions about their operations based on the cost of electricity. Our research also provides information on how building systems can respond to these signals. Providing price information and related controls options can save building owners and operators a considerable amount of money.

I always say, “no loads left behind.” We want to make sure that any load that has a little flexibility can participate in grid transactions. For example, take pumps in an industrial setting like a wastewater facility. During a time when the price of electricity is really high, we want to operate the pumps at lower speed and slow our production to ride through the high-price period. Or, for a cooling system, we can use a signal to tell the building energy management system to adjust the zone temperatures a little higher in the summer when prices are at a peak, and then adjust it back when prices go down.
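A minimal sketch of the kind of automated price response described here, assuming a hypothetical price feed, price threshold, and setpoint offset (real deployments use OpenADR signals and building-specific control strategies):

```python
# Illustrative sketch: raise a cooling setpoint when the electricity price is high.
# The threshold, offset, and prices below are hypothetical values for illustration only.

NORMAL_SETPOINT_F = 74.0   # normal summer cooling setpoint
OFFSET_F = 2.0             # temporary adjustment during high-price periods
PRICE_THRESHOLD = 0.30     # $/kWh above which the building sheds load

def cooling_setpoint(price_per_kwh: float) -> float:
    """Return the zone cooling setpoint for the current electricity price."""
    if price_per_kwh >= PRICE_THRESHOLD:
        return NORMAL_SETPOINT_F + OFFSET_F   # ride through the peak at a warmer setpoint
    return NORMAL_SETPOINT_F                  # restore normal comfort when prices fall

if __name__ == "__main__":
    for price in (0.12, 0.45, 0.18):
        print(f"price ${price:.2f}/kWh -> setpoint {cooling_setpoint(price):.1f} F")
```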

 

Q: How did you become interested in building systems integration?

A: In the 1990s I worked on the integration of building systems as a researcher at Carnegie Mellon University. There were certain barriers that were difficult to overcome in building systems at that time. For example, before we could really think about integration, efficiency of building equipment and operations needed to be increased. In the 1990s, “efficiency” meant efficient technologies, not necessarily efficient operations.

In 2004, I came to Berkeley Lab. With more complex systems being developed by then, like building energy storage and generation, as well as an increased complexity and availability of energy data, I knew that it was unrealistic for us to think about operating building interactions manually. This led to my interest in energy-efficient operations.

I believe there is a huge new opportunity to re-evaluate energy-efficient operations of buildings now, with the increased use of behind-the-meter generation and storage systems. Building automation systems can be repositioned as an integration platform for these systems and become true energy management and control systems.

 

Q: What part of your work is the most fun?

A: We have a vision of continuous automated energy management behind the meter with continuous price and reliability signals from the electricity grid. We can’t realize that vision unless we push the boundaries in markets and technologies.

It’s fun to think about the technologies that can help us realize this vision—to develop, test, and demonstrate what’s needed. For example, I’m involved in a project where we’re pushing boundaries in telemetry (automating communications to gather measurements and other data) by developing a single device that both collects energy measurements and communicates the data. In 2010, the cost of this technology per site was $70,000. In 2011, we lowered the cost to $100 per site. Now we are at $85.

There are other applications of the same concept in other areas: for example, we’re working with the U.S. Department of Defense on electric car fleet management, using prices and setpoint information from the California Independent System Operator to charge and discharge 40+ electric vehicle batteries.

 

Q: What changes have you seen in your field during the years you’ve been doing your work?

A: I started working at Berkeley Lab in 2004. While there was a lot of interest in demand response in California at the time, real changes came to the field when the Smart Grid concept was popularized and stimulus money started to flow into projects. People started thinking about the technologies that were needed to support the Smart Grid concept. That’s when the price of electricity became an issue.

The Smart Grid stimulus has been a great way for people to understand the relationship between loads and prices and the infrastructure needed for communications and controls. This time period also accelerated the development and adoption of standards including Open Automated Demand Response, an information exchange model, developed at Berkeley Lab and used for grid transactions. There are many companies with innovative products and services that build on technologies and analytics used in other fields but that are new in the energy arena. Ten years ago, Google was a search engine company, but now they have a keen interest in energy and resources to potentially change the technology landscape.  Compared to almost 10 years ago, there is also a better understanding of rates by customers. Most large customers understand their rates and control their peak demand to lower their risk of increasing demand charge costs.

 

Q: What’s ahead for your work?

A: The way I see it, there will be a lot of attention to optimization of systems behind the meter, as well as beyond the meter at the distribution system level. Our distribution system is not well instrumented, so we don’t understand how energy moves around on it. We’re not sure what the effects of climate issues and more severe weather might be on maintaining and using these services. There is a lot of integrated optimization work ahead, along with technology development, analysis, and market and policy work, to support adoption at scale.

I’m excited; we’re already working toward that with technologies like distributed energy resources, distributed controls, and communication. The challenge is to consider the system as a whole and develop a holistic approach to research for the adoption of affordable, reliable, and resilient clean energy systems.

 

Q: If you could get one message across, what would it be?

A: To create impactful research, simply developing the technology and proving that it works is not enough. To have real impact, we need to engage with all stakeholders in the innovation chain and form partnerships that can be leveraged to create that impact. Berkeley Lab’s proximity to Silicon Valley, as well as to world-renowned institutions, is a unique opportunity to get innovators involved with our science.

 

Contact:

Sila Kiliccote

Grid Integration Group

Demand Response Research Center (DRRC)

Lawrence Berkeley National Laboratory

1 Cyclotron Road MS 90R1121

Berkeley CA 94720

SKiliccote@lbl.gov

 

Five Questions for Sustainable Energy Systems Group’s Michael Sohn

 Posted by Allan on November 27th, 2013

Assessing Probability and Uncertainty for Smarter Decisions and Policies

In this first of a series of Q&As with researchers in the Environmental Energy Technologies Division, Michael Sohn talks about probability and uncertainty in modeling.

Michael Sohn is an environmental engineer working to understand how chemicals and energy are used in the world at various scales—global, state, and building. His research interests are in mathematical modeling of environmental systems and quality, uncertainty analysis, value-of-information decision analysis, water-energy integrated assessment, and sensor-data fusion.

Sohn has a PhD in Civil and Environmental Engineering and an MS degree in Engineering and Public Policy from Carnegie Mellon University. He also has MS and BS degrees in Mechanical Engineering. Before coming to Lawrence Berkeley National Laboratory (Berkeley Lab) 15 years ago, Mike worked at an environmental engineering firm where he conducted environmental health risk assessments. At Berkeley Lab’s Environmental Energy Technologies Division, he is deputy leader of the Sustainable Energy Systems Group and former leader of the Airflow and Pollutant Transport Group.

Q: Your work has to do with computer modeling, particularly creating algorithms that assess uncertainty. Why is assessing uncertainty so important?

A: Let’s say we believe that children are being exposed to pesticides but we don’t know the route. We can measure the toxins in a child, but how do we estimate the exposure pathways? There is no exact answer, but we can guess that exposures probably come from multiple pathways, and also that there is variability (or noise) in the measurements. The best we can do is to describe the likely, or predominant, pathways. Assessing uncertainty helps us understand how much (and how little) we know from the existing measurements and whether additional measurements are likely to improve our analysis.

Another example, relevant to my current work in buildings: suppose I manage many buildings of a similar type, say 50 or 60 across the United States. Suppose I want to weigh the costs and benefits of replacing the buildings’ heating systems with more efficient ones. Suppose also that analyzing each building one by one is cost prohibitive. Well, there is no exact solution, because one doesn’t exist. But that might be okay. If we can provide a strong estimate of the range (or uncertainty) in the expected energy change, the building owner, a financier, or a portfolio manager will have the information needed to assess financial risk in their cost-benefit analysis. Uncertainty assessment is often key to making good policy decisions.

Or, in another project I’ve worked on, we tried to determine the optimal placement of air monitoring sensors in buildings. We wanted to know: where do I place the sensors to maximize the probability of detecting some unforeseen chemical spill? This, too, turns out to be an uncertainty assessment problem, requiring an assignment of probabilities to spills that might occur and an assessment of the quality of the measurements that sensors might return. In this project, we placed a sensor near a location that was highly likely to have a spill, but the sensors in that location returned highly variable, or noisy, measurements. My colleagues and I had to develop an uncertainty assessment algorithm, based on Bayesian statistics, to maximize the likelihood of detecting the spills from one, two, three, or many sensors.
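As a rough illustration of the Bayesian reasoning described above (not the algorithm developed in the study), the toy example below updates the probability of a spill after repeated alarms from a single noisy sensor; the prior and detection probabilities are invented for the example:

```python
# Toy Bayesian update for spill detection from a noisy sensor (illustrative only;
# the prior and detection probabilities are made-up numbers, not values from the study).

prior_spill = 0.01          # prior probability that a spill has occurred
p_alarm_given_spill = 0.80  # the sensor detects a real spill 80% of the time
p_alarm_given_none = 0.10   # the noisy sensor false-alarms 10% of the time

def posterior_after_alarm(prior, p_hit, p_false):
    """Bayes' rule: P(spill | alarm)."""
    evidence = p_hit * prior + p_false * (1.0 - prior)
    return p_hit * prior / evidence

p = prior_spill
for alarm in range(1, 4):                     # three consecutive alarms
    p = posterior_after_alarm(p, p_alarm_given_spill, p_alarm_given_none)
    print(f"after alarm {alarm}: P(spill) = {p:.3f}")
```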

Q: What is Bayesian statistical analysis?

A: Bayesian statistics is a rather old statistical concept that has gained a lot of traction in the past few decades because of the vast availability of fast computers. The concept is powerful for solving decision analysis problems. These days, most engineering and statistics departments have an expert in Bayesian statistics, whereas only certain universities were known for this in the past. A key benefit of this kind of analysis is that it provides a transparent and quantitative approach to analyzing noisy and sometimes erroneous physical data. It’s quite the cat’s meow lately. It’s applied to many diverse topics (web search engines, predicting election outcomes), but of course my interests are in applying these methods to understand physical and energy systems.

Q: How did you become interested in the idea of uncertainty?

A: In my graduate studies, I lost interest in developing the more complex environmental models of the time, because I saw that these models were getting more and more difficult to verify or refute. I realized that an important gap in the field was the need for methods to compute how little one knows about the physics and to determine what data are needed to reconcile our lack of knowledge. I also saw that in many cases these complex models were not needed for decision making, and that a transparent and tractable model was often better, or good enough … that is, if we could also provide an assessment of uncertainty in the model predictions. Of course, complex models are extremely important in certain domains of study, just not for all analyses.

I also became very interested in probabilistic risk assessments. I realized that with faster computers becoming available, I could compute uncertainty for much more complex models and larger amounts of experimental data. These interests have led me to a research area called “value of information”—a fairly old but underutilized field in economics, and perhaps unappreciated in the physical sciences. For example, do I spend a lot of money to get one precise measurement, or a whole lot of coarse measurements? What’s the tradeoff there?

In the work I do now, considering retrofits for energy-efficient equipment in buildings, I try to create ways to answer the questions that a building operator or owner might have: like, how much energy will I save if I change my HVAC equipment, and how much will I have to pay to do it? Is it beneficial to replace the equipment based on my expected energy savings? If I have great uncertainty in energy savings, I want to assess the “risk” for spending that money.

Q: What is the most innovative thing you see happening right now in your field?

A: People are bandying the term “big data” around right now—but what does it mean? In the energy world, it can mean measuring everything about energy and buildings. Some people might think that “big data” is going to be the solution for many of our energy analysis questions—at building or national scale. I’m cautious. I think we need to get our hands around “big data”—and an important research area will be to explain what big data is for the applied energy fields.

Q: Where do you see your work going in the future?

A: I consider myself a scientist, and the “science” of the field is still developing. There is still a great need for developing algorithms and computer systems, making them approachable for non-technical users, and applying the questions about big data to current issues.

I’m also an engineer and am focused on applied energy problems. I see myself working more toward policy issues, energy distribution at the grid level, and understanding the interplay between water and electricity demands. We’re also looking at the potential impacts of climate change. There is great uncertainty and many unknowns about future climates and how they will affect the need for energy and water, which means that we need to make decisions now that plan for possible consequences. I suppose these applications have a recurring theme: we use measurements to understand and forecast the future of physical systems. I’m interested in what measurements we can use, and need, to make sound forecasts, so that decision makers have the best information they can to make important decisions.

 

Contact:

Michael Sohn

Sustainable Energy Systems Group

Lawrence Berkeley National Laboratory

1 Cyclotron Road MS 90R2002

Berkeley CA 94720

mdsohn@lbl.gov

 

Holistic Cell Design by Berkeley Lab Scientists Leads to High-Performance, Long Cycle-Life Lithium-Sulfur Battery

 Posted by Allan on November 19th, 2013

Battery could find use in mobile applications, and eventually, electric vehicles with 300-mile range

Researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have demonstrated in the laboratory a lithium-sulfur (Li/S) battery that has more than twice the specific energy of lithium-ion batteries, and that lasts for more than 1,500 cycles of charge-discharge with minimal decay of the battery’s capacity. This is the longest cycle life reported so far for any lithium-sulfur battery.

Demand for high-performance batteries for electric and hybrid electric vehicles, capable of matching the range and power of combustion-engine vehicles, is encouraging scientists to develop new battery chemistries that could deliver more power and energy than lithium-ion batteries, currently the best-performing battery chemistry in the marketplace.

For electric vehicles to have a 300-mile range, the battery should provide a cell-level specific energy of 350 to 400 Watt-hours/kilogram (Wh/kg). This would require almost double the specific energy (about 200 Wh/kg) of current lithium-ion batteries. The batteries would also need to last for at least 1,000, and preferably 1,500, charge-discharge cycles without showing a noticeable power or energy storage capacity loss.
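A back-of-envelope check of these targets, assuming a hypothetical vehicle consumption of 300 Wh per mile and a 250 kg budget for battery cells (neither figure comes from the article):

```python
# Back-of-envelope check of the 300-mile-range target.
# The vehicle consumption and cell mass budget are assumptions for illustration.

consumption_wh_per_mile = 300      # assumed drive energy for a mid-size EV
target_range_miles = 300
cell_mass_kg = 250                 # assumed mass budget for battery cells

required_energy_wh = consumption_wh_per_mile * target_range_miles   # 90,000 Wh
required_specific_energy = required_energy_wh / cell_mass_kg        # Wh/kg, cell level

print(f"energy needed: {required_energy_wh / 1000:.0f} kWh")
print(f"cell-level specific energy needed: {required_specific_energy:.0f} Wh/kg")
# about 360 Wh/kg, i.e. roughly double today's ~200 Wh/kg lithium-ion cells
```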

“Our cells may provide a substantial opportunity for the development of zero-emission vehicles with a driving range similar to that of gasoline vehicles,” says Elton Cairns of the Environmental Energy Technologies Division (EETD).

The results were reported in the journal Nano Letters, in a paper authored by Min-Kyu Song (Molecular Foundry, Berkeley Lab), Yuegang Zhang (Suzhou Institute of Nano-Tech and Nano-Bionics, Chinese Academy of Sciences) and Cairns (Environmental Energy Technologies Division, Berkeley Lab). The research was funded by the U.S. Department of Energy’s Office of Science, Basic Energy Sciences, and a University of California Proof of Concept Award.

 

Benefits of lithium sulfur, and challenges

“The lithium-sulfur battery chemistry has attracted attention because it has a much higher theoretical specific energy than lithium-ion batteries do,” says Cairns. “Lithium-sulfur batteries would also be desirable because sulfur is nontoxic, safe and inexpensive,” he adds. Li/S batteries would be cheaper than current Li-ion batteries, and they would be less prone to safety problems that have plagued Li-ion batteries, such as overheating and catching fire.

Development of the lithium-sulfur battery also has its challenges. During discharge, lithium polysulfides tend to dissolve from the cathode into the electrolyte and react with the lithium anode, forming a barrier layer of Li2S. This chemical degradation is one reason why the cell capacity begins to fade after just a few cycles.

Another problem with Li/S batteries is that the conversion reaction from sulfur to Li2S and back causes the volume of the sulfur electrode to swell and contract up to 76 percent during cell operation, which leads to mechanical degradation of the electrodes. As the sulfur electrode expands and shrinks during cycling, the sulfur particles can become electrically isolated from the current collector of the electrode.

 

Holistic cell design addresses chemical and mechanical degradation

The prototype cell designed by the research team uses several electrochemical technologies to address this array of problems. The cathode is composed of sulfur-graphene oxide (S-GO), a material developed by the team that can accommodate the volume change of the electrode active material as sulfur is converted to Li2S on discharge, and back to elemental sulfur on recharge.

To further reduce mechanical degradation from the volume change during operation, the team used an elastomeric binder. By combining elastomeric styrene butadiene rubber (SBR) binder with a thickening agent, the cycle life and power density of the battery cell increased substantially over batteries using conventional binders.

(See http://eetd.lbl.gov/news/article/56320/sulfur-graphene-oxide-material-for-lithium-sulfur-battery-cathodes)

To address the problem of polysulfide dissolution and chemical degradation, the research team applied a coating of cetyltrimethyl ammonium bromide (CTAB), a surfactant that is also used in drug delivery systems, dyes, and other chemical processes. The CTAB coating on the sulfur electrode reduces the ability of the electrolyte to penetrate and dissolve the electrode material.

Furthermore, the team developed a novel ionic liquid-based electrolyte. The new electrolyte inhibits polysulfide dissolution and helps the battery operate at a high rate, increasing the speed at which the battery can be charged and the power it can deliver during discharge. The ionic liquid-based electrolyte also significantly improves the safety of the Li/S battery, as ionic liquids are non-volatile and non-flammable.

The battery initially showed an estimated cell-level specific energy of more than 500 Wh/kg and maintained more than 300 Wh/kg after 1,000 cycles—much higher than that of currently available lithium-ion cells, which average about 200 Wh/kg.

“It’s the unique combination of these elements in the cell chemistry and design that has led to a lithium-sulfur cell whose performance has never been achieved in the laboratory before—long life, high rate capability, and high cell-level specific energy,” says Cairns.

The team is now seeking support for the continuing development of the Li/S cell, including higher sulfur utilization, operation under extreme conditions, and scale-up. Partnerships with industry are being sought.

The next steps in the development are to further increase the cell energy density, improve cell performance under extreme conditions, and scale up to larger cells.

“A long-life, high-rate lithium/sulfur cell: a multifaceted approach to enhancing cell performance,” in Nano Letters, by Min-Kyu Song (Molecular Foundry, Berkeley Lab), Yuegang Zhang (Suzhou Institute of Nano-Tech and Nano-Bionics, Chinese Academy of Sciences) and Cairns (Environmental Energy Technologies Division, Berkeley Lab).

This research was funded by the U.S. Department of Energy’s Office of Science and a University of California Proof of Concept Award.

###

The Molecular Foundry is one of five DOE Nanoscale Science Research Centers (NSRCs), national user facilities for interdisciplinary research at the nanoscale, supported by the DOE Office of Science. Together the NSRCs comprise a suite of complementary facilities that provide researchers with state-of-the-art capabilities to fabricate, process, characterize, and model nanoscale materials, and constitute the largest infrastructure investment of the National Nanotechnology Initiative. The NSRCs are located at DOE’s Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge, Sandia, and Los Alamos national laboratories. For more information about the DOE NSRCs, please visit science.energy.gov/bes/suf/user-facilities/nanoscale-science-research-centers/.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov

Berkeley Lab and University of New Mexico Collaboration Seeks to Improve Management of Self-Powered Private Facilities on the Electric Grid

 Posted by Allan on November 13th, 2013

November Special Focus: Energy Efficiency, Buildings and the Electric Grid

Distributed Energy Resources Expected to Grow Rapidly as Cost of Renewable Power and Natural Gas Declines 

Scientists in the Environmental Energy Technologies Division of Lawrence Berkeley National Laboratory are working with the University of New Mexico to ease the way for the seamless integration of self-generated electricity, thermal energy, or both (known as distributed energy resources, or DER) into buildings. The Cloud-based Energy Resource Scheduling (CERES) initiative at the University of New Mexico recently received a $250,000 grant from PNM, New Mexico’s largest electric utility, to advance DER research and training.

DER enables private companies, universities and other public institutions, as well as homes, to generate their own power from sources as varied as solar panels, wind turbines, combined heat and power plants, or simple natural gas or diesel generators. Energy can also be stored and discharged later when needed, for example through thermal energy storage, stationary battery systems, or electric vehicle batteries. DER can reduce energy bills for the owner, provide back-up power in the event of grid disturbances, and support the public grid if properly managed.

UNM will use EETD’s DER-CAM (Distributed Energy Resources Customer Adoption Model) software to optimize the scheduling of building onsite generation resources to minimize the cost of operation, the carbon emissions of the facility, or both. The project will train graduate students in UNM’s Center for Emerging Technologies in how to set up and use DER-CAM to manage on-site power generation, purchases from the electric grid, and building thermal systems for heating and cooling.

“The use of distributed energy resources is going to grow as the cost of both renewable and natural-gas fired power comes down, and large facilities and institutions gain experience with DER technologies,” says Michael Stadler in EETD’s Grid Integration Group. “The DER-CAM system we’ve developed at Berkeley Lab is designed to solve complex optimization problems and produces a dispatch that allows DER to mitigate the impact on the electric grid, minimize the cost to the operator of the facility, and lower the carbon footprint.”

The CERES program is a cloud-based system that communicates with buildings and facilities that use distributed energy resources. It will schedule the operation of these resources using output from the DER-CAM model to produce maximum benefit for the local energy generators at minimum cost and emissions, and to optimize the heating and cooling benefits for energy consumers.
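DER-CAM itself solves a detailed mixed-integer optimization across many technologies; the sketch below is only a toy greedy dispatch of a single battery against hypothetical hourly prices, meant to illustrate the general idea of scheduling a resource to cut electricity cost:

```python
# Toy battery dispatch against hourly prices. This is NOT DER-CAM; prices, loads,
# battery sizes, and thresholds are hypothetical values for illustration.

prices = [0.08, 0.07, 0.09, 0.12, 0.22, 0.30, 0.28, 0.15]   # $/kWh, 8 hours
load   = [40, 38, 42, 50, 65, 70, 68, 55]                    # building load, kWh per hour

capacity_kwh, power_kw, soc = 100.0, 25.0, 0.0
cheap, expensive = 0.10, 0.20                                # price thresholds, $/kWh

cost = 0.0
for price, demand in zip(prices, load):
    grid = demand
    if price <= cheap and soc < capacity_kwh:                # charge when power is cheap
        charge = min(power_kw, capacity_kwh - soc)
        soc += charge
        grid += charge
    elif price >= expensive and soc > 0:                     # discharge when it is expensive
        discharge = min(power_kw, soc, demand)
        soc -= discharge
        grid -= discharge
    cost += grid * price

print(f"total electricity cost with simple dispatch: ${cost:.2f}")
```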

UNM researchers worked with PNM to identify large commercial buildings that use energy at different times of the day. They will collect data to study how well the system works and estimate the energy saved by using it. A preliminary study of the UNM Mechanical Engineering Building suggests that savings of 5 to 10 percent of the utility bill are very realistic. Some results suggest savings of up to 30 percent.

Students at UNM will collect data and learn to operate the system with the help of UNM facility managers. A technical advisory team from Berkeley Lab-EETD led by Michael Stadler and Chris Marnay will provide assistance to the UNM team on the operation of the DER-CAM software. UNM graduate students have already spent time at Berkeley Lab learning how to use DER-CAM.

The results from this research will not only help building owners, but will also train UNM students in the management of distributed energy resources and help Berkeley Lab researchers improve DER-CAM’s ability to interface with buildings and distribution systems.

“A new business model is developing in the building energy management world,” says Stadler, “which is witnessing the merger of information technology with a managed approach to energy generation and consumption. One of the outgrowths of this model will be the improved integration of distributed energy in the electric grid. Another is that the public electric grid will be better able to use excess power from private self-generated sources and avoid potential problems.”

Chris Marnay adds: “As we drive down energy use in buildings, systems are becoming more complex, requiring sophisticated optimization to capture their full benefits. Our simultaneous desire to lower the environmental footprint of buildings while constraining energy bills will require very clever operational strategies, which is what our work is endeavoring to provide in an easily implemented form.”

—Allan Chen

http://microgrid.lbl.gov/der-cam

http://news.unm.edu/news/unm-pnm-work-together-on-energy-savings

The Retrocommissioning Sensor Suitcase Brings Energy Efficiency to Small Commercial Buildings

 Posted by Allan on November 12th, 2013

November Special Focus: Energy Efficiency, Buildings and the Electric Grid

Most buildings in the U.S. don’t perform as energy-efficiently as they could, simply because the energy-using equipment in the building has never been set up to maximize energy performance. Thermostat setpoints are too low or too high, so rooftop units (RTUs) cool buildings below recommended temperatures, or keep them too warm (or both). Or, there is no difference in the setpoint during hours when the building is unoccupied versus occupied—turning the heat and space conditioning down during unoccupied hours helps lower energy bills substantially. Lights may be left on at night when no one is in the building, or there may be daytime savings opportunities in spaces that are not continuously occupied.

These are only a few of the problems that energy performance professionals see in the field, problems they can correct through retrocommissioning—the process of assessing the energy performance of an existing building, then tuning its systems and implementing no- or low-cost energy efficiency improvements. When this is done to a new building, it is called commissioning. Research published in 2009 by scientists at Lawrence Berkeley National Laboratory (Berkeley Lab) demonstrated that in a large sample of existing buildings, retrocommissioning could save as much as 15 percent of a building’s annual energy use and pay for itself in less than a year through the resulting utility cost savings.

In large commercial buildings, where the cost-effectiveness of this process is highest, retrocommissioning is beginning to become more common, thanks to growing awareness of its economic benefits to building owners and operators, as well as a thriving industry of building energy performance professionals.

In smaller commercial buildings, efficiency efforts, including retrocommissioning, have been hampered by several factors. “Small commercial buildings do not typically have budget or business economics that allow investing in enhancements such as comfort and energy improvements,” says Jessica Granderson, a scientist in the Environmental Energy Technologies Division of Berkeley Lab. “They also don’t have in-house staff with the expertise in building systems who can perform retrocommissioning or identify improvement opportunities.”

Granderson, the Deputy Leader of EETD’s Building Technologies and Urban Systems Department, is working with Michael Brambley of Pacific Northwest National Laboratory to develop a technological solution: the Retrocommissioning Sensor Suitcase.

“The Suitcase,” she says, “is a turn-key hardware and software solution that non-experts can use to generate low or no-cost recommendations automatically on how to improve a building’s operating costs, comfort and energy performance.” The project is funded by the Department of Energy’s Office of Energy Efficiency and Renewable Energy, Building Technologies Office.

“The Retro-commissioning Suitcase project is a DOE-funded project to reduce the cost of delivering cost-effective, energy-saving retro-commissioning services to small and medium-sized buildings,” says George Hernandez, Chief Engineer, Building Technologies Office in the Department of Energy. “This is accomplished by ‘embedding’ the knowledge and skills of a highly experienced building commissioning practitioner into a scalable hardware and software package that can be easily deployed by a variety of building services personnel, making it easier for building owners and operators to reap the benefits and cost savings of building commissioning.”

The turnkey package under development in this joint Berkeley Lab-PNNL project contains a set of different types of portable, easy-to-install building sensors; a handheld smart pad for documenting sensor type, location, and placement; a battery; and a data control module that can receive and pre-process data from the sensors, which are distributed throughout the building. The data module communicates wirelessly with the smart pad, which is used to launch the sensors during installation. (See Figure 1.)

The Retrocommissioning Sensor Suitcase is targeted for use in small commercial buildings of less than 50,000 square feet of floor space that regularly receive basic services such as maintenance and repair, but don’t have in-house energy management staff or buildings experts. The hardware kit is designed to be easy for building maintenance staff, or other professionals such as telecom and alarm technicians, to use. The sensors in the suitcase include those for lighting, vibration (for measuring the condition of rooftop units), and various types of temperature sensors for internal and external areas of the building. (See Figure 2 – sensor platform prototype.)

In addition to the hardware kit, the turnkey package includes a software application to collect, process, store, and analyze the data. The kit’s user, or a third party such as an energy performance contractor, can use this software to generate specific recommendations on what actions to take to reduce the building’s energy cost and improve comfort.

“The Suitcase’s user would walk through the building, installing the sensors based on guidance from the hand-held,” says Granderson. “Simple instructions make it easy for the user to configure the sensors and document their type and location using the smart pad.”

After a month or so of automatic data collection, the user returns and collects the sensors, plugging them into sockets in the suitcase to download their data. (See Figure 3.) Entering other basic information into the suitcase’s computer, like energy consumption and costs from the building’s electricity bill, allows the software to generate recommendations on how to improve the building’s performance and how much energy could be saved by each measure. It’s then up to the building’s owner or operator to decide which measures to implement.
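As a simplified illustration of the kind of rule the analysis software could apply to the downloaded data, the sketch below flags a thermostat that shows no night setback; the occupancy schedule, threshold, and data format are hypothetical, not the actual Sensor Suitcase algorithms:

```python
# Simplified retrocommissioning check: flag zones whose logged temperatures show no
# setback during unoccupied hours. Schedule, threshold, and data are hypothetical.

occupied_hours = range(8, 18)          # assumed 8 a.m.-6 p.m. occupancy

def missing_night_setback(hourly_temps_f, min_setback_f=2.0):
    """True if unoccupied-hour temperatures are not at least min_setback_f cooler
    (heating season) than occupied-hour temperatures."""
    occ = [t for h, t in enumerate(hourly_temps_f) if h in occupied_hours]
    unocc = [t for h, t in enumerate(hourly_temps_f) if h not in occupied_hours]
    return (sum(occ) / len(occ) - sum(unocc) / len(unocc)) < min_setback_f

# One day of logged zone temperatures, held near 72 F around the clock:
day = [72.0] * 8 + [72.5] * 10 + [72.0] * 6
if missing_night_setback(day):
    print("Recommendation: program a night/weekend setback on this thermostat.")
```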

Status of Development

“Where we are now is that the proof-of-concept prototype is complete. We’re entering a second phase of work to test the prototype in the field and improve it based on what we learn,” says Granderson. The development team plans to make the hardware and software design available in the public domain, for transfer of the technology to partners who will move it into the marketplace.

This research is funded by the Department of Energy’s Office of Energy Efficiency and Renewable Energy.

—Allan Chen

 

View the figures here:  http://eetd.lbl.gov/news/article/57154/the-retrocommissioning-sensor-s

Estimating Policy-Driven Greenhouse Gas Emissions Trajectories in California: The California Greenhouse Gas Inventory Spreadsheet (GHGIS) Model

 Posted by Allan on November 4th, 2013

For decades, California has used groundbreaking tools to collect and analyze emissions data from a variety of sources to establish a scientific basis for policy making. As the state’s scope has expanded to include greenhouse gas (GHG) reductions, it has sought out similar tools to achieve the goals of legislation such as the Global Warming Solutions Act of 2006 (AB 32).

To support this effort, Lawrence Berkeley National Laboratory developed the California Greenhouse Gas Inventory Spreadsheet (GHGIS) model, funded by the California Air Resources Board (ARB), to explore the impact of combinations of state policies on state GHG and regional criteria pollutant emissions. This Excel-based model includes representations of all the GHG-emitting sectors of the California economy, and it was calibrated using data and projections from state agencies and other sources. The model also calculates emissions of three criteria pollutants—reactive organic gases, nitrogen oxides (NOx), and fine particulates—statewide and for the South Coast Air Basin (SCAB) and the San Joaquin Valley (SJV).

The project modeled three scenarios: (1) all committed policies, (2) additional, uncommitted policy targets, and (3) potential technology and market futures.
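To illustrate the spreadsheet-style structure of such a model (these are not the actual GHGIS sectors, values, or scenario assumptions), the sketch below sums sector emissions scaled by scenario-dependent policy factors:

```python
# Highly simplified inventory sketch: statewide emissions = sum over sectors of
# (baseline emissions x scenario policy factor). All numbers are placeholders.

baseline = {            # MtCO2e/yr, placeholder values only
    "transportation": 160, "electricity": 90, "industry": 85,
    "buildings": 45, "agriculture_other": 55,
}

scenarios = {           # fraction of baseline remaining in 2030 under each scenario
    "committed_policies":  {"transportation": 0.85, "electricity": 0.70,
                            "industry": 0.90, "buildings": 0.85, "agriculture_other": 0.95},
    "uncommitted_targets": {"transportation": 0.70, "electricity": 0.55,
                            "industry": 0.80, "buildings": 0.75, "agriculture_other": 0.90},
}

for name, factors in scenarios.items():
    total = sum(baseline[s] * factors[s] for s in baseline)
    print(f"{name}: {total:.0f} MtCO2e/yr in 2030")
```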

Results indicate that for all three scenarios California would be able to meet the 2020 statewide GHG targets, and by 2030, a plausible range of statewide GHG emissions falls between 208 and 396 megatons of carbon dioxide equivalent per year (MtCO2e/yr). However, the modeling revealed that the state will be unable to meet the 2050 GHG target of 85 MtCO2e/yr under any of the scenarios, so additional policies will need to be developed to meet this stringent target. In all three scenarios, results showed that criteria pollutants were significantly reduced statewide and in the two regional air basins; however, they may not meet future federal standards. In particular, NOx emissions were significantly above the estimated targets.

More Information

Jeffery Greenblatt, (415) 814-9088, JBGreenblatt@lbl.gov

Greenblatt, Jeffery B. 2013. Estimating Policy-Driven Greenhouse Gas Emissions Trajectories in California: The California Greenhouse Gas Inventory Spreadsheet (GHGIS) Model. http://eetd.lbl.gov/publications/estimating-policy-driven-greenhouse-g

 

Energy efficiency of small data centers

 Posted by Allan on October 23rd, 2013

Fifty-seven percent of U.S. servers are housed in server closets, server rooms, and localized data centers, in what are commonly referred to as small server rooms, which comprise 99.3 percent of all server spaces in the U.S. While many mid-tier and enterprise-class data centers are owned by large corporations that consider energy efficiency a goal to minimize business operating costs, small server rooms typically are not similarly motivated. They are characterized by decentralized ownership and management, and come in many configurations, which creates a unique set of efficiency challenges.

To develop energy efficiency strategies for these spaces, we surveyed 30 small server rooms across eight institutions and then selected four of them for detailed assessments, including some power measurements. These assessments revealed Power Usage Effectiveness (PUE) values ranging from 1.5 to 2.1. Energy-saving opportunities ranged from no- to low-cost measures, such as raising cooling set points and better airflow management, to more involved but cost-effective measures, including server consolidation and virtualization, and dedicated cooling with economizers.
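For reference, Power Usage Effectiveness (PUE) is the ratio of total facility energy to the energy delivered to IT equipment; a value of 2.0 means the room uses as much energy for cooling, power distribution, and other overhead as for the servers themselves. A quick sketch with made-up readings spanning the observed range:

```python
# PUE = total facility energy / IT equipment energy (lower is better; 1.0 is ideal).
# The monthly readings below are made-up examples, not measurements from the survey.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

rooms = {"server closet A": (1500, 1000), "server room B": (4200, 2000)}  # (total, IT) kWh/month
for name, (total, it) in rooms.items():
    print(f"{name}: PUE = {pue(total, it):.2f}")
```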

We found that inefficiencies mainly resulted from organizational rather than technical issues. Because of the inherent space and resource limitations, the most effective measure is to operate servers through energy-efficient cloud-based services or well-managed larger data centers rather than in small server rooms. Otherwise, backup power requirements and IT and cooling efficiency should be evaluated to minimize energy waste in the server space. Utility programs are instrumental in raising awareness and spreading technical knowledge on server operation and the implementation of energy efficiency measures in small server rooms.

—Iris Cheung

HYCheung@lbl.gov

Study estimates energy savings from bringing all U.S. homes up to code

 Posted by Allan on October 21st, 2013

Tightening the envelope of homes should save energy by reducing the loss of heat when the exterior is cold, and the loss of cooled air when it is hot. Until now, however, it has not been clear how much potential for energy savings exists in the current U.S. housing stock if all homes were brought up to codes requiring tightening of the building envelope.

A research team in the Residential Building Systems Group including Jennifer Logue, Brett Singer, Iain Walker, and Max Sherman has now estimated the potential energy savings of implementing airtightness improvements or absolute standards, as well as adding mechanical ventilation, throughout the U.S. housing stock. Logue, Singer, Walker, and Sherman are scientists in Lawrence Berkeley National Laboratory’s (Berkeley Lab) Environmental Energy Technologies Division (EETD).

The study predicts that tightening the envelope of all U.S. homes to the current average level of performance after an energy performance retrofit would reduce the residential sector’s site energy demand by 0.72 quadrillion BTUs (0.76 exajoules) annually, or more than 10 percent of the energy required to heat and cool homes. But the study also points to differences in the relative savings of homes depending on the climate zone they are located in, and other factors such as building type. It provides information that can help improve the success of energy efficiency programs designed to save energy through weatherization and air sealing of homes.

Codes require adding mechanical ventilation to well-sealed buildings to ensure that buildings have adequate air circulation, but this has the effect of slightly increasing the overall energy use of buildings. ASHRAE 62.2 requires whole-house mechanical ventilation in order to provide sufficient fresh air for occupant health. The standard also allows credits that reduce the required mechanical ventilation if the home is sufficiently leaky (i.e., natural airflow through cracks in the home is sufficient to provide ventilation). The study found that, for the existing housing stock, adding ventilation to homes that are not leaky enough to provide sufficient ventilation would increase annual site energy use in the stock by only 0.05-0.07 quads.

The U.S. has about 113 million homes, and in total they represent about 23 percent of U.S. energy use. Heating and cooling these homes requires about 5 quadrillion BTUs, or half of the residential sector’s total energy use. Reducing the waste of heating and cooling energy could save quite a bit of energy and money, and various voluntary building codes, such as the IECC building tightness standard, Canada’s R2000 Standard, and the Passive House Standard, are designed to provide guidance on acceptable levels of building leakiness.

 

Uses for policymakers

The results of the research are useful to policymakers: they point the way to optimizing the effectiveness of programs for improving the energy performance of homes and reducing energy costs by tailoring a program’s approach to factors such as the climate zone, household income, and a home’s physical parameters. “It gives policymakers a wedge to improve programs by comparing the incremental benefits of increasing air sealing effectiveness (or reaching more stringent airtightness targets) against the costs of achieving these higher levels of home performance,” says Logue.

The research team developed a computer model to calculate the change in energy demand for each home in a national sample of more than 50,000 virtual homes developed using data from the Department of Energy’s 2009 Residential Energy Consumption Survey. The sample’s virtual homes model a variety of home types found in each climate zone of each state in the U.S. According to the RECS, the U.S. housing stock contains 63.2% detached houses, 24.8% multi-family homes, 5.9% attached homes, and 6.1% mobile homes.

Using this cohort of U.S. homes, they estimated the energy impact of tightening building envelopes and adding mechanical ventilation for a typical meteorological year. The research uses the ASHRAE 62.2 standard as the basis for adding ventilation to homes—all homes were assumed to have been provided with the specified required levels of ventilation. Once all homes complied with ASHRAE 62.2, the research team studied the energy-saving impacts of upgrading to a variety of existing national and international standards on building envelope tightness.
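The stock-level aggregation can be pictured with a small sketch: each virtual home's modeled savings is weighted by the number of real homes it represents and summed to a national total. The home counts and per-home savings below are hypothetical, not RECS-derived values:

```python
# Sketch of the stock-aggregation step. The weights and savings are placeholders.

virtual_homes = [
    # (homes represented, modeled site-energy savings per home, million BTU per year)
    (4_000_000, 6.5),   # e.g., detached homes in a cold climate zone
    (9_000_000, 3.0),   # e.g., detached homes in a mixed climate zone
    (6_000_000, 1.2),   # e.g., multi-family homes in a mild climate zone
]

total_btu = sum(count * savings_mmbtu * 1e6 for count, savings_mmbtu in virtual_homes)
print(f"national savings: {total_btu / 1e15:.3f} quadrillion BTU per year")
```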

In the first scenario, the study calculates how much energy would be required to bring all homes in the U.S. to only the ASHRAE 62.2 standard, providing sufficient ventilation for good indoor air quality. Four other scenarios provide estimates of energy savings resulting from bringing U.S. homes up to the 62.2 standard plus a tightness standard or constraint. For example, one such scenario examines the result of improving the envelope airtightness of all homes to levels currently achieved by the Department of Energy’s Weatherization Assistance Program (WAP) and non-WAP energy efficiency programs while complying with ASHRAE 62.2. The WAP is DOE’s program aimed at improving the energy efficiency of low-income households’ homes, and is significant as one of the largest federal programs to improve home energy performance in the U.S.

Another “advanced” scenario assumes the tightening of envelopes as necessary to ensure that each house reaches the current 90th percentile tightness for homes with similar key characteristics while complying with ASHRAE 62.2.

Other scenarios looked at the energy savings from a combination of ASHRAE 62.2 and three major international standards: IECC 2012 (International Energy Conservation Code), Canada’s R2000, and the Passive House standard.

“Our study found that the annual energy impact of bringing the entire current stock of homes into compliance with ASHRAE 62.2 is relatively small; it would increase the annual site energy demand of the residential sector by less than one percent,” says Logue.

The research also predicts that the average tightening would reduce the residential sector’s site energy demand by 0.72 quadrillion BTUs (0.76 exajoules) annually, compared to the roughly 5 quadrillion BTUs annual energy use for heating and cooling all U.S. homes.

“We also estimated that using advanced methods of air sealing to bring all homes into the tightest 10 percent would double the energy savings of tightening at current average improvement levels, representing about $22 billion in savings,” she adds.

The researchers found that increasing the effectiveness of Weatherization Assistance Program and non-WAP retrofits to ensure that all homes reach 90th percentile air-tightness levels for homes of similar age and construction could double the energy impact of air sealing in these programs.

The value of this study to managers of energy efficiency programs at states, municipalities, and utilities, Logue believes, is that it provides a considerable amount of data about how much savings can result from tightening the envelopes of homes, by climate zone and home characteristics. It provides tools to weigh the costs of these programs against their benefits according to such factors as where the homes are located and what level of tightening is achieved.

This research was funded by the Department of Energy’s Office of Energy Efficiency and Renewable Energy.

The paper, “Energy impacts of envelope tightening and mechanical ventilation for the U.S. residential sector,” was written by Jennifer Logue, Max Sherman, Iain Walker, and Brett Singer of the Residential Building Systems Group.

http://www.sciencedirect.com/science/article/pii/S0378778813003460

Download the research as an LBNL report: http://homes.lbl.gov/sites/all/files/lbnl-6053e.pdf

For more information about the Residential Buildings Group of the Environmental Energy Technologies Division:

http://homes.lbl.gov/

Measuring Miscellaneous Electrical Loads in Buildings

 Posted by Allan on October 1st, 2013

‘Other’ is the fastest-growing category of energy use in residential and commercial buildings—devices such as computers, displays, printers, and other office equipment, as well as small household appliances ranging from kitchen electrics like coffee makers, toasters, and mixers to fans, clocks, and portable space heaters. To energy researchers these devices are known as miscellaneous and electronic loads (MELs), and about one-third of end-use electricity consumption in homes and commercial buildings is attributed to them.

Hundreds of devices fall under the MELs category, and their energy use is not well-understood, in part because of their great variety, and because the sensing technology to measure these individual loads is expensive ($200 to $300 per metering point) and cumbersome to install.

Researchers in the Environmental Energy Technologies Division (EETD) of Lawrence Berkeley National Laboratory (Berkeley Lab) have teamed with the University of California, Berkeley to study MELs, to better characterize their variety and estimate their load growth, and to develop better, inexpensive technologies for monitoring them. EETD’s Steven Lanzisera and Rich Brown have been leading the effort to decipher MELs’ impacts on buildings.

Understanding and reducing the energy use of MELs is a significant problem. The buildings industry is working toward designing, building, and operating very-low-energy to net-zero-energy buildings, and the load growth of MELs is pulling energy use in the wrong direction. Knowing which devices use the most energy, and what their load profiles look like—that is, how their energy use fluctuates throughout the day—will help researchers, and ultimately manufacturers, building operators, and homeowners, understand how to better manage the energy use of MELs.

Read the full-length story here: http://eetd.lbl.gov/news/article/56996/measuring-miscellaneous-electri

This research was conducted by Steven Lanzisera, H.Y. Iris Cheung, Judy Lai, Xiaofan Zhang, and Richard Brown, at Berkeley Lab’s Environmental Energy Technologies Division, and Stephen Dawson-Haggerty, Jay Taneja, Jorge Ortiz, and David Culler, of the University of California, Berkeley.

S. Lanzisera, S. Dawson-Haggerty, H.Y. Cheung, J. Taneja, D. Culler, R. Brown, “Methods for Detailed Energy Data Collection of Miscellaneous and Electronic Loads in a Commercial Office Building” Building and Environment, DOI: 10.1016/j.buildenv.2013.03.025, March 2013.

This research was funded by the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy.

http://plug-in.lbl.gov

Large-scale solar projects in the United States have made great progress in delivering competitively priced renewable electricity

 Posted by Allan on October 1st, 2013

Mark Bolinger (technical contact): (603) 795-4937, MABolinger@lbl.gov

Berkeley, CA — The price at which electricity from large-scale solar power projects in the western United States is being sold has fallen by more than two-thirds in the last five years, according to a new report released today by Lawrence Berkeley National Laboratory.  The report – “Utility-Scale Solar 2012: An Empirical Analysis of Project Cost, Performance, and Pricing Trends in the United States” – focuses exclusively on ground-mounted solar projects with capacity ratings greater than 2 MWAC.  Such large-scale solar projects are commonly referred to as “utility-scale” solar.

The progress revealed by the report comes despite the utility-scale segment of the market being relatively young. “Although a number of concentrating solar power (“CSP”) projects using parabolic trough reflectors have been operating in California since the 1980s, very few CSP projects have been built since then, and utility-scale photovoltaic (“PV”) projects have really only entered the market in the last five to seven years,” notes Berkeley Lab report author Mark Bolinger. In that short time, utility-scale PV has rapidly grown to become, for the first time in 2012, the largest segment of the overall PV market in the United States – a position it is expected to retain through at least 2016. Utility-scale PV has also far surpassed CSP as the dominant utility-scale solar technology, though a number of major new CSP projects – including two large power tower facilities and three large parabolic trough projects – are under construction and should reach commercial operation in the coming months.

As a result of this rapid growth, there is now a critical mass of utility-scale project-level data ripe for analysis.  This report – the first edition in a new Berkeley Lab series to be revisited annually – analyzes project-level empirical data in four key areas:  installed project costs or prices, operating costs, capacity factors, and power purchase agreement (“PPA”) prices.  “With the growth in the market in recent years, we are now able to systematically review actual market data to directly observe what large-scale solar projects cost to build and operate, how they are performing, and at what price they are selling electricity,” notes report co-author Samantha Weaver.  Data for some of these areas – operating costs in particular – are still sparse, but availability should improve greatly in the coming years as the strong pipeline of utility-scale solar projects – for both PV and CSP – continues to be built out.

Key findings in each of the four areas covered by the report include the following:

Installed Project Prices:  Installed PV project prices have fallen by nearly one-third since the 2007-2009 period, from around $5.6/WAC to $3.9/WAC (in real 2012 dollars) on average for projects completed in 2012 (with some projects higher and others lower, and with further reductions evident in 2013).  Most of the decline has been concentrated among projects using crystalline silicon (“c-Si”) modules or panels, as the cost gap between c-Si and thin-film modules steadily eroded over this period.  In response to falling c-Si module prices, there has been a marked increase in the proportion of projects using c-Si (rather than thin-film) modules.

O&M Costs:  Although publicly available O&M cost data are extremely limited at present, the data suggest that actual costs to date have largely been in line with pro forma operating cost projections.  For PV, O&M costs appear to be in the neighborhood of $10-$20/MWh.  For CSP (parabolic trough, no storage), O&M costs are higher due to the thermal components, and come in around $25-$30/MWh.

Capacity Factors: PV capacity factors vary by region, by module type (c-Si versus thin film), and by whether a project is installed at a fixed tilt or uses a tracking device. In some of the best locations in the Southwest, PV projects using single-axis trackers are able to achieve capacity factors in excess of 30%. In lieu of trackers, and enabled by the sharp decline in module prices, some projects have instead opted to oversize the PV array relative to the capacity rating of the inverters as a way to boost production during morning and evening hours (and thus increase overall capacity factor in AC terms). As expected due to their ability to perform well at high temperatures, thin-film projects appear to have higher capacity factors than c-Si projects in the Southwest, where ambient temperatures are highest, but not elsewhere. On the CSP side of the market, parabolic trough systems that have been operating in California for more than 20 years were still (in 2012) achieving capacity factors in excess of 20% (solar output only, with no storage and not counting fossil fuel augmentation), which is comparable to newer trough projects without storage. (A simple worked example of the capacity factor calculation appears after this list.)

Power Purchase Agreement (“PPA”) Prices:  Driven primarily by lower installed PV project prices (which, in turn, have been driven primarily by declining module prices), as well as expectations for further cost reductions in future years, levelized PPA prices have fallen by more than two-thirds since 2008, or by roughly $25/MWh per year on average.  Some of the most-recent PPAs for PV projects in the West have levelized PPA prices as low as $50-60/MWh (in real 2012 dollars), which, in some cases, is competitive with wind power projects in that same region.  PV appears to be particularly competitive when considering its time-of-delivery pricing advantage (i.e., the fact that solar generation correlates better with peak demand periods), which amounts to roughly $20/MWh (in California and at current levels of penetration) relative to a flat block of base-load power, and roughly $25/MWh relative to wind power.
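For readers unfamiliar with the metric, AC capacity factor is the energy a project delivers in a year divided by what it would deliver running at its AC nameplate rating every hour of the year. The sketch below uses hypothetical project numbers to show how tracking (or oversizing the DC array) raises it:

```python
# Capacity factor (AC) = annual energy delivered / (AC nameplate x 8,760 hours).
# The 20 MW-AC project and its annual output figures are hypothetical.

HOURS_PER_YEAR = 8760

def capacity_factor(annual_mwh: float, nameplate_mw_ac: float) -> float:
    return annual_mwh / (nameplate_mw_ac * HOURS_PER_YEAR)

print(f"fixed-tilt:          {capacity_factor(43_800, 20):.0%}")  # about 25%
print(f"single-axis tracker: {capacity_factor(54_000, 20):.0%}")  # about 31%
```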

The full report, along with a presentation slide deck that summarizes the report, can be downloaded for free from http://emp.lbl.gov/reports/re

This research was supported by funding from the U.S. Department of Energy’s SunShot Initiative.

Berkeley Lab report finds U.S. energy service companies experienced steady growth despite recession

 Posted by Allan on September 25th, 2013

Media Contact: Allan Chen (510) 486-4210

Technical Contact: Charles Goldman (510) 486-4637| CAGoldman@lbl.gov

 

Industry could more than double in size from ~$6 billion in 2013 to $11-$15 billion by 2020

Aggregate revenue growth rates for U.S. energy service companies (ESCOs) significantly outpaced U.S. GDP growth during the three-year period 2009 to 2011, according to a new report by researchers at Lawrence Berkeley National Laboratory (Berkeley Lab).

ESCOs primarily use performance-based contracts to provide energy efficiency, renewable and other energy-related services while guaranteeing that installed equipment, controls and other measures will deliver a specified amount of cost and resource savings to the customer.  Private and public sector ESCO customers have relied on performance-based projects for decades to reduce their operating costs and reap significant energy, water, and other savings, using little or no upfront cash.   For context, this industry typically saves customers each year: (1) the equivalent amount of energy consumed by nearly 2 million households; (2) more than 20 million tons of greenhouse gas emissions; and (3) more than $4 billion in utility bills.

According to The President’s 2013 Climate Action Plan, performance-based contracts “drive economic development, utilize private sector innovation, and increase efficiency at minimum costs to the taxpayer, while also providing long-term savings in energy costs.” Performance contracting allows customers to pay back the capital and financing costs of efficiency improvements over time, out of the stream of dollar savings generated by performance-based projects, reducing the need to use tax dollars or other appropriated funds to generate these savings.  Federal, state, and local policies that remove existing barriers and encourage the future use of performance-based contracts will remain vital to industry growth.

The research team analyzed the size of the U.S. ESCO industry by market segment, as well as growth projections and trends.  Researchers collected information from thirty-five ESCOs, publicly-available data on ESCO financial performance, and industry experts.

“The ESCO industry has experienced fairly steady growth since the 1990s, and despite the recession, continued to grow about 9 percent per year from 2009 to 2011,” said Elizabeth Stuart, a researcher in Berkeley Lab’s Electricity Markets and Policy (EMP) Group in the Environmental Energy Technologies Division (EETD) and lead author of the report.

“We anticipate that U.S. ESCO industry revenues could double in size between today and 2020,” said co-author Peter Larsen, an economist at Berkeley Lab. “Based on historical trends, it is possible that the industry could grow 8 to 12 percent annually depending on a number of scenarios, potentially achieving revenues of more than $15 billion in 2020. There are a number of factors that could impact the industry’s ability to achieve this expected growth, including clean energy and infrastructure modernization policies as well as expansion of ESCO services that take advantage of emerging opportunities.”

ESCOs provided estimates of the total building floor area in each customer segment that had received performance-based energy efficiency retrofits since 2003. Market penetration was highest in the K-12 schools market (42 percent penetration) and lowest in the private commercial buildings sector, where about 9 percent of eligible building space was estimated to have received retrofits since 2003.

The research team also estimated the remaining market potential for ESCOs.  “If ESCOs were able to retrofit the remaining floor space, the investment potential in facilities typically addressed by the ESCO industry ranges from about $71 to $133 billion,” said co-author Charles Goldman, Department Head at Berkeley Lab. “The private commercial sector, K-12 schools and healthcare facilities are the markets with the largest remaining investment potential.”

“There is still a significant market for ESCOs working in the government and universities market segments,” said report co-author Donald Gilligan, President of the National Association of Energy Service Companies. “ESCOs have a strong track record working in these markets – federal, state and local – and we expect clean energy policies to continue to drive demand for the services that ESCOs offer these customers.”

Other key findings from the report include:

  1. About 45 companies operating in the U.S. met our strict definition of an ESCO.
  2. Performance-based contracts made up about 70 percent of ESCOs’ business in 2011, while 15 percent of revenue came from non-performance-based projects, 7 percent from administering energy efficiency programs for utilities, and just under 4 percent each from consulting and renewable power purchase agreements.
  3. Public and institutional markets (federal, state and local governments, K-12 schools, healthcare/hospital facilities, and colleges and universities) continue to be ESCOs’ primary customers, accounting for about 84 percent of 2011 industry revenue.  About 8 percent of 2011 revenues came from private commercial customers.
  4. ESCOs reported a significant decline in revenue from renewable generation projects since 2008, both in terms of percent of total revenues (from about 15 percent in 2008 to 6 percent in 2011) and absolute dollar amounts (from about $560 million in 2008 to $250 million in 2011).
  5. The U.S. ESCO industry is similar in size to industries in Germany and France (about $4 to $5 billion), and China (about $4 to $7 billion in 2012), though definitions of ESCOs and revenue reporting practices vary across countries.
  6. Small ESCOs reported that about 15% of their projects relied on funds from some type of federal program since 2009. Medium and large ESCOs reported that about 30% of their projects relied on federal programs.

The full report, “Current Size and Remaining Market Potential of the U.S. Energy Service Company Industry,” is available for download here.

The work was funded by the Department of Energy’s Office of Weatherization and Intergovernmental Programs (OWIP) within the Office of Energy Efficiency and Renewable Energy (EERE).  More information about Berkeley Lab’s research on the ESCO industry may be found here.

 

CalCharge and Berkeley Lab Sign Groundbreaking CRADA

 Posted by Allan on September 18th, 2013

September Special Focus: CalCharge

In preparation for CalCharge’s mid-October launch, CalCharge and Lawrence Berkeley National Laboratory (Berkeley Lab) have completed work on an innovative Cooperative Research and Development Agreement (CRADA) that is expected to become a national model for industry engagement with the national laboratories.

Energy storage innovators need cost-effective and clear pathways to access the expertise, facilities, and resources needed to accelerate the development and commercialization of their technologies.  Through its Technology Assessment and Acceleration programs, CalCharge is forging a series of strategic relationships that will better leverage existing resources and dramatically expand access to the array of world-class research and testing facilities found in the national labs, universities, and other organizations in California.

Working with Berkeley Lab, CalCharge has finalized the first of these signature relationships.  Through this streamlined CRADA, member companies will be able to access services, facilities and personnel at Berkeley Lab significantly faster, and at a lower expected cost, than through traditional bilaterally negotiated contracts.

Through this best-in-class CRADA, CalCharge members will be able to:

  • Develop and commence a collaborative research project in a fraction of the time required to negotiate an individual company CRADA.
  • Design smaller scale projects than would be cost effective through an individual company CRADA.
  • Divide larger projects into tiered stages that can be conducted in rapid sequence.
  • Designate intellectual property generated during the project as Protected CRADA information and prevent its public disclosure and publication.
  • Obtain an exclusive license and/or title to any subject inventions developed during the cooperative research project.

For more information on CalCharge and to request membership information when it becomes available, please email membership@calcharge.org

CalCharge Prepares for Launch

 Posted by Allan on September 16th, 2013

After nearly two years of research and organizational development efforts, Lawrence Berkeley National Laboratory (Berkeley Lab), CalCEF, and other key partners are only weeks away from the full operational launch of CalCharge. By mid-October, they expect to announce the core group of institutional and major corporate members along with the first tranche of emerging company members.

California is home to more than 95 battery and electrochemical storage (energy storage) companies, ranging from startups to global corporations focused on the technologies and components needed for the transportation, grid, and consumer electronic markets. As one of the largest and most dynamic concentrations of energy storage companies in the US, the California cluster has an outsized impact on industry and market development globally. However, as with any emerging industry, there are critical gaps in the existing ecosystem that are impairing its development.

To identify and develop solutions to these challenges, Berkeley Lab and CalCEF conducted extensive research from 2011 to 2013. This included not only general market analysis, but also direct engagement and feedback from a broad cross-section of business, public, and civil society stakeholders from across the California energy storage cluster. The result—CalCharge.

CalCharge is a groundbreaking public-private partnership working to accelerate the development, commercialization, and adoption of new energy storage technologies for the consumer, transportation, and grid markets. It will bring together emerging and established companies, academic and research institutions, government agencies, and other key stakeholders to increase the growth of the energy storage sector and spur the creation of advanced manufacturing capacity and processes. Through CalCharge, members will have access to Technology Assessment and Acceleration, Professional Development, Pre-Commercialization Support, and Ecosystem Facilitation programs. The goal—a thriving California energy storage cluster that is a key driver of industry and market growth globally.

For more information on CalCharge and to request membership information when it becomes available, please email membership@calcharge.org

Daylighting Window Film Shows Potential to Significantly Reduce Lighting Energy Use in Buildings

 Posted by Allan on September 4th, 2013

Daylighting is the strategy of admitting light from the sun and sky to reduce the use of electric lighting in buildings. Since lighting energy use represents 13 percent of the total primary energy used by buildings in the United States, or 5.42 quadrillion Btus in 2010, daylighting technologies can play a significant role in meeting U.S. and state energy-efficiency and greenhouse gas emission reduction goals. Conventional windows cannot provide useful daylight beyond about one to one and a half times the head height of a window, because interior shades, when lowered to control direct sun and glare, diminish daylight penetration. Daylight-redirecting technologies counter this problem, increasing illuminance deeper in the room from vertical clerestory windows by redirecting sunlight (and diffuse light) toward the ceiling plane. Lack of performance data has severely limited the uptake of these technologies in the marketplace and slowed innovation.  Architects, engineers, and building owners are typically unwilling to take the risk of adopting emerging technologies without clear evidence that they perform well.


Berkeley Lab’s Windows Testbed Facility.

Lawrence Berkeley National Laboratory (Berkeley Lab) has been collaborating with the window industry to develop and evaluate innovative daylighting technologies that can reduce lighting energy use by as much as 50 percent up to 40 feet from windows. Since lighting is often the single largest energy use in commercial buildings, these technologies could make a significant contribution to reducing the nation’s energy use. Researchers in Berkeley Lab’s Environmental Energy Technologies Division (EETD) are using simulation tools (Radiance, Window, EnergyPlus, COMFEN) and new measurement facilities to accurately assess where and how much solar radiation and daylight flux can be effectively controlled by innovative new optical materials and systems as they are redirected into the building’s interior. Using these tools, calculations of energy use and visual discomfort can be done more accurately and in a fraction of the time needed in the past. As a result, industry partners can now determine how well new optical designs will work long before they invest a lot of time and resources into prototype fabrication and testing in the field.

“There’s a large potential to speed up the time to market and reduce the cost of development of new energy-efficient technologies through the use of these simulation tools,” says Andrew McNeil, Senior Scientific Engineering Associate in EETD.

An example of the benefits of this approach is a collaboration Berkeley Lab has developed with the 3M Corporation. 3M developed a microstructured prismatic film consisting of linear multi-sided prisms 50 to 250 micrometers high. Results from simulation analysis indicate that a small clerestory window with the 3M dual-film system and a lower window with conventional shades can daylight a 40-foot-deep perimeter zone facing south, east, or west in virtually all U.S. climates and save up to 40 percent of annual lighting energy compared to the same zone with no daylighting controls.  EETD researchers corroborated these findings with measured data in Berkeley Lab’s Advanced Windows Testbed facility. 3M has initiated partnerships with window manufacturers to incorporate the new film in new and retrofit applications in commercial and residential buildings.

“We are very excited to have been able to collaborate with Berkeley Lab over these past three years. When we launch our product in Fall 2013, we hope to see accelerated adoption of our products since we will be able to explain the energy-efficiency and comfort impacts of this innovation with confidence to potential customers,” said Raghunath Padiyath, Lead Product Development Specialist of the 3M Renewable Energy Division.

Because the new software vastly increased calculation speed, McNeil was able to derive a further-optimized single-film design using Berkeley Lab’s 128-processor computing cluster, genetic algorithms, and custom optimization code. This design is expected to produce the same level of performance at lower cost, pending outcomes from field studies that are now in progress. 3M is evaluating this design for production.

The U.S. Department of Energy and the California Energy Commission, through its Public Interest Energy Research (PIER) Program, provided funding for both the development of the simulation tools and the field tests conducted on behalf of industry. The simulation tools are available to the public at no cost and have been used to assist other U.S. and California-based manufacturers with new product development. These clean technology investments are designed to create jobs, make businesses in the U.S. and California more competitive worldwide, and help achieve aggressive federal and state energy and greenhouse gas reduction goals that benefit consumers and businesses through lower utility bills.

This study was conducted by Andrew McNeil, Jacob Jonsson, Anothai Thanachareonkit, and Principal Investigator Eleanor Lee (Berkeley Lab), in collaboration with Raghunath Padiyath and Doug Huntley (3M Renewable Energy Division) and Bing Hao (3M Corporate Research Materials Laboratory), with support from the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy and the California Energy Commission through its Public Interest Energy Research (PIER) Program.

 

Notes

Site lighting energy use with a small clerestory aperture (window-to-wall ratio, WWR = 0.18) over a 40-ft-deep perimeter zone facing south, east, or west in northern and southern U.S. climates.  Occurrence of discomfort glare is less than 5% of annual occupied hours.  Simple payback is 5 years, the internal rate of return (IRR) is 19%, and the cost of conserved energy (CCE) is $0.08/kWh, assuming an installed cost of $20/ft2, an electricity price of $0.20/kWh, a 30-year life, and a 6% discount rate.
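
For readers unfamiliar with these figures of merit, the sketch below shows how simple payback and the cost of conserved energy are conventionally computed from an installed cost, an electricity price, a measure life, and a discount rate; the annual-savings input is a hypothetical value chosen only to make the arithmetic concrete, not a number taken from the study.

    # Illustrative cost-effectiveness arithmetic; the annual savings figure is hypothetical.

    installed_cost = 20.0      # $/ft2 (from the note above)
    energy_price   = 0.20      # $/kWh (from the note above)
    lifetime_yrs   = 30        # years (from the note above)
    discount_rate  = 0.06      # (from the note above)
    annual_savings = 18.0      # kWh/ft2-yr -- hypothetical input chosen for illustration

    # Simple payback: first cost divided by annual dollar savings.
    simple_payback = installed_cost / (annual_savings * energy_price)

    # Cost of conserved energy: annualized first cost per kWh saved.
    crf = discount_rate / (1 - (1 + discount_rate) ** -lifetime_yrs)  # capital recovery factor
    cce = installed_cost * crf / annual_savings

    print(round(simple_payback, 1), "years payback,", round(cce, 3), "$/kWh CCE")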

 

Reports

3M performance

  • A. Thanachareonkit, E.S. Lee, A. McNeil, Empirical assessment of a prismatic daylight-redirecting window film in a full-scale office testbed, Accepted for presentation to the IESNA 2013 Annual Conference, Huntington Beach, CA, October 26-29, 2013 and for publication in Leukos, the journal of the IESNA. http://eetd.lbl.gov/daylight/daylight-field-test.pdf
  • A. McNeil, E.S. Lee, J.C. Jonsson, Daylight performance of a microstructured prismatic window film in deep plan open plan offices, to be submitted to Solar Energy.  Summary. http://eetd.lbl.gov/daylight/daylight-summary.pdf

Simulation Tools

 

Bringing Energy Efficiency to High Performance Computing

 Posted by Allan on September 3rd, 2013

The ability of high performance computers (HPCs) to solve complex applications very quickly has risen exponentially since their introduction in the 1960s; unfortunately, so has their electricity use. Many supercomputers require more than a megawatt of electricity to operate, and annual electricity costs can easily run into millions of dollars. As the use of HPCs became more widespread, researchers at Lawrence Berkeley National Laboratory (Berkeley Lab) saw the need to improve energy efficiency of supercomputers and the infrastructures that support them.

Berkeley Lab researchers organized the Energy Efficient High Performance Computing Working Group (EE HPC WG) in 2008 to promote energy-efficient computing best practices and to drive improvements in energy performance of HPC systems. At the time, the concept of energy-efficient computing was often a distant afterthought in the race to improve supercomputer computational performance as quickly as possible.

“We were convinced that bringing U.S. Department of Energy (DOE) national laboratories together to demand more efficient supercomputers would bring the issue to the forefront for supercomputer developers and vendors,” says Bill Tschudi, leader of the High Tech and Industrial Systems Group at Berkeley Lab. “As a significant segment of the HPC market, the national labs were interested not only in spurring more efficient designs and equipment, but also in reducing their own energy bills—costs that were siphoning money from their mission.”


William Tschudi

The DOE’s Federal Energy Management Program (FEMP) provided funding for Berkeley Lab to start the group, which hoped to serve both as a united front to promote energy efficient computing and as a forum for sharing best practices. The strategy worked. Awareness of the need for energy efficient HPC grew, which sparked competition among vendors to improve HPC energy efficiency even before end users asked for it. Today, realizing the benefits of energy-efficient HPCs, end users are putting requirements in proposal requests, and vendors are not only responding to those requests, but are also participating in many of the EE HPC working group’s activities.

Today, Berkeley Lab continues to provide ideas and lead the working group, which is now supported by the DOE Sustainability Performance Office. The group’s members—over 380 of them from 20 countries—participate voluntarily and self-select topics of interest to the group. Members include representatives from other federal agencies, universities, private industry, and vendors of HPC and data center equipment, including prominent companies such as Intel, Emerson, IBM, Cray, and others.

Grassroots Collaboration

The EE HPC WG consists of three subgroups: one focused on infrastructure, another on systems, and a third on outreach and conferences.

  • Dale Sartor, from Berkeley Lab, and Natalie Bates, a Berkeley Lab sub-contractor, co-lead the Working Group and oversee the working group’s general activities.
  • Bill Tschudi of Berkeley Lab and David Martinez of Sandia National Laboratories co-lead the Infrastructure Subgroup.
  • Berkeley Lab’s John Shalf and Erich Strohmaier co-lead the Computing Systems Subgroup.
  • Lawrence Livermore National Laboratory’s Anna Maria Bailey and Marriann Silveira co-lead the Conferences Subgroup.

Within each of those subgroups, small teams are formed to address specific issues. Examples are the HPL Power/Energy Measurement Methodology team, Liquid Cooling Commissioning team, and HPC Demand Response team, which all meet (virtually) multiple times each month.

“Working group members participate in whichever sub-group project interests them and can benefit from their expertise,” explains Tschudi. “Once the subgroup members have made progress on their issue, they present it to the larger group for feedback and further development.” Once completed, the groups disband and typically select other topics of interest.

Members drive the working group’s agenda, which ensures that the projects meet the membership’s most pressing needs. A March 2013 member survey showed that members identified 12 out of the 14 activities currently being conducted by the group as high-value activities. In that same survey, more than half of the members identified “improving software to tune for energy efficiency” as an activity to pursue in the future.

The working group as a whole meets (virtually) bi-monthly, but occasionally meets in person at supercomputer conferences such as the SC Conference, the International Supercomputing Conference (ISC), and others. Members of the group also present papers at SC and ISC and arrange annual “Birds of a Feather” sessions (informal meetings) to discuss recent developments in the field.

Moving Toward Common Approaches

In its nearly five years of existence, the working group’s most important achievements have included developing common metrics, measurement protocols, and guidelines for the supercomputer industry: for liquid cooling of supercomputers, for determining power usage effectiveness, and for measuring power during computational workloads.

Development of Guidelines for Liquid Cooling of Supercomputers

When vendors began producing liquid cooling systems, no standard thermal guidelines existed. By evaluating systems from the processor to the atmosphere, the EE HPC WG identified temperatures that could be supported, and developed a set of recommended temperatures that vendors could use to design equipment. The working group’s recommendations first appeared in an ASHRAE white paper, and are now in ASHRAE’s guidelines of recommended temperatures. Supercomputer vendors participated in this process throughout.

Development of New Metrics for Determining Power Usage Effectiveness

The Power Usage Effectiveness (PUE) metric has been used for years to determine how much of the power in a data center is consumed by the IT equipment (as opposed to other facility loads such as cooling and power distribution). However, this metric is not effective in determining the efficiency of computer equipment when the system’s cooling fans or power conversions are located outside of the computer itself. The working group developed two new metrics to help evaluate these situations: (1) ITUE (IT-power usage effectiveness), which is similar to PUE but focuses on energy use inside the computer equipment, and (2) TUE (total-power usage effectiveness), which combines PUE and ITUE to provide the ratio of total energy (including both internal and external support equipment) to the specific energy used for computing in the HPC. TUE can be used to compare one HPC system to another. The metrics were demonstrated on Oak Ridge National Laboratory’s (ORNL’s) Jaguar supercomputer system, and the working group plans to seek acceptance of the TUE metric through industry groups such as the Green Grid industry association.
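
A minimal numerical sketch of how the three metrics relate, following the energy accounting described above (total facility energy, total energy entering the IT equipment, and the energy reaching the compute components themselves); the energy totals are invented for illustration.

    # Illustrative PUE / ITUE / TUE calculation with invented energy totals (MWh).

    total_facility_energy = 12.0   # everything: cooling, power distribution, IT
    total_it_energy       = 10.0   # energy entering the IT (HPC) equipment
    compute_energy        = 8.5    # energy reaching processors, memory, etc.
                                   # (excludes internal fans and power conversion)

    pue  = total_facility_energy / total_it_energy   # facility-level overhead
    itue = total_it_energy / compute_energy          # overhead inside the IT equipment
    tue  = pue * itue                                # total energy per unit of compute energy

    print(f"PUE={pue:.2f}  ITUE={itue:.2f}  TUE={tue:.2f}")
    # TUE is equivalently total facility energy divided by compute energy:
    assert abs(tue - total_facility_energy / compute_energy) < 1e-9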

Development of a Standard Method to Measure Computational Output

Every year, a list known as the Green 500 ranks the 500 most energy-efficient supercomputers. The list helps supercomputer users and vendors identify the most efficient systems; however, because the methods used to determine efficiency have not been applied uniformly, the current comparisons are not as accurate as they could be. The EE HPC working group is developing a standard method that all users can follow to measure power consistently.

Commissioning Liquid-Cooled Supercomputer Systems

Commissioning of liquid-cooled supercomputers is a relatively new requirement. The working group decided to share best practices and develop a liquid cooling commissioning guideline to inform those who have not dealt with liquid-cooled systems. The subgroup working on this task includes vendors that provide liquid-cooled supercomputers and infrastructure cooling equipment.

Sharing Knowledge and Expertise

Because the working group’s member base is so dispersed and varied, its website is a key tool for keeping members informed. Webinars inform members of the subgroups’ progress, and the presentations from these webinars are archived on the site, along with the working group’s published papers. To expand the reach of the working group’s expertise, a member from Lawrence Livermore National Laboratory tracks conferences and other meetings where members could speak about new developments and receive feedback on various topics.

An Award-Winning Accomplishment

In 2013, members of the EE HPC WG won the Gauss Award, sponsored by the German Gauss Center for Supercomputing, for their paper, “TUE, a New Energy-Efficiency Metric Applied at ORNL’s Jaguar.” The award is presented for the most outstanding paper in the field of scalable supercomputing at the ISC conference held annually in Germany. Intel’s Mike Patterson, the primary author, presented the paper at the conference. Bill Tschudi and Henry Coles of Berkeley Lab’s Environmental Energy Technologies Division (EETD) were contributing authors. The paper described the TUE energy metric, including a description of its trial use at ORNL’s scientific computing center.


Oak Ridge National Laboratory’s Jaguar Supercomputer

Courtesy of Oak Ridge National Laboratory, U.S. Dept. of Energy

SC13 Workshop

The workgroup also shares its expertise through workshops. It will present the ‘Building’ Energy Efficient High Performance Computing Fourth Annual EE HPC WG Workshop at SC13 in Denver, Colorado, in November. This popular annual workshop will feature high-profile researchers discussing new developments in energy-efficient HPC from both the facilities and systems perspectives, from architecture through design and implementation.

Helping to Meet EISA Goals

By sharing information and developing common approaches, the workgroup is helping to reduce HPC energy use. For example, the Energy Independence and Security Act of 2007 (EISA) requires the U.S. federal government to reduce energy intensity in all its facilities, including laboratories and industrial buildings, by 30 percent by 2015. The work done by Berkeley Lab and EE HPC WG volunteers is helping federal facilities measure and quantify energy savings, as well as helping vendors design energy efficient supercomputer equipment. The growth in computing energy use makes this goal a challenge; however, the EE HPC WG is dramatically improving energy performance from its business-as-usual trajectory.

“Interest in the EE HPC working group continues to grow,” says Tschudi. “The original vision of what the group could accomplish continues to be fulfilled through collaboration with the best minds engaged in supercomputing. DOE’s leadership in encouraging and supporting this activity is providing energy savings and other benefits throughout DOE labs, as well as the industry at large.”

More Information

Bill Tschudi, (510) 495-2417, WFTschudi@lbl.gov

Energy Efficient High Performance Computing Working Group website:  http://eehpcwg.lbl.gov/home

‘Building’ Energy Efficient High Performance Computing Fourth Annual EE HPC WG Workshop:  http://eehpcwg.lbl.gov/conferences

Patterson, Michael K., Stephen W Poole, Chung-Hsing Hsu, Don Maxwell, William Tschudi, Henry Coles, David J Martinez, and Natalie Bates. 2013. “TUE, a New Energy-Efficiency Metric Applied at ORNL’s Jaguar.” http://eetd.lbl.gov/sites/all/files/isc13_tuepaper.pdf

No Evidence of Residential Property Value Impacts Near U.S. Wind Turbines, a New Berkeley Lab Study Finds

 Posted by Allan on August 27th, 2013

Technical contact: Ben Hoen (845) 758-1896, bhoen@lbl.gov

Lawrence Berkeley National Laboratory (Berkeley Lab) analyzed more than 50,000 home sales near 67 wind facilities in 27 counties across nine U.S. states, yet was unable to uncover any impacts on nearby home property values.

“This is the second of two major studies we have conducted on this topic [the first was published in 2009 – see below], and in both studies [using two different datasets] we find no statistical evidence that operating wind turbines have had any measurable impact on home sales prices,” says Ben Hoen, the lead author of the new report.

Hoen is a researcher in the Environmental Energy Technologies Division of Berkeley Lab.

The new study used a number of sophisticated techniques to control for other potential impacts on home prices, including collecting data that spanned well before the wind facilities’ development was announced to after they were constructed and operating. This allowed the researchers to control for any pre-existing differences in home sales prices across their sample and any changes that occurred due to the housing bubble.

This study, the most comprehensive to date, builds on both the previous Berkeley Lab study and a number of other academic and published U.S. studies, which also generally find no measurable impacts near operating turbines.

“Although there have been claims of significant property value impacts near operating wind turbines that regularly surface in the press or in local communities, strong evidence to support those claims has failed to materialize in all of the major U.S. studies conducted thus far,” says Hoen.  “Moreover, our findings comport with the large set of studies that have investigated other potentially similar disamenities, such as high-voltage transmission lines, landfills, and noisy roads, which suggest that widespread impacts from wind turbines would be either relatively small or non-existent.”

The report was authored by Ben Hoen (Berkeley Lab), Jason P. Brown (formerly USDA now Federal Reserve Bank of Kansas City), Thomas Jackson (Texas A & M and Real Property Analytics), Ryan Wiser (Berkeley Lab), Mark Thayer (San Diego State University) and Peter Cappers (Berkeley Lab). The research was supported by the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy.

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science.  For more, visit www.lbl.gov.

Additional Information:

Download the new 2013 report  “A Spatial Hedonic Analysis of the Effects of Wind Energy Facilities on Surrounding Property Values in the United States”

Download the 2009 LBNL Report “The Impact of Wind Power Projects on Residential Property Values in the United States: A Multi-Site Hedonic Analysis”

More information about DOE’s Wind Program

For more information on the report, contact Ben Hoen (bhoen@lbl.gov, 845-758-1896), or Ryan Wiser (RHWiser@lbl.gov, 510-486-54

A Q&A with Cindy Regnier, Manager of the Facility for Low Energy eXperiments in Buildings (FLEXLAB)

 Posted by Allan on August 26th, 2013

August 2013 Special Focus: FLEXLAB

The Facility for Low Energy eXperiments in Buildings (FLEXLAB) is designed to be a national focal point for developing, simulating and testing energy-efficient technologies and strategies for buildings. FLEXLAB users will conduct research and develop technologies on single components as well as whole-building integrated design and operation, aimed at substantially lowering the energy use and improving the comfort and performance of both new and existing buildings. FLEXLAB is a facility of Lawrence Berkeley National Laboratory’s Environmental Energy Technologies Division (EETD).

In the following Q&A, Cindy Regnier, FLEXLAB’s manager, discusses FLEXLAB’s capabilities, and how its users will be able to use the facility when it opens.

 

How is the construction of FLEXLAB going? When will it be ready for users?

Construction is going well. At this point, FLEXLAB is on time and on budget, and construction should be complete in early 2014, including the commissioning process. Following that, we will put the facility through a calibration process to determine testbed accuracies, begin testing the data acquisition system, and gather baseline data from its many sensors.

 

Who do you expect will be the primary users of FLEXLAB when it is completed? And what needs does FLEXLAB address for these users?

The diversity of users is broad—maybe broader than you think. FLEXLAB can address the energy efficiency needs of utilities, federal and state research programs, manufacturers, building owners and the AECO [architecture, engineering, construction and owner-operated] community.

Manufacturers of almost any type of building product or service are a natural user group for FLEXLAB, which can help extend the impact and market potential of products by developing integrated design solutions—such as automated shading coupled with dimmable lighting systems—that validate performance (for example, visual comfort) as well as energy savings.

FLEXLAB can also help manufacturers that have developed an emerging technology whose performance isn’t yet recognized in industry—for example, in codes or simulation tools. They need verified performance data and a means to extend results to the rest of industry.

We expect to work with the AECO community, too. The developer and AECO community is increasingly being asked to deliver guaranteed performance of building designs, whether for energy performance disclosure laws or for other energy efficiency-related purposes. The community currently only develops mockups for constructability, not verification of energy or comfort performance. Verification of a design’s energy and overall performance in FLEXLAB lowers risk for the construction of the facility, especially where there are unique combinations of low energy systems, or high-risk elements that might affect comfort and performance such as full height glazing.

AECO users will be able to specify and test innovative systems for their designs in one or more of FLEXLAB’s testbeds, and use feedback data from their operation to improve their designs. Building new energy-efficient buildings, or improving the energy performance of existing buildings in an investment portfolio enhances value. The AECO community will develop higher confidence in and reduce financial risk of new innovative design strategies with higher energy efficiency targets. This is a capability that can differentiate the truly innovative AECO firms in the marketplace.

Utilities need verified performance data for emerging technologies to increase certainty about their impact on energy use, as well as R&D in emerging areas of energy reduction strategies, such as whole-building integrated system performance, to meet their energy efficiency program goals.

How about the public sector?

For federal and state energy efficiency programs, R&D in FLEXLAB can help them determine the best technologies to reach aggressive energy savings goals, such as California’s goal of net zero energy buildings by 2030. To get there, whole building integrated solutions that optimize performance and are cost-effective are needed.

FLEXLAB is uniquely suited for integrated system development because its infrastructure allows users to measure the interactions between multiple systems.  In addition, FLEXLAB’s relationships with industry through our partnership program will allow for greater connections to demonstration and deployment opportunities, significantly increasing the impact and outreach of their R&D portfolios.

Policymakers and building code officials will find that they can utilize testing results from FLEXLAB to help guide the improvement of energy efficiency codes and standards for buildings. And of course, the buildings research community is interested in working with us to develop new building technologies, as well as building simulation tools.

 

What are some of the energy efficiency problems that FLEXLAB was designed to address?

One problem that is occupying many minds right now is successfully integrating HVAC, facades, shading, lighting systems and controls in a way that’s cost effective, and generates aggressive energy savings. FLEXLAB provides unique capabilities for testing in this area. The interior spaces are reconfigurable, so the user can create multiple zonal conditions, such as core and perimeter, for testing whole-building or zone energy savings.

Reconfigurable lighting systems allow you to test different lighting technologies and controls, and assess their impact on thermal loads and HVAC energy use, as well as measure the energy use and visual comfort impact of the lighting itself. Through reconfigurable glazing and shading systems, the user can measure the impact of different glazing technologies on convection, thermal loads, energy use and comfort.

The HVAC systems are also fully reconfigurable—we can provide full airside or hydronic side heating and cooling.  Each testbed also has radiant in-slab tubing with topping slabs of varying thicknesses to test different thermal mass and control strategies.  Overall we can provide everything from an older 1970s era HVAC system to displacement ventilation, radiant panels and other efficient alternatives.

The ability to mock up older systems and facades is important because it allows us to study cost-effective, energy-saving retrofit strategies.

Give some examples of integrating controls with operation.

Integrating building load control with the grid is an area that’s ripe for new technological solutions. For example, what are the optimal electric vehicle charging strategies when coupled with building loads that can reduce peak demand on the grid?  FLEXLAB will have networked charging stations nearby, for testing performance under real conditions.

Automated facades (such as motorized blinds, shades or electrochromic glass) coupled with daylight dimming are a major challenge for designers, because their control strategies vary and performance can be uncertain. The designer needs to optimize incoming sunlight for work surfaces, but minimize incoming solar heat gain, heat loss during the winter, and glare. FLEXLAB can provide quantified strategies for these controls scenarios.

In FLEXLAB, the user can control and measure the performance of every design element and operational strategy—room configuration and occupancy, type of shading system, automated (or manual) control strategy. One- and two-story testbeds will be available, along with a rotating testbed that can be used to position the testbed at different orientations with respect to the sun.  The two-story testbed will allow users to conduct skylight and clerestory studies, as well as tests that concern stacked floor conditions.

A FLEXLAB user can try different design and control strategies, test the performance of each, determine which system meets performance requirements the best, and improve on that system with further redesign and testing, which can be done with the testbed occupied or unoccupied.

 

EETD researchers have had a lot of prior experience researching these issues, correct?

Yes. Scientists here [in the Environmental Energy Technologies Division of Berkeley Lab] have conducted years of research addressing daylighting and automated control solutions. We worked with the New York Times Co. to help them develop an automated shading and daylight dimming system for their new headquarters building, testing potential technologies in a testbed we helped them develop in New York.

We’ll apply our years of experience in daylighting, demand response, automated controls and sensors, lighting systems, and other areas of building science to help FLEXLAB users design and execute tests that will help them solve their unique problems.

 

What makes FLEXLAB unique among building test facilities?

FLEXLAB is unique in having the ability to address the performance and optimization of integrated systems and technologies in buildings. No other facility can do this. Other existing testing facilities tend to focus on R&D around a specific technology. This limits their ability to address deeper energy saving opportunities that arise from integrating building systems to work together for maximum energy efficiency.

Savings from integrated design and operation will ultimately push buildings to net-zero energy territory. The additive savings from individual energy-efficient technologies just won’t achieve this level of performance.

Also, at FLEXLAB, we can look at other aspects of high-performance systems beyond energy efficiency, including thermal and visual comfort, and indoor environmental quality.

 

How can interested potential users learn more about FLEXLAB?

They can look at our website, FLEXLAB.lbl.gov, and they can email flexlab.info@lbl.gov to be put in touch with someone from the FLEXLAB team.

An award-winning technology that can boost the capacity of rechargeable lithium-ion batteries has just gotten even better

 Posted by Allan on August 13th, 2013

Berkeley Lab battery researcher Gao Liu and his team earned a coveted R&D 100 Award this year for their invention of an electrically conductive rubbery adhesive that can be mixed with particles of silicon to form a battery’s negative (–) electrode, or anode. Lithium-ion batteries whose anodes are built with this “conducting polymer binder” can have 30 percent more energy storage capacity than those with conventional anodes made with aggregated carbon particles. Now, by literally tinkering at the edges of this new polymer material, researchers have raised its performance another notch.

That was the promise of the technology, says Liu: “In addition to developing this binder, we developed a method to engineer and test ways to improve it. We are continuing to use those tools, and now we are approaching an ideal design.”

In a paper published this month in the on-line edition of the Journal of the American Chemical Society, Wanli Yang, a beamline physicist at Berkeley Lab, and Gao Liu, both lead authors, and their colleagues describe how they modified the original binder, which was already an excellent conductor of electrons, to boost its capacity to transport positively (+) charged lithium ions.  Because the flow of positive and negative charges in a battery is always balanced, the performance limits of the original polymer binder were determined by its less-than-ideal transport of lithium ions.

During a charging cycle, lithium ions are transported within the binder to the embedded silicon particles through the uptake of an electrolyte, which consists primarily of organic solvents filled with lithium ions in solution. By modifying the chemical structure of their original binder — adding “side chains” of ether molecules — they tripled its uptake of electrolyte solution.  As a result of the improved ion flow, the specific capacity of the silicon anode made with the new binder rose to 3,750 mAh/g from the 2,100 mAh/g achieved by the original version. That 80 percent improvement meets the theoretical limit of a silicon anode’s storage capacity. “It means we are using 100 percent of the silicon particles embedded in the conducting polymer binder,” says Liu. “That makes it pretty close to the ‘ideal’ binder.”
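
A quick check of the capacity arithmetic quoted above, with graphite’s widely cited theoretical capacity of roughly 372 mAh/g added for comparison; the graphite figure is standard reference data rather than a number from this article.

    # Checking the anode-capacity figures quoted above (all in mAh/g).

    original_binder_anode = 2100   # silicon anode with the original PFM binder
    improved_binder_anode = 3750   # silicon anode with the new binder
    graphite_theoretical  = 372    # conventional graphite anode, for comparison

    improvement = (improved_binder_anode - original_binder_anode) / original_binder_anode
    print(f"{improvement:.0%} improvement over the original binder")       # roughly 80%
    print(f"{improved_binder_anode / graphite_theoretical:.1f}x graphite")  # about 10x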


Figure: Schematic of an ideal binder system for high-capacity battery electrodes. The binder developed in this work features optimized electrical conductivity in a lithium environment, strong mechanical adhesion, ductility, and high electrolyte uptake. All of these optimized functionalities were integrated into one conductive polymer.

Liu’s original polymer binder, which he calls PFM or PFFOMB, was notable for its combined traits of adhesion, elasticity, and electrical conductivity. The adhesion made the polymer stick to particles of silicon, which is preferable to graphite as an anode material because it can store ten times as much charge as carbon. Conductivity was essential, because without it the binder would simply insulate the silicon. The elasticity was crucial, because silicon literally swells to four times its size when it draws in lithium ions during battery charging, and then shrinks back to its original volume upon discharge. After just a few charge/discharge cycles, this breath-like movement would break conventional binders, ruining the battery. PFM’s ability to accommodate this motion solved that problem, making higher-capacity silicon anodes for lithium ion batteries a practical alternative.

The improved binder, which the team calls PEFM, not only enhances lithium ion flow, it also maintains the elasticity and electron conductivity of the original; and as a bonus, the electrical traits of the added side chains actually improve the binder’s adhesion to the silicon particles. “An ideal binder system should provide inherent electronic conductivity, mechanical adhesion and flexibility, and sufficient electrolyte uptake to warrant high ionic conductivity,’’ says Liu. “The polymer we developed meets these challenges of an ideal binder system.”

Liu says his team will continue to fine-tune its conducting polymer binders. The next goal is to find materials that offer comparable performance at lower cost. Significant testing will be required to determine that batteries made with the new silicon composite anodes can last as long as those made with graphite. To fully meet the needs of the next generation of electric vehicles and plug-in hybrids, the improved anodes must be coupled to improved cathodes, separators, electrolytes, and other components to make the truly “ideal” lithium ion batteries of the future.

The work was funded by the Office of Vehicle Technologies of the U.S. Department of Energy, under the Batteries for Advanced Transportation Technologies (BATT) program and by a University of California Discovery Grant.

 

An Update on FLEXLAB Construction

 Posted by Allan on August 12th, 2013

August Special Focus: FLEXLAB


A unique new facility for the buildings industry is taking shape at Lawrence Berkeley National Laboratory (Berkeley Lab). The Facility for Low Energy eXperiments in Buildings (FLEXLAB) is designed to be a national focal point for developing, simulating and testing energy-efficient technologies and strategies for buildings. FLEXLAB users will conduct research and develop technologies on single components as well as whole-building integrated design and operation, aimed at substantially lowering the energy use and improving the comfort and efficiency of both new and existing buildings. FLEXLAB is a facility of Berkeley Lab’s Environmental Energy Technologies Division (EETD).

Construction of FLEXLAB began during the fall of 2012, and “it is coming along very well,” says FLEXLAB manager Cindy Regnier. “The project is on time and within budget. All of the testbed structural steel and most of the exterior walls and cladding are up. As of last weekend, we had energized and rotated the rotating testbed.” The construction teams are working on the roof as of this writing, and over the next couple of months, they will be installing mechanical systems. There have been 20,000 hours of construction with no safety incidents to date.

The FLEXLAB team is now planning the commissioning phase of the project so that they will be ready to go when the exterior structures are complete. Building commissioning is the process of checking the systems of a newly constructed building to ensure that they are operating according to design specification—commissioning has been shown by Berkeley Lab’s research to improve the energy efficiency and operational success of buildings. Following commissioning, there will be a period of calibrating the testbed models to operating conditions and defining testbed accuracy.

The interior Lighting and Plug Loads Testbed, built within an existing building at Berkeley Lab, has been complete since fall of 2012, and the first set of experiments, for a project being run by Philips North America, is in the final stages of analysis. The FLEXLAB team has completed the deployment of this testbed’s data acquisition system, which included custom scripting tools, channel configuration and data storage. The same system will be used in the exterior testbed.

“We should see the completion of the Virtual Design and Visualization Testbed by the end of the summer,” says Regnier. This lab will facilitate project design and development in a collaborative setting. It will be equipped with smartboards, which allow the facility’s users to work interactively by displaying and modifying content from their laptops directly to screens visible throughout the room.  This testbed will also be used to visualize the experiments ongoing in the testbeds.

Once the exterior testbed is complete and commissioned, says Regnier, “we are very much looking forward to having a ribbon-cutting ceremony early in 2014, and to working with R&D partners in the buildings industry, government agencies and the academic community.”

For more information about working with FLEXLAB, contact flexlab.info@lbl.gov.

FLEXLAB website: http://flexlab.lbl.gov/

Lighting and Plug Loads Testbed: http://eetd.lbl.gov/news/article/15326/testing-efficient-lighting-and-building-control-solutions-at-berkeley-lab-the-fir

FLEXLAB Industry Preview: http://eetd.lbl.gov/news/article/30459/berkeley-lab-hosts-industry-for-preview-of-first-phase-of-flexlab-a-new-laborator

The Installed Price of Solar Photovoltaic Systems in the U.S. Continues to Decline at a Rapid Pace

 Posted by Allan on August 12th, 2013

Technical Contacts: Galen Barbose (510) 495-2593, GLBarbose@lbl.gov; Ryan Wiser (510) 486-5474, RHWiser@lbl.gov

The installed price of solar photovoltaic (PV) power systems in the United States fell substantially in 2012 and through the first half of 2013, according to the latest edition of Tracking the Sun, an annual PV cost tracking report produced by the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab).

Installed prices for PV systems in 2012 fell by a range of roughly $0.30/W to $0.90/W, or 6 to 14 percent, from the prior year, depending on the size of the system.  “This marks the third year in a row of significant price reductions for PV systems in the U.S.,” explains Galen Barbose of Berkeley Lab’s Environmental Energy Technologies Division, one of the report’s co-authors.  Within the first six months of 2013, PV system prices in California fell by an additional 10 to 15 percent, and the report suggests that PV system price reductions in 2013 are on pace to match or exceed those seen in recent years.

The report indicates that the median installed price of PV systems completed in 2012 was $5.30 per Watt (W) for residential and small commercial systems smaller than 10 kilowatts (kW) in size and was $4.60/W for commercial systems of 100 kW or more in size.  Utility-scale systems installed in 2012 registered even lower prices, with prices for systems larger than 10,000 kW generally ranging from $2.50/W to $4.00/W.  The report also highlights the wide variability in PV system pricing, detailing the installed price differences that exist across states and across various types of PV applications and system configurations.

The market for solar PV systems in the United States has grown rapidly over the past decade.  This sixth edition in Berkeley Lab’s Tracking the Sun report series describes historical trends in the installed price of PV in the United States.  The report is based on data from more than 200,000 residential, commercial, and utility-scale PV systems installed between 1998 and 2012 across 29 states, representing roughly 72 percent of all grid-connected PV capacity installed in the United States.  The study is intended to provide policy makers and industry observers with a reliable and detailed set of historical benchmarks for tracking and understanding past trends in the installed price of PV.

Recent PV System Price Reductions Driven by Falling Hardware Costs, While “Soft” Costs Persist

According to the report, recent installed price reductions for PV systems are primarily attributable to steep reductions in the price of PV modules.  From 2008 to 2012, annual average module prices on the global market fell by $2.60/W, representing about 80 percent of the total decline in PV system prices over that period.
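
Taken together, those two figures imply the approximate total decline in system prices over the same period; the short calculation below simply backs that number out and adds nothing beyond the paragraph above.

    # Implied total installed-price decline, 2008-2012, from the figures above.

    module_price_decline = 2.60   # $/W decline in global average module prices
    share_of_total       = 0.80   # modules' approximate share of the total system-price decline

    implied_total_decline = module_price_decline / share_of_total
    print(f"Implied total system-price decline: about ${implied_total_decline:.2f}/W")  # ~$3.25/W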

Non-module costs – such as inverters, mounting hardware, and the various non-hardware or “soft” costs – have also fallen over the long-term, but have remained relatively flat in recent years.  As a result, they now represent a sizable fraction of the total installed price of PV systems.  This shift in the cost structure of PV systems has heightened the emphasis within the industry and among policymakers on reducing non-module costs.

The report specifically highlights soft costs – which include such things as marketing and customer acquisition, system design, installation labor, and the various costs associated with permitting and inspections – as the most promising target for further PV system price reductions.  “Soft costs are especially important from the perspective of public policy efforts,” Barbose notes.  “Unlike module prices, which are established based on global supply and demand, soft costs can be influenced more directly by local, state and national policies aimed at accelerating deployment and removing market barriers.”

Adds co-author Ryan Wiser, also of Berkeley Lab, “There simply are limits to how much further module prices can fall, and so it stands to reason that continued reductions in PV system prices will need to come primarily from the soft cost side.”

PV System Prices in the United States Higher than in Other Major Markets

The report compares PV system pricing in the United States to a number of other major international markets, and finds that U.S. prices are generally higher.  The differences are particularly stark in comparison to Germany, Italy, and Australia, where the price of small residential PV systems installed in 2012 was roughly 40 percent lower than in the United States.

The report attributes much of the difference in PV system pricing to soft costs, citing the fact that the cost of PV modules and other hardware is typically similar across countries.  “These international experiences suggest that deep near-term reductions in soft costs are attainable in the United States,” says report co-author Naïm Darghouth, also with Berkeley Lab.  He adds further that, “Reductions in soft costs may naturally accompany growth in market size, as we’ve seen in some of the largest markets such as Germany and Italy, though other factors are also clearly important.”

Price Declines for PV System Owners in 2012 Offset by Falling Incentives

Rebates and other forms of cash incentives for residential and commercial PV systems are offered by state agencies and utilities in many regions.  These incentives have declined significantly over time, falling by roughly 85 percent over the past decade.  Within the span of just 2011 to 2012, median cash incentives from state and utility programs fell by $0.40/W to $0.60/W, depending on PV system size.

States and utilities have reduced incentives both in response to, and to encourage further, installed price declines.  Cash incentives provided through state and utility programs have also fallen over time as other sources of financial support for PV projects – most notably, increases in federal tax incentives and the emergence of solar renewable energy certificate (or SREC) markets in a number of states – have become more widely available or lucrative.

Wide Variability in PV System Pricing Observed

The study also highlights the significant variability in PV system pricing.  For example, among PV systems less than 10 kW in size and completed in 2012, 20 percent of systems had an installed price less than $4.50/W while another 20 percent were priced above $6.50/W.

This variability is partly associated with differences in pricing across states, where the median installed price of PV systems less than 10 kW ranged from $3.90/W to $5.90/W in 2012.  The report points to an array of potential underlying drivers for these cross-state pricing differences, including market size, the size of incentives available and level of competition among installers, labor costs, customer characteristics, administrative and regulatory compliance costs, and sales tax exemptions for PV.

The report also examines the variation in PV system pricing across various types of applications and technologies, including: systems with microinverters vs. central inverters, systems with Chinese vs. non-Chinese modules, systems with varying module efficiencies, residential new construction vs. residential retrofit, building-integrated vs. rack-mounted systems, rooftop vs. ground-mounted systems, and tracking vs. fixed-tilt systems.

The report, Tracking the Sun VI: An Historical Summary of the Installed Price of Photovoltaics in the United States from 1998 to 2012, by Galen Barbose, Naïm Darghouth, Samantha Weaver, and Ryan Wiser, may be downloaded from: http://emp.lbl.gov/sites/all/files/lbnl-6350e.pdf.

A webinar presentation of key findings from the report will be conducted on Friday, August 16th at 11:00 am PDT.  Registration for the webinar is at:

https://cc.readytalk.com/cc/s/registrations/new?cid=k5ibcp60p1vb

The research was supported by funding from the U.S. Department of Energy’s Solar Energy Technologies Office of the Office of Energy Efficiency and Renewable Energy.