844 Matching Results


Telluric and D.C. Resistivity Techniques Applied to the Geophysical Investigation of Basin and Range Geothermal Systems, Part II: A Numerical Model Study of the Dipole-Dipole and Schlumberger Resistivity Methods

Description: This paper is a two-dimensional numerical model study and comparison of the polar dipole-dipole and Schlumberger resistivity arrays. A catalog of dipole-dipole and Schlumberger apparent resistivity pseudo-sections is presented. It is concluded that: for the Schlumberger array, data can be accurately interpreted only if the resistivity structure is horizontally layered, and conductive bodies having a depth of burial greater than their width are not observed; for the dipole-dipole array, complex anomaly patterns unrelated in appearance to the causative structure result from simple models, hence, a familiarity with model results is essential to interpretation of these data.
Date: June 1, 1977
Creator: Beyer, J.H.
Partner: UNT Libraries Government Documents Department

Building Commissioning: A Golden Opportunity for Reducing Energy Costs and Greenhouse-gas Emissions

Description: The aim of commissioning new buildings is to ensure that they deliver, if not exceed, the performance and energy savings promised by their design. When applied to existing buildings, commissioning identifies the almost inevitable 'drift' from where things should be and puts the building back on course. In both contexts, commissioning is a systematic, forensic approach to quality assurance, rather than a technology per se. Although commissioning has earned increased recognition in recent years - even a toehold in Wikipedia - it remains an enigmatic practice whose visibility severely lags its potential. Over the past decade, Lawrence Berkeley National Laboratory has built the world's largest compilation and meta-analysis of commissioning experience in commercial buildings. Since our last report (Mills et al. 2004) the database has grown from 224 to 643 buildings (all located in the United States, and spanning 26 states), from 30 to 100 million square feet of floorspace, and from $17 million to $43 million in commissioning expenditures. The recorded cases of new-construction commissioning took place in buildings representing $2.2 billion in total construction costs (up from $1.5 billion). The work of many more commissioning providers (37 versus 18) is represented in this study, as is more evidence of energy and peak-power savings as well as cost-effectiveness. We now translate these impacts into avoided greenhouse gases and provide new indicators of cost-effectiveness. We also draw attention to the specific challenges and opportunities for high-tech facilities such as labs, cleanrooms, data centers, and healthcare facilities. The results are compelling. We developed an array of benchmarks for characterizing project performance and cost-effectiveness. The median normalized cost to deliver commissioning was $0.30/ft2 for existing buildings and $1.16/ft2 for new construction (or 0.4% of the overall construction cost).
The commissioning projects for which data are available revealed over 10,000 energy-related problems, resulting ...
Date: July 16, 2009
Creator: Mills, Evan
Partner: UNT Libraries Government Documents Department

An Analysis of Energy Use on Community College Campuses

Description: This paper provides an analysis of energy use on community college campuses which justifies the introduction of a simple model for describing that energy use. The model is then applied to the data from 80 campuses to determine average values for the parameters of the model. The model can be used to measure the energy savings of conservation programs as well as the cost avoidance associated with those savings. Because the model explicitly takes into account variations in weather, it provides an essential tool for evaluating energy conservation programs.
Date: March 1, 1980
Creator: York, C. M.
Partner: UNT Libraries Government Documents Department
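The simple model described in this abstract is, in essence, a regression of metered energy use on weather (degree-days), whose fitted parameters then support savings and cost-avoidance estimates. A minimal sketch follows; the data, parameter values, and function names are illustrative assumptions, not taken from the paper:

```python
# Sketch of a weather-adjusted campus energy model: monthly energy use
# modeled as a baseload plus a term proportional to degree-days.
# All data and names below are made up for illustration.
import numpy as np

def fit_energy_model(degree_days, energy_use):
    """Fit E = baseload + slope * DD by ordinary least squares."""
    A = np.column_stack([np.ones_like(degree_days), degree_days])
    (baseload, slope), *_ = np.linalg.lstsq(A, energy_use, rcond=None)
    return baseload, slope

def avoided_cost(baseload, slope, degree_days, actual_use, price_per_kwh):
    """Savings = (weather-predicted use - actual metered use) * unit price."""
    predicted = baseload + slope * degree_days
    return (predicted - actual_use) * price_per_kwh

# Twelve synthetic months of degree-days and metered use (kWh).
dd = np.array([600, 550, 400, 250, 100, 30, 10, 20, 90, 260, 420, 580], float)
use = 50_000 + 40.0 * dd  # campus with 50 MWh baseload, 40 kWh per degree-day

base, slope = fit_energy_model(dd, use)
print(f"baseload = {base:.0f} kWh, slope = {slope:.1f} kWh per degree-day")
```

Because weather enters the model explicitly through the degree-day term, conservation savings can be compared across years with different weather, which is the property the abstract emphasizes.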


Description: Passage of the Toxic Substances Control Act (TSCA) last year emphasized the urgent need for the formulation of viable criteria and interim standards limiting the exposure of increasingly large segments of the U.S. population to environmental chemical toxicants. Unfortunately, current methods of developing these standards are both time-consuming and costly. The resulting need for a priori predictive techniques to assess the inherent potential of chemicals, such as the halocarbons found in chlorinated waters, for inducing adverse biological effects, has led to the use of a number of analytical methods designed primarily for screening large numbers of chemical compounds before they impose unacceptable environmental hazards, frequently of crisis proportions. Four of the techniques best adapted to dealing with the multifactorial environmental problems of chemical health effects will be briefly described: (1) quantitative structure/activity relationships (QSAR); (2) factor analysis (FA); (3) pattern recognition/artificial intelligence (PR/AI); and (4) molecular connectivity (MC). While it is clear that none provides easy answers, it would appear that the more recent areas of PR and MC both merit more intensive investigation as predictive tools. In particular, the relative simplicity of the MC approach and the possibility of substantially reducing the empirical component are attractive incentives for pursuing further work in this area.
Date: October 1, 1977
Creator: Kland, M.J.
Partner: UNT Libraries Government Documents Department


Description: This report presents the results of a preliminary study concerning the use of high speed turbomachinery in a solar-assisted Rankine cooling cycle. The use of Rankine cycles in solar powered cooling of buildings involves a solar collector to provide energy to heat and vaporize a working fluid. Energy is extracted from this vapor in an expansion engine that is used to drive an air conditioning vapor compressor. In a typical Rankine cycle, the maximum temperature at the inlet to the expander is limited to the temperature of the fluid leaving the solar collector. Because of the relatively low temperature capability associated with inexpensive collectors, the efficiency of the Rankine cycle is generally low. This low efficiency results in low coefficient of performance values for the solar powered cooling system. In an effort to improve Rankine cycle efficiency, a solar-assisted approach was presented in NSF report RA-N-75-012 by Dr. Henry Curran. In this approach solar energy vaporizes the working fluid at a low temperature using solar collectors; the vapor is subsequently superheated to a higher temperature using fossil fuel, thereby allowing the potential of much improved Rankine cycle efficiency as compared to a typical cycle. Water was selected as the working fluid to avoid problems associated with chemical stability of organic fluids at high values of expander inlet temperature. A maximum steam temperature of 1100 F was suggested to achieve Rankine cycle efficiencies on the order of 25%. The solar-assisted Rankine cycle was shown by Dr. Curran to offer a substantial cost advantage over conventional absorption machinery. Key elements in achieving this performance improvement are the efficiency of the Rankine cycle vapor expander and the performance of the vapor cycle. In Dr. Curran's study, an expander efficiency of 82% and an air conditioning COP{sub vc} of 4.0 were assumed. Additionally, no ...
Date: December 1, 1976
Creator: Leech, J.
Partner: UNT Libraries Government Documents Department

Analysis of the Relationship Between Vehicle Weight/Size and Safety, and Implications for Federal Fuel Economy Regulation

Description: This report analyzes the relationship between vehicle weight, size (wheelbase, track width, and their product, footprint), and safety, for individual vehicle makes and models. Vehicle weight and footprint are correlated with a correlation coefficient (R{sup 2}) of about 0.62. The relationship is stronger for cars (0.69) than for light trucks (0.42); light trucks include minivans, fullsize vans, truck-based SUVs, crossover SUVs, and pickup trucks. The correlation between wheelbase and track width, the components of footprint, is about 0.61 for all light vehicles, 0.62 for cars and 0.48 for light trucks. However, the footprint data used in this analysis do not vary for different versions of the same vehicle model, as curb weight does; the analysis could be improved with more precise data on footprint for different versions of the same vehicle model. Although US fatality risk to drivers (driver fatalities per million registered vehicles) decreases as vehicle footprint increases, there is very little correlation, whether for all light vehicles (0.01), cars (0.07), or trucks (0.11). The correlation between footprint and the fatality risks cars impose on drivers of other vehicles is also very low (0.01); for trucks the correlation is higher (0.30), with risk to others increasing as truck footprint increases. Fatality risks reported here do not account for differences in annual miles driven, driver age or gender, or crash location by vehicle type or model. It is difficult to account for these factors using data on national fatal crashes because the number of vehicles registered to, for instance, young males in urban areas is not readily available by vehicle type or model. State data on all police-reported crashes can be used to estimate casualty risks that account for miles driven, driver age and gender, and crash location. The number of vehicles involved in a crash can act as a ...
Date: March 2, 2010
Creator: Wenzel, Thomas P.
Partner: UNT Libraries Government Documents Department
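The correlation coefficients quoted in this abstract are squared Pearson correlations computed across vehicle models. A minimal illustration of the calculation; the six footprint/weight pairs below are invented for the example and are not the report's data:

```python
# Squared Pearson correlation (R^2) between curb weight and footprint
# across vehicle models. The data points are made up for illustration.
import numpy as np

def r_squared(x, y):
    """Squared Pearson correlation coefficient between two variables."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

footprint = np.array([41.0, 43.5, 45.2, 48.0, 53.5, 60.1])  # square feet
weight = np.array([2800, 3100, 3250, 3600, 4400, 5200])      # pounds

print(f"R^2 = {r_squared(footprint, weight):.2f}")
```

An R{sup 2} near 1 means footprint explains nearly all the variation in weight across these (synthetic) models; the report's value of about 0.62 indicates a substantial but far from deterministic relationship.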


Description: Planning a rational energy future requires anticipating the environmental consequences of various technologies. This is difficult to do with precision as the effects of pollutants are often determined by interactions between and among complex physical (abiotic) and biological (biotic) systems. A given pollutant may affect human beings through direct exposure or indirectly through inducing changes to biological systems which humans need to utilize. The concentration of a toxin in the food chain or the destruction of organisms necessary for the maintenance of high quality water are examples of indirect effects. Pollutants can be transformed and/or degraded as they establish residence in various components of an ecosystem. Anticipation and amelioration of pollutant effects involves the integration of a vast range of data. These data include: (1) physical and chemical characterization of the pollutant as it enters the environment; (2) determining effects on the various components (biotic and abiotic) within the context of the functioning ecosystem of interest; (3) transformation, movement, and/or degradation of the pollutant within that ecosystem and within specific organisms and physical components; and (4) determining a detailed biochemical and biological picture of the interactions of pollutants with particular organisms and/or their cellular components judged salient for various processes. The major programs described below are designed to answer parts of the above fundamental questions relevant to pollutants generated by energy related technologies. Their emphasis is on anticipating consequences to the biological components of various ecosystems. The work ranges from studies involving parts of a single cell (the membranes) to studies involving the whole ecosystem (in the pelagic zone of a lake). The programs take advantage of expertise and technical abilities present at LBL.
Two small exploratory projects which were of brief duration and not related to anticipating biological effects of pollutants are included in this section. They concern ...
Date: October 1, 1980
Creator: Authors, Various
Partner: UNT Libraries Government Documents Department

Comments on the Joint Proposed Rulemaking to Establish Light-Duty Vehicle Greenhouse Gas Emission Standards and Corporate Average Fuel Economy Standards

Description: I appreciate the opportunity to provide comments on the joint rulemaking to establish greenhouse gas emission and fuel economy standards for light-duty vehicles. My comments are directed at the choice of vehicle footprint as the attribute by which to vary fuel economy and greenhouse gas emission standards, in the interest of protecting vehicle occupants from death or serious injury. I have made several of these points before when commenting on previous NHTSA rulemakings regarding CAFE standards and safety. The comments today are mine alone, and do not necessarily represent the views of the US Department of Energy, Lawrence Berkeley National Laboratory, or the University of California. My comments can be summarized as follows: (1) My updated analysis of casualty risk finds that, after accounting for drivers and crash location, there is a wide range in casualty risk for vehicles with the same weight or footprint. This suggests that reducing vehicle weight or footprint will not necessarily result in increased fatalities or serious injuries. (2) Indeed, the recent safety record of crossover SUVs indicates that weight reduction in this class of vehicles resulted in a reduction in fatality risks. (3) Computer crash simulations can pinpoint the effect of specific design changes on vehicle safety; these analyses are preferable to regression analyses, which rely on historical vehicle designs, and cannot fully isolate the effect of specific design changes, such as weight reduction, on crash outcomes. (4) There is evidence that automakers planned to build more large light trucks in response to the footprint-based light truck CAFE standards. Such an increase in the number of large light trucks on the road may decrease, rather than increase, overall safety.
Date: October 27, 2009
Creator: Wenzel, Thomas P
Partner: UNT Libraries Government Documents Department

Telluric and D.C. Resistivity Techniques Applied to the Geophysical Investigation of Basin and Range Geothermal Systems, Part III: The Analysis of Data From Grass Valley, Nevada

Description: This paper contains a detailed interpretation of E-field ratio telluric, bipole-dipole resistivity mapping, and dipole-dipole resistivity data obtained in the course of geophysical exploration of the Leach Hot Springs area of Grass Valley, Nevada. Several areas are singled out as being worthy of further investigation of their geothermal potential. Comparison of the three electrical exploration techniques indicates that: the bipole-dipole resistivity mapping method is the least useful; the dipole-dipole resistivity method can be very useful, but is, for practical purposes, exceptionally expensive and difficult to interpret; the E-field ratio telluric method can be a highly successful reconnaissance technique for delineating structures and relating the resistivities of different regions within the survey area.
Date: June 1, 1977
Creator: Beyer, J.H.
Partner: UNT Libraries Government Documents Department

Driving Demand for Home Energy Improvements: Motivating residential customers to invest in comprehensive upgrades that eliminate energy waste, avoid high utility bills, and spur the economy

Description: Policy makers and program designers in the U.S. and abroad are deeply concerned with the question of how to scale up energy efficiency to a level that is commensurate both with the scale of the energy and climate challenges we face, and with the potential for energy savings that has been touted for decades. When policy makers ask what energy efficiency can do, the answers usually revolve around the technical and economic potential of energy efficiency - they rarely home in on the element of energy demand that matters most for changing energy usage in existing homes: the consumer. A growing literature is concerned with the behavioral underpinnings of energy consumption. We examine a narrower, related subject: How can millions of Americans be persuaded to divert valued time and resources into upgrading their homes to eliminate energy waste, avoid high utility bills, and spur the economy? With hundreds of millions of public dollars flowing into incentives, workforce training, and other initiatives to support comprehensive home energy improvements, it makes sense to review the history of these programs and begin gleaning best practices for encouraging comprehensive home energy improvements. Looking across 30 years of energy efficiency programs that targeted the residential market, many of the same issues that confronted past program administrators are relevant today: How do we cost-effectively motivate customers to take action? Who can we partner with to increase program participation? How do we get residential efficiency programs to scale? While there is no proven formula - and only limited success to date with reliably motivating large numbers of Americans to invest in comprehensive home energy improvements, especially if they are being asked to pay for a majority of the improvement costs - there is a rich and varied history of experiences that new programs can draw upon. Our ...
Date: September 20, 2010
Creator: Fuller, Merrian C.
Partner: UNT Libraries Government Documents Department

Home Network Technologies and Automating Demand Response

Description: Over the past several years, interest in large-scale control of peak energy demand and total consumption has increased. While motivated by a number of factors, this interest has been spurred primarily, on the demand side, by the increasing cost of energy and, on the supply side, by the limited ability of utilities to build sufficient electricity generation capacity to meet unrestrained future demand. To address peak electricity use, Demand Response (DR) systems are being proposed to motivate reductions in electricity use through the use of price incentives. DR systems are also being designed to shift or curtail energy demand at critical times when the generation, transmission, and distribution systems (i.e. the 'grid') are threatened with instabilities. To be effectively deployed on a large scale, these proposed DR systems need to be automated. Automation will require robust and efficient data communications infrastructures across geographically dispersed markets. The present availability of widespread Internet connectivity and inexpensive, reliable computing hardware, combined with the growing confidence in the capabilities of distributed, application-level communications protocols, suggests that now is the time for designing and deploying practical systems. Centralized computer systems that are capable of providing continuous signals to automate customers' reduction of power demand are known as Demand Response Automation Servers (DRAS). The deployment of prototype DRAS systems has already begun - with most initial deployments targeting large commercial and industrial (C & I) customers. An examination of the current overall energy consumption by economic sector shows that the C & I market is responsible for roughly half of all energy consumption in the US. On a per customer basis, large C & I customers clearly have the most to offer - and to gain - by participating in DR programs to reduce peak demand.
And, by concentrating on a small number of relatively sophisticated energy ...
Date: December 1, 2009
Creator: McParland, Charles
Partner: UNT Libraries Government Documents Department

Methods for Analyzing Electric Load Shape and its Variability

Description: Current methods of summarizing and analyzing electric load shape are discussed briefly and compared. Simple rules of thumb for graphical display of load shapes are suggested. We propose a set of parameters that quantitatively describe the load shape in many buildings. Using the example of a linear regression model to predict load shape from time and temperature, we show how quantities such as the load's sensitivity to outdoor temperature, and the effectiveness of demand response (DR), can be quantified. Examples are presented using real building data.
Date: May 12, 2010
Creator: Price, Philip
Partner: UNT Libraries Government Documents Department
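The regression model mentioned in this abstract (predicting load from time and temperature) can be sketched as an hour-of-day profile plus a shared temperature slope, where the fitted slope is the load's weather sensitivity. Everything below is synthetic and illustrative, not the report's actual model or data:

```python
# Sketch of a load-shape regression: hourly load modeled as an
# hour-of-day profile plus a linear outdoor-temperature term.
# The fitted temperature coefficient quantifies weather sensitivity.
import numpy as np

def fit_load_shape(hour, temp, load):
    """Fit load = profile[hour] + beta * temp by ordinary least squares."""
    X = np.zeros((len(load), 25))
    X[np.arange(len(load)), hour] = 1.0  # one indicator column per hour of day
    X[:, 24] = temp                      # shared outdoor-temperature slope
    coef, *_ = np.linalg.lstsq(X, load, rcond=None)
    return coef[:24], coef[24]           # hourly profile, temperature sensitivity

# Two synthetic days of hourly data.
rng = np.random.default_rng(0)
hours = np.tile(np.arange(24), 2)
temps = 60 + 15 * np.sin((hours - 6) * np.pi / 12) + rng.normal(0, 1, 48)
true_profile = 100 + 50 * ((hours >= 8) & (hours <= 18))  # occupied-hours bump
loads = true_profile + 2.0 * temps                        # 2 kW per degree, by construction

profile, temp_sensitivity = fit_load_shape(hours, temps, loads)
print(f"temperature sensitivity ~ {temp_sensitivity:.2f} kW per degree")
```

With the temperature term separated out, the remaining hourly profile describes the building's schedule-driven load shape, which is the kind of decomposition useful for judging demand-response effectiveness.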

Improved Modeling and Understanding of Diffusion-Media Wettability on Polymer-Electrolyte-Fuel-Cell Performance

Description: A macroscopic-modeling methodology to account for the chemical and structural properties of fuel-cell diffusion media is developed. A previous model is updated to include for the first time the use of experimentally measured capillary pressure -- saturation relationships through the introduction of a Gaussian contact-angle distribution into the property equations. The updated model is used to simulate various limiting-case scenarios of water and gas transport in fuel-cell diffusion media. Analysis of these results demonstrates that interfacial conditions are more important than bulk transport in these layers, where the associated mass-transfer resistance is the result of higher capillary pressures at the boundaries and the steepness of the capillary pressure -- saturation relationship. The model is also used to examine the impact of a microporous layer, showing that it dominates the response of the overall diffusion medium. In addition, its primary mass-transfer-related effect is suggested to be limiting the water-injection sites into the more porous gas-diffusion layer.
Date: March 5, 2010
Creator: Weber, Adam
Partner: UNT Libraries Government Documents Department
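One way to picture the Gaussian contact-angle distribution this abstract introduces: the fraction of the pore network that is hydrophilic (contact angle below 90 degrees) is simply a normal CDF evaluated at 90. The mean and spread below are illustrative guesses, not values from the paper:

```python
# Illustrative sketch of a Gaussian contact-angle distribution of the
# kind used to parameterize capillary pressure -- saturation curves in
# mixed-wettability diffusion media. Parameter values are made up.
import math

def normal_cdf(x, mu, sigma):
    """Cumulative probability of a Normal(mu, sigma) variate at x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def hydrophilic_fraction(mu_deg=92.0, sigma_deg=8.0):
    """Fraction of the pore surface with contact angle below 90 degrees,
    i.e. the hydrophilic portion of a mixed-wettability medium."""
    return normal_cdf(90.0, mu_deg, sigma_deg)

print(f"hydrophilic fraction ~ {hydrophilic_fraction():.3f}")
```

Splitting the medium into hydrophilic and hydrophobic fractions this way is what lets a single measured capillary pressure -- saturation curve be represented with a smooth distribution of contact angles rather than one fixed angle.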

Financial Innovation Among the Community Wind Sector in the United States

Description: In the relatively brief history of utility-scale wind generation, the 'community wind' sector - defined here as consisting of relatively small utility-scale wind power projects that are at least partly owned by one or more members of the local community - has played a vitally important role as a 'test bed' or 'proving ground' for wind turbine manufacturers. In the 1980s and 1990s, for example, Vestas and other now-established European wind turbine manufacturers relied heavily on community wind projects in Scandinavia and Germany to install - and essentially field-test - new turbine designs. The fact that orders from community wind projects seldom exceeded more than a few turbines at a time enabled the manufacturers to correct any design flaws or manufacturing defects fairly rapidly, and without the risk of extensive (and expensive) serial defects that can accompany larger orders. Community wind has been slower to take root in the United States - the first such projects were installed in the state of Minnesota around the year 2000. Just as in Europe, however, the community wind sector in the U.S. has similarly served as a proving ground - but in this case for up-and-coming wind turbine manufacturers that are trying to break into the broader U.S. wind power market. For example, community wind projects have deployed the first U.S. installations of wind turbines from Suzlon (2003), DeWind (2008), Americas Wind Energy (2008) and later Emergya Wind Technologies (2010), Goldwind (2009), AAER/Pioneer (2009), Nordic Windpower (2010), Unison (2010), and Alstom (2011). Just as it has provided a proving ground for new turbines, so too has the community wind sector in the United States served as a laboratory for experimentation with innovative new financing structures. For example, a variation of one of the most common financing arrangements in the U.S. wind market ...
Date: January 19, 2011
Creator: Bolinger, Mark
Partner: UNT Libraries Government Documents Department

Max Tech and Beyond: Fluorescent Lamps

Description: Fluorescent lamps are the most widely used artificial light source today, responsible for approximately 70% of the lumens delivered to our living spaces globally. The technology was originally commercialized in the 1930s, and manufacturers have been steadily improving the efficacy of these lamps over the years through modifications to the phosphors, cathodes, fill-gas, operating frequency, tube diameter and other design attributes. The most efficient commercially available fluorescent lamp is the 25 Watt T5 lamp. This lamp operates at 114-116 lumens per watt while also providing good color rendering and more than 20,000 hours of operating life. Industry experts interviewed indicated that while this lamp is the most efficient on the market today, a further 10 to 14% improvement in efficacy may be introduced to the market over the next 2 to 5 years. These improvements include further developments in phosphors, fill-gas, cathode coatings and ultraviolet (UV) reflective glass coatings. The commercialization of these technology improvements will combine to push the technology up to a maximum of 125 to 130 lumens per watt. One critical issue raised by researchers that may present a barrier to the realization of these improvements is that technology investment in fluorescent lamps is being reduced in order to prioritize research into light emitting diodes (LEDs) and ceramic metal halide high intensity discharge (HID) lamps. Thus, it is uncertain whether these potential efficacy improvements will be developed, patented and commercialized. The emphasis for premium efficacy will continue to focus on T5 lamps, which are expected to continue to be marketed along with the T8 lamp. Industry experts highlighted that an advantage of the T5 lamp is that it is 40% smaller and yet provides an equivalent lumen output to that of a ...
Date: April 1, 2012
Creator: Scholand, Michael
Partner: UNT Libraries Government Documents Department

Max Tech and Beyond: High-Intensity Discharge Lamps

Description: High-intensity discharge (HID) lamps are most often found in industrial and commercial applications, and are the light source of choice in street and area lighting, and sports stadium illumination. HID lamps are produced in three types - mercury vapor (MV), high pressure sodium (HPS) and metal halide (MH). Of these, MV and MH are considered white-light sources (although MV exhibits poor color rendering) and HPS produces a yellow-orange color light. A fourth lamp, low-pressure sodium (LPS), is not an HID lamp by definition, but it is used in similar applications and thus is often grouped with HID lamps. With the notable exception of MV, which is comparatively inefficient and in decline in the US from both a sales and installed-stock point of view, HPS, LPS and MH all have efficacies over 100 lumens per watt. The figure below presents the efficacy trends over time for commercially available HID lamps and LPS, starting with MV and LPS in the 1930s, followed by the development of HPS and MH in the 1960s. In HID lamps, light is generated by creating an electric arc between two electrodes in an arc tube. The particles in the arc are partially ionized, making them electrically conductive, and a light-emitting 'plasma' is created. This arc occurs within the arc tube, which for most HID lamps is enclosed within an evacuated outer bulb that thermally isolates and protects the hot arc tube from the surroundings. Unlike a fluorescent lamp that produces visible light by down-converting UV light with phosphors, the arc itself is the light source in an HID lamp, emitting visible radiation that is characteristic of the elements present in the plasma. Thus, the mixture of elements included in the arc tube is one critical factor determining the quality of the light emitted from the lamp, ...
Date: April 1, 2012
Creator: Scholand, Michael
Partner: UNT Libraries Government Documents Department

Measuring Advances in HVAC Distribution System Design

Description: Substantial commercial building energy savings have been achieved by improving the performance of the HVAC distribution system. The energy savings result from distribution system design improvements, advanced control capabilities, and use of variable-speed motors. Yet, much of the commercial building stock remains equipped with inefficient systems. Contributing to this is the absence of a definition for distribution system efficiency, as well as of analysis methods for quantifying performance. This research investigates the application of performance indices to assess design advancements in commercial building thermal distribution systems. The index definitions are based on a first and second law of thermodynamics analysis of the system. The second law, or availability, analysis enables the determination of the true efficiency of the system. Availability analysis is a convenient way to make system efficiency comparisons since performance is evaluated relative to an ideal process. A TRNSYS simulation model is developed to analyze the performance of two distribution system types, a constant air volume system and a variable air volume system, that serve one floor of a large office building. Performance indices are calculated using the simulation results to compare the performance of the two system types in several locations. Changes in index values are compared to changes in plant energy, costs, and carbon emissions to explore the ability of the indices to estimate these quantities.
Date: May 1, 1998
Creator: Franconi, E.
Partner: UNT Libraries Government Documents Department

Commissioning: A Highly Cost-Effective Building Energy Management Strategy

Description: Quality assurance and optimization are essential elements of any serious technological endeavor, including efforts to improve energy efficiency. Commissioning is an important tool in this respect. The aim of commissioning new buildings is to ensure that they deliver, if not exceed, the performance and energy savings promised by their design. When applied to existing buildings, one-time or repeated commissioning (often called retrocommissioning) identifies the almost inevitable drift in energy performance and puts the building back on course, often surpassing the original design intent. In both contexts, commissioning is a systematic, forensic approach to improving performance, rather than a discrete technology.
Date: January 6, 2011
Creator: Mills, Evan
Partner: UNT Libraries Government Documents Department


Description: The research reported in this volume was undertaken during FY 1979 within the Energy & Environment Division of the Lawrence Berkeley Laboratory. This volume will comprise a section of the Energy & Environment Division 1979 Annual Report, to be published in the summer of 1980. The work reported relates to: thermal performance of building envelopes; building ventilation and indoor air quality; a computer program for predicting energy use in buildings; a study focused specifically on inherently energy-intensive hospital buildings; energy-efficient windows and lighting; the potential for energy conservation and savings in the buildings sector; and evaluation of energy performance standards for residential buildings.
Date: December 1, 1979
Creator: Authors, Various
Partner: UNT Libraries Government Documents Department


Description: One of the important aspects of America's painful adjustment to energy realities since 1973 has been an overwhelming effort to look carefully at how we use energy. Much to our surprise there was tremendous slack in energy use at home even before the Oil Embargo, slack that could have been eliminated profitably. One suggestion that there was waste in our economy came from careful inspection of energy use elsewhere. But early discussion of energy use in other lands has been marred by many distortions and misunderstandings, not only on the part of those who tend to doubt the potential for energy conservation but even among conservation's strongest supporters. This misunderstanding arises from comparisons of energy use and gross national product, two quantities that have charmed correlators and energy statisticians for decades. Though serious work cannot be based upon relationships between two such aggregated quantities, it is useful to review some of the popular myths surrounding energy comparisons among countries.
Date: November 1, 1978
Creator: Schipper, L.
Partner: UNT Libraries Government Documents Department

Effect of Environmental Factors on Sulfur Gas Emissions from Drywall

Description: Problem drywall installed in U.S. homes is suspected of being a source of odorous and potentially corrosive indoor pollutants. The U.S. Consumer Product Safety Commission's (CPSC) investigation of problem drywall incorporates three parallel tracks: (1) evaluating the relationship between the drywall and reported health symptoms; (2) evaluating the relationship between the drywall and electrical and fire safety issues in affected homes; and (3) tracing the origin and the distribution of the drywall. To assess the potential impact on human health and to support testing for electrical and fire safety, the CPSC has initiated a series of laboratory tests that provide elemental characterization of drywall, characterization of chemical emissions, and in-home air sampling. The chemical emission testing was conducted at Lawrence Berkeley National Laboratory (LBNL). The LBNL study consisted of two phases. In Phase 1 of this study, LBNL tested thirty drywall samples provided by CPSC and reported standard emission factors for volatile organic compounds (VOCs), aldehydes, reactive sulfur gases (RSGs) and volatile sulfur compounds (VSCs). The standard emission factors were determined using small (10.75 liter) dynamic test chambers housed in a constant-temperature environmental chamber. The tests were all run at 25 °C, 50% relative humidity (RH) and with an area-specific ventilation rate of approximately 1.5 cubic meters per square meter of emitting surface per hour [m³/m²/h]. The thirty samples that were tested in Phase 1 included seventeen that were manufactured in China in 2005, 2006 and 2009, and thirteen that were manufactured in North America in 2009. The measured emission factors for VOCs and aldehydes were generally low and did not differ significantly between the Chinese and North American drywall. 
Eight of the samples tested had elevated emissions of volatile sulfur-containing compounds, with total RSG emission factors between 32 and 258 micrograms per square meter per hour [µg/m²/h] ...
Date: August 20, 2011
Creator: Maddalena, Randy
Partner: UNT Libraries Government Documents Department
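The abstract above reports area-specific emission factors derived from steady-state chamber tests. A minimal sketch of the underlying mass-balance relation, assuming a well-mixed chamber at steady state (the function name and example numbers are illustrative, not taken from the LBNL report):

```python
def emission_factor(c_ss_ug_m3, q_m3_m2_h):
    """Area-specific emission factor [ug/m^2/h] from a well-mixed,
    steady-state chamber test.

    c_ss_ug_m3  -- steady-state chamber concentration [ug/m^3]
    q_m3_m2_h   -- area-specific ventilation rate [m^3 per m^2 of
                   emitting surface per hour], ~1.5 in the study above
    """
    # At steady state, mass emitted per unit area equals mass removed
    # by ventilation per unit area: E = C_ss * q.
    return c_ss_ug_m3 * q_m3_m2_h

# Hypothetical example: 100 ug/m^3 measured at q = 1.5 m^3/m^2/h
print(emission_factor(100.0, 1.5))  # 150.0 ug/m^2/h
```

At the study's reported range (32 to 258 µg/m²/h), this relation implies steady-state chamber concentrations on the order of tens to a couple hundred µg/m³.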

Analysis of Casualty Risk per Police-Reported Crash for Model Year 2000 to 2004 Vehicles, using Crash Data from Five States

Description: In this report we compare two measures of driver risk: fatality risk per vehicle registration-year, and casualty (fatality plus serious injury) risk per police-reported crash. Our analysis is based on three sets of data from five states (Florida, Illinois, Maryland, Missouri, and Pennsylvania): data on all police-reported crashes involving model year 2000 to 2004 vehicles; 2005 county-level vehicle registration data by vehicle model year and make/model; and odometer readings from vehicle emission inspection and maintenance (I/M) programs conducted in urban areas of four of the five states (Florida does not have an I/M program). The two measures of risk could differ for three reasons: casualty risk is different from fatality risk; risk per vehicle registration-year is different from risk per crash; and risks estimated from national data are different from risks estimated from the five states analyzed here. We also examined the effect of driver behavior, crash location, and general vehicle design on risk, as well as sources of potential bias in using the crash data from five states.
Date: March 20, 2011
Creator: Wenzel, Tom
Partner: UNT Libraries Government Documents Department
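The two risk measures contrasted in the abstract above have different numerators and different exposure denominators, which is why they can rank vehicles differently. A minimal sketch of the two definitions, with entirely hypothetical counts (the report's actual estimation involves state-level crash and registration data, not these toy numbers):

```python
def fatality_risk_per_registration_year(fatalities, registration_years):
    """Fatalities normalized by exposure measured as vehicle
    registration-years (how much the vehicle is on the road)."""
    return fatalities / registration_years

def casualty_risk_per_crash(fatalities, serious_injuries, crashes):
    """Casualties (fatalities plus serious injuries) normalized by
    exposure measured as police-reported crashes (given a crash
    occurred, how likely a severe outcome was)."""
    return (fatalities + serious_injuries) / crashes

# Hypothetical tallies for one make/model:
print(fatality_risk_per_registration_year(120, 1_000_000))  # 0.00012
print(casualty_risk_per_crash(120, 880, 50_000))            # 0.02
```

Because the first measure folds in how often a vehicle crashes at all, while the second conditions on a crash having occurred, a vehicle driven cautiously but unforgiving in a crash can look safe on the first measure and risky on the second.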

Austin's Home Performance with Energy Star Program: Making a Compelling Offer to a Financial Institution Partner

Description: Since its launch in 2006, Austin Energy's Home Performance with Energy Star (HPwES) program has completed over 8,700 residential energy upgrades. The program's lending partner, Velocity Credit Union (VCU), has originated almost 1,800 loans, totaling approximately $12.5 million. Residential energy efficiency loans are typically small, and expensive to originate and service relative to larger financing products. National lenders have been hesitant to deliver attractive loan products to this small, but growing, residential market. In response, energy efficiency programs have found ways to partner with local and regional banks, credit unions, community development finance institutions (CDFIs) and co-ops to deliver energy efficiency financing to homeowners. VCU's experience with the Austin Energy HPwES program highlights the potential benefits of energy efficiency programs to a lending partner.
Date: March 18, 2011
Creator: Zimring, Mark
Partner: UNT Libraries Government Documents Department

Co-Simulation of Building Energy and Control Systems with the Building Controls Virtual Test Bed

Description: This article describes the implementation of the Building Controls Virtual Test Bed (BCVTB). The BCVTB is a software environment that allows connecting different simulation programs to exchange data during time integration, and that allows conducting hardware-in-the-loop simulation. The software architecture is a modular design based on Ptolemy II, a software environment for the design and analysis of heterogeneous systems. Ptolemy II provides a graphical model-building environment, synchronizes the exchanged data, and visualizes the system evolution during run-time. The BCVTB provides additions to Ptolemy II that allow the run-time coupling of different simulation programs for data exchange, including EnergyPlus, MATLAB, Simulink and the Modelica modelling and simulation environment Dymola. The additions also allow executing system commands, such as a script that executes a Radiance simulation. In this article, the software architecture is presented and the mathematical model used to implement the co-simulation is discussed. The simulation program interface that the BCVTB provides is explained. The article concludes by presenting applications in which different state-of-the-art simulation programs are linked for run-time data exchange. This link allows the simulation program best suited to each part of the problem to be used: one program models building heat transfer, another HVAC system dynamics and control algorithms, and a solution to the coupled problem is computed by co-simulation.
Date: August 22, 2010
Creator: Wetter, Michael
Partner: UNT Libraries Government Documents Department
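The co-simulation pattern the BCVTB abstract describes, separate programs advancing in lockstep and exchanging state at each synchronization point, can be sketched with two toy Python functions standing in for the coupled simulators. Everything here (the room model, the controller, the coefficients) is an illustrative assumption; the real BCVTB couples external programs such as EnergyPlus and MATLAB through Ptolemy II rather than in-process functions:

```python
def room_model(t_room, q_heat, dt):
    """Toy building simulator: room temperature drifts toward a 15 C
    ambient and rises with heating power (arbitrary coefficients)."""
    return t_room + dt * (-0.1 * (t_room - 15.0) + 0.05 * q_heat)

def controller(t_room, setpoint=21.0, gain=50.0):
    """Toy control simulator: proportional heating, never negative."""
    return max(0.0, gain * (setpoint - t_room))

dt = 0.25                  # fixed synchronization step [h]
t_room, q_heat = 18.0, 0.0
for step in range(96):     # one "day" of 15-minute exchange points
    # Data exchange: controller reads the building state, the
    # building advances with the controller's output.
    q_heat = controller(t_room)
    t_room = room_model(t_room, q_heat, dt)

print(round(t_room, 2))    # settles just below the 21 C setpoint
```

The fixed exchange interval is the simplest synchronization scheme; the article's mathematical model addresses exactly this question of how the coupled programs agree on time steps and exchanged values.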