Search Results


Quiet planting in the locked constraint satisfaction problems

Description: We study the planted ensemble of locked constraint satisfaction problems and describe the connection between the random and planted ensembles. Our analysis combines the cavity method with arguments from reconstruction on trees and first- and second-moment considerations; in particular, the connection with reconstruction on trees appears to be crucial. Our main result is the location of the hard region in the planted ensemble, thus providing hard satisfiable benchmarks. In part of that hard region, instances have with high probability a single satisfying assignment.
Date: January 1, 2009
Creator: Zdeborova, Lenka & Krzakala, Florent
Partner: UNT Libraries Government Documents Department
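The planted construction described in the abstract above can be illustrated with a short sketch (hypothetical code, and for ordinary 3-SAT rather than a locked CSP): fix a hidden assignment, then admit only random clauses that the assignment satisfies, so the instance is satisfiable by construction.

```python
import random

def planted_3sat(n_vars, n_clauses, seed=0):
    """Generate a planted 3-SAT instance: draw a hidden assignment,
    then keep only random clauses that the assignment satisfies."""
    rng = random.Random(seed)
    hidden = [rng.choice([False, True]) for _ in range(n_vars)]
    clauses = []
    while len(clauses) < n_clauses:
        vars_ = rng.sample(range(n_vars), 3)
        clause = [(v, rng.choice([False, True])) for v in vars_]  # (variable, negated?)
        # literal (v, neg) is true under `hidden` iff hidden[v] != neg
        if any(hidden[v] != neg for v, neg in clause):
            clauses.append(clause)
    return hidden, clauses

def satisfies(assignment, clauses):
    return all(any(assignment[v] != neg for v, neg in c) for c in clauses)

hidden, clauses = planted_3sat(20, 80)
assert satisfies(hidden, clauses)  # the planted assignment satisfies every clause
```

The "quiet" aspect studied in the paper concerns when such planted instances are statistically indistinguishable from purely random ones; this sketch only shows the planting step itself.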

Hospital Energy Benchmarking Guidance - Version 1.0

Description: This document describes an energy benchmarking framework for hospitals. The document is organized as follows. The introduction provides a brief primer on benchmarking and its application to hospitals. The next two sections discuss special considerations including the identification of normalizing factors. The presentation of metrics is preceded by a description of the overall framework and the rationale for the grouping of metrics. Following the presentation of metrics, a high-level protocol is provided. The next section presents draft benchmarks for some metrics; benchmarks are not available for many metrics owing to a lack of data. This document ends with a list of research needs for further development.
Date: September 8, 2009
Creator: Singer, Brett C.
Partner: UNT Libraries Government Documents Department
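As a minimal illustration of the kind of metric such a benchmarking framework builds on, the sketch below computes whole-building energy use intensity (EUI), the most common whole-building benchmark; the hospital figures are hypothetical round numbers, not values from the document.

```python
def energy_use_intensity(annual_kbtu, floor_area_sqft):
    """Whole-building energy use intensity (EUI): annual site energy
    per unit floor area, in kBtu per square foot per year."""
    return annual_kbtu / floor_area_sqft

# Hypothetical hospital: 40 million kBtu/yr over 180,000 sq ft.
eui = energy_use_intensity(40e6, 180_000)  # kBtu/sqft-yr, about 222
```

Normalizing factors of the kind the document discusses (climate, bed count, operating hours) would further adjust this raw figure before comparison across facilities.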

Simulations of RF capture with barrier bucket in booster at injection

Description: As part of the effort to increase the number of ions per bunch in RHIC, a new scheme for RF capture of EBIS ions in Booster at injection has been developed. The scheme was proposed by M. Blaskiewicz and J.M. Brennan. It employs a barrier bucket to hold a half turn of beam in place during capture into two adjacent harmonic 4 buckets. After acceleration, this allows for 8 transfers of 2 bunches from Booster into 16 buckets on the AGS injection porch. During the Fall of 2011 the necessary hardware was developed and implemented by the RF and Controls groups. The scheme is presently being commissioned by K.L. Zeno with Au32+ ions from EBIS. In this note we carry out simulations of the RF capture. These are meant to serve as benchmarks for what can be achieved in practice. They also allow for an estimate of the longitudinal emittance of the bunches on the AGS injection porch.
Date: January 23, 2012
Creator: Gardner, C.J.
Partner: UNT Libraries Government Documents Department

Action-Oriented Benchmarking: Using the CEUS Database to Benchmark Commercial Buildings in California

Description: The 2006 Commercial End Use Survey (CEUS) database developed by the California Energy Commission is a far richer source of energy end-use data for non-residential buildings than has previously been available and opens the possibility of creating new and more powerful energy benchmarking processes and tools. In this article--Part 2 of a two-part series--we describe the methodology and selected results from an action-oriented benchmarking approach using the new CEUS database. This approach goes beyond whole-building energy benchmarking to more advanced end-use and component-level benchmarking that enables users to identify and prioritize specific energy efficiency opportunities - an improvement on benchmarking tools typically in use today.
Date: February 1, 2008
Creator: Mathew, Paul; Mills, Evan; Bourassa, Norman & Brook, Martha
Partner: UNT Libraries Government Documents Department

Using SPARK as a Solver for Modelica

Description: Modelica is an object-oriented acausal modeling language that is well positioned to become a de-facto standard for expressing models of complex physical systems. To simulate a model expressed in Modelica, it needs to be translated into executable code. For generating run-time efficient code, such a translation needs to employ algebraic formula manipulations. As the SPARK solver has been shown to be competitive for generating such code but currently cannot be used with the Modelica language, we report in this paper how SPARK's symbolic and numerical algorithms can be implemented in OpenModelica, an open-source implementation of a Modelica modeling and simulation environment. We also report benchmark results that show that for our air flow network simulation benchmark, the SPARK solver is competitive with Dymola, which is believed to provide the best solver for Modelica.
Date: June 30, 2008
Creator: Wetter, Michael; Haves, Philip; Moshier, Michael A. & Sowell, Edward F.
Partner: UNT Libraries Government Documents Department

Toxicological Benchmarks for Screening Potential Contaminants of Concern for Effects on Soil and Litter Invertebrates and Heterotrophic Process

Description: This report presents a standard method for deriving benchmarks for the purpose of "contaminant screening," performed by comparing measured ambient concentrations of chemicals with the benchmarks. The work was performed under Work Breakdown Structure (Activity Data Sheet 8304). In addition, this report presents sets of data concerning the effects of chemicals in soil on invertebrates and soil microbial processes, benchmarks for chemicals potentially associated with United States Department of Energy sites, and literature describing the experiments from which data were drawn for benchmark derivation.
Date: January 1, 1994
Creator: Will, M. E.
Partner: UNT Libraries Government Documents Department

Application of Global Optimization to the Estimation of Surface-Consistent Residual Statics

Description: Since the objective function that is used to estimate surface-consistent residual statics can have many local maxima, a global optimization method is required to find the optimum values for the residual statics. As reported in several recent papers, we had developed a new method (TRUST) for solving global optimization problems and had demonstrated that it was superior to all competing methods on a standard set of nonconvex benchmark problems. The residual statics problem can be very large, with hundreds or thousands of parameters, and large global optimization problems are much harder to solve than small ones. To solve the very challenging residual statics problem, we have made several significant advances in the mathematical description of the problem (derivation of two novel stack power bounds and disaggregation of the original problem into a large number of small problems). Using the enhanced version of TRUST, we have performed extensive simulations on a realistic sample problem that had been artificially created with large static disruptions. Our simulations have demonstrated that TRUST can reach many plausible distinct "solutions" that could not be discovered by more conventional approaches. An unexpected result was that high values of the stack power may eliminate cycle skips.
Date: October 1999
Creator: Reister, D. B.; Oblow, E. M.; Barhen, J. & DuBose, J. B.
Partner: UNT Libraries Government Documents Department
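The stack-power objective at the heart of the residual-statics problem can be sketched as follows. This is a simplified, hypothetical version with one static per trace rather than the surface-consistent source/receiver decomposition the paper treats; it only shows why correct statics maximize the objective.

```python
import math

def stack_power(traces, shifts):
    """Stack traces after applying integer time shifts (circularly),
    then return the power (sum of squares) of the stack -- the
    quantity maximized in residual-statics estimation."""
    n = len(traces[0])
    stack = [0.0] * n
    for trace, s in zip(traces, shifts):
        for i in range(n):
            stack[i] += trace[(i - s) % n]  # shift trace by s samples
    return sum(v * v for v in stack)

# Toy example: identical Gaussian wavelets displaced by known statics.
n = 128
wavelet = [math.exp(-0.5 * ((i - 64) / 4.0) ** 2) for i in range(n)]
true_shifts = [3, -5, 0, 7]
traces = [[wavelet[(i + s) % n] for i in range(n)] for s in true_shifts]

aligned = stack_power(traces, true_shifts)      # corrective shifts undo the statics
misaligned = stack_power(traces, [0, 0, 0, 0])  # uncorrected traces stack incoherently
assert aligned > misaligned
```

The many local maxima mentioned in the abstract arise because shifting a trace by a full wavelength (a "cycle skip") also produces coherent-looking stacks; a global method such as TRUST is needed to avoid settling on one of those.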

Architecture independent performance characterization and benchmarking for scientific applications

Description: A simple, tunable, synthetic benchmark with performance directly related to applications would be of great benefit to the scientific computing community. In this paper, we present a novel approach to developing such a benchmark. The initial focus of this project is on the data access performance of scientific applications. First, a hardware-independent characterization of code performance in terms of address streams is developed. The parameters chosen to characterize a single address stream are related to regularity, size, spatial locality, and temporal locality. These parameters are then used to implement a synthetic benchmark program that mimics the performance of a corresponding code. To test the validity of our approach, we performed experiments using five test kernels on six different platforms. The performance of most of our test kernels can be approximated by a single synthetic address stream. However, in some cases overlapping two address streams is necessary to achieve a good approximation.
Date: August 31, 2004
Creator: Strohmaier, Erich & Shan, Hongzhang
Partner: UNT Libraries Government Documents Department
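A toy version of the parameterized address-stream idea can be sketched as follows. The parameter names here are hypothetical simplifications; the paper's actual characterization uses regularity, size, and spatial/temporal locality of measured streams.

```python
import random

def synthetic_stream(length, stride, p_sequential, working_set, seed=0):
    """Generate a synthetic address stream: with probability
    `p_sequential` continue a strided (spatially local) walk,
    otherwise jump back to a recently used address (temporal
    locality within a bounded working set)."""
    rng = random.Random(seed)
    addrs, cur = [], 0
    for _ in range(length):
        if addrs and rng.random() > p_sequential:
            cur = rng.choice(addrs[-working_set:])  # revisit a recent address
        else:
            cur += stride                           # strided sequential access
        addrs.append(cur)
    return addrs

stream = synthetic_stream(10_000, stride=8, p_sequential=0.9, working_set=64)
reuse_fraction = 1 - len(set(stream)) / len(stream)  # crude temporal-locality measure
```

Sweeping parameters like `stride` and `p_sequential`, and timing real loads replayed from such streams, is the kind of tunable mimicry the benchmark approach relies on.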

Spirit Leveling in South Carolina: Part 1. Northern South Carolina, 1896-1938

Description: From introduction: This bulletin, which is published in two parts, contains the complete results of all spirit leveling done in South Carolina by the Geological Survey of the United States Department of the Interior, including those heretofore published. The 34th parallel of latitude, passing through Columbia, serves to divide the State into two sections, each of which is represented by one of the parts of the bulletin. Part 1 deals with the section lying north of the 34th parallel, designated as northern South Carolina, and part 2 deals with the section lying south of that parallel, designated as southern South Carolina.
Date: 1939
Creator: Staack, J. G.
Partner: UNT Libraries Government Documents Department

Spirit Leveling in South Carolina: Part 2. Southern South Carolina, 1896-1938

Description: From introduction: This bulletin, which is published in two parts, contains the complete results of all spirit leveling done in South Carolina by the Geological Survey of the United States Department of the Interior, including those heretofore published. The 34th parallel of latitude, passing through Columbia, serves to divide the State into two sections, each of which is represented by one of the parts of the bulletin. Part 1 deals with the section lying north of the 34th parallel, designated as northern South Carolina, and part 2 deals with the section lying south of that parallel, designated as southern South Carolina. In each part descriptions of the points for which figures of elevation have been determined are listed according to the quadrangles in which the points occur, and the quadrangles are arranged in alphabetic order.
Date: 1940
Creator: Staack, J. G.
Partner: UNT Libraries Government Documents Department

The International Reactor Physics Experiment Evaluation Project (IRPhEP)

Description: Since the beginning of the nuclear power industry, numerous experiments concerned with nuclear energy and technology have been performed at research laboratories worldwide. These experiments required a large investment in infrastructure, expertise, and cost; however, many were performed without a high degree of attention to archiving results for future use, and the degree and quality of documentation varies greatly. There is an urgent need to preserve integral reactor physics experimental data, including measurement methods, techniques, and separate or special effects data for nuclear energy and technology applications, and the knowledge and competence contained therein. If the data are compromised, it is unlikely that any of these experiments will be repeated in the future. The International Reactor Physics Experiment Evaluation Project (IRPhEP) was initiated as a pilot activity in 1999 by the Organization for Economic Cooperation and Development (OECD) Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) and was endorsed as an official activity of the NSC in June of 2003. The purpose of the IRPhEP is to provide an extensively peer-reviewed set of reactor physics related integral benchmark data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. A short history of the IRPhEP is presented and its purposes are discussed in this paper. Accomplishments of the IRPhEP, including the first publication of the IRPhEP Handbook, are highlighted, and the future of the project is outlined.
Date: September 1, 2006
Creator: Briggs, J. Blair; Sartori, Enrico & Scott, Lori
Partner: UNT Libraries Government Documents Department

Benchmark Data Through The International Reactor Physics Experiment Evaluation Project (IRPhEP)

Description: The International Reactor Physics Experiments Evaluation Project (IRPhEP) was initiated by the Organization for Economic Cooperation and Development (OECD) Nuclear Energy Agency's (NEA) Nuclear Science Committee (NSC) in June of 2002. The IRPhEP focuses on the derivation of internationally peer-reviewed benchmark models for several types of integral measurements, in addition to the critical configuration. While the benchmarks produced by the IRPhEP are of primary interest to the Reactor Physics Community, many of the benchmarks can be of significant value to the Criticality Safety and Nuclear Data Communities. Benchmarks that support the Next Generation Nuclear Plant (NGNP), for example, also support fuel manufacture, handling, transportation, and storage activities and could challenge current analytical methods. The IRPhEP is patterned after the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and is closely coordinated with it. This paper highlights the benchmarks currently being prepared by the IRPhEP that are also of interest to the Criticality Safety Community. The different types of measurements and associated benchmarks that can be expected in the first publication and beyond are described. The protocol for inclusion of IRPhEP benchmarks as ICSBEP benchmarks, and vice versa, is detailed. The format for IRPhEP benchmark evaluations is described as an extension of the ICSBEP format. Benchmarks produced by the IRPhEP add a new dimension to criticality safety benchmarking efforts and expand the collection of available integral benchmarks for nuclear data testing. The first publication of the "International Handbook of Evaluated Reactor Physics Benchmark Experiments" is scheduled for January of 2006.
Date: September 1, 2005
Creator: Briggs, J. Blair & Sartori, Dr. Enrico
Partner: UNT Libraries Government Documents Department


Description: Interest in high-quality integral benchmark data is increasing as efforts to quantify and reduce calculational uncertainties accelerate to meet the demands of next generation reactor and advanced fuel cycle concepts. The International Reactor Physics Experiment Evaluation Project (IRPhEP) and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) continue to expand their efforts and broaden their scope to identify, evaluate, and provide integral benchmark data for method and data validation. Benchmark model specifications provided by these two projects are used heavily by the international reactor physics, nuclear data, and criticality safety communities. Thus far, 14 countries have contributed to the IRPhEP, and 20 have contributed to the ICSBEP. The status of the IRPhEP and ICSBEP is discussed in this paper, and the future of the two projects is outlined. Selected benchmarks that have been added to the IRPhEP and ICSBEP handbooks since PHYSOR’06 are highlighted.
Date: September 1, 2008
Creator: Briggs, J. Blair; Scott, Lori; Sartori, Enrico & Rugama, Yolanda
Partner: UNT Libraries Government Documents Department

Can Deployment of Renewable Energy and Energy Efficiency Put Downward Pressure on Natural Gas Prices?

Description: High and volatile natural gas prices have increasingly led to calls for investments in renewable energy and energy efficiency. One line of argument is that deployment of these resources may lead to reductions in the demand for and price of natural gas. Many recent U.S.-based modeling studies have demonstrated that this effect could provide significant consumer savings. In this article we evaluate these studies, and benchmark their findings against economic theory, other modeling results, and a limited empirical literature. We find that many uncertainties remain regarding the absolute magnitude of this effect, and that the reduction in natural gas prices may not represent an increase in aggregate economic wealth. Nonetheless, we conclude that many of the studies of the impact of renewable energy and energy efficiency on natural gas prices appear to have represented this effect within reason, given current knowledge. These studies specifically suggest that a 1% reduction in U.S. natural gas demand could lead to long-term average wellhead price reductions of 0.8% to 2%, and that each megawatt-hour of renewable energy and energy efficiency may benefit natural gas consumers to the tune of at least $7.50 to $20.
Date: June 1, 2005
Creator: Wiser, Ryan & Bolinger, Mark
Partner: UNT Libraries Government Documents Department
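The figures quoted in the abstract support a simple back-of-envelope calculation of aggregate consumer savings. The baseline price and consumption below are hypothetical round numbers chosen for illustration, not values from the article.

```python
def consumer_savings(demand_reduction_pct, price_response_per_pct,
                     baseline_price, annual_consumption):
    """Annual consumer savings implied by a demand-induced price drop:
    (fractional price drop) * baseline price * remaining consumption."""
    price_drop = demand_reduction_pct * price_response_per_pct / 100.0
    remaining = annual_consumption * (1 - demand_reduction_pct / 100.0)
    return price_drop * baseline_price * remaining

# Hypothetical inputs: $6/MMBtu wellhead price, 22 billion MMBtu/yr U.S.
# consumption, a 1% demand cut, and the 0.8%-2% per-percent price
# response range quoted in the abstract.
low = consumer_savings(1.0, 0.8, 6.0, 22e9)   # low end of quoted range
high = consumer_savings(1.0, 2.0, 6.0, 22e9)  # high end of quoted range
```

Under these assumed inputs the savings land in the low billions of dollars per year, which is the order of magnitude that motivates the studies the article reviews.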

Particle-in-Cell Calculations of the Electron Cloud in the ILC Positron Damping Ring Wigglers

Description: The self-consistent code suite WARP-POSINST is being used to study electron cloud effects in the ILC positron damping ring wiggler. WARP is a parallelized, 3D particle-in-cell code which is fully self-consistent for all species. The POSINST models for the production of photoelectrons and secondary electrons are used to calculate electron creation. Mesh refinement and a moving reference frame for the calculation will be used to reduce the computer time needed by several orders of magnitude. We present preliminary results for cloud buildup showing 3D electron effects at the nulls of the vertical wiggler field. First results from a benchmark of WARP-POSINST vs. POSINST are also discussed.
Date: July 1, 2007
Creator: Celata, C.M.; Furman, M.A.; Vay, J.-L. & Grote, D.P.
Partner: UNT Libraries Government Documents Department

Technology and Cost of the Model Year (MY) 2007 Toyota Camry HEV Final Report

Description: The Oak Ridge National Laboratory (ORNL) provides research and development (R&D) support to the Department of Energy on issues related to the cost and performance of hybrid vehicles. ORNL frequently benchmarks its own research against commercially available hybrid components currently used in the market. In 2005 we completed a detailed review of the cost of the second generation Prius hybrid. This study examines the new 2007 Camry hybrid model for changes in technology and cost relative to the Prius. The work effort involved a detailed review of the Camry hybrid and the system control strategy to identify the hybrid components used in the drive train. Section 2 provides this review, while Section 3 presents our detailed evaluation of the specific drive train components and their cost estimates. Section 3 also provides a summary of the total electrical drive train cost for the Camry hybrid vehicle and contrasts these estimates with the costs for the second generation Prius that we estimated in 2005. Most of the information on cost and performance was derived from meetings with the technical staff of Toyota, Nissan, and some key Tier I suppliers like Hitachi and Panasonic Electric Vehicle Energy (PEVE), and we thank these companies for their kind cooperation.
Date: September 30, 2007
Partner: UNT Libraries Government Documents Department

Long-range and head-on beam-beam compensation studies in RHIC with lessons for the LHC

Description: Long-range as well as head-on beam-beam effects are expected to limit LHC performance with design parameters, and they are also an important consideration for the LHC upgrades. To mitigate long-range effects, current-carrying wires parallel to the beam were proposed. Two such wires are installed in RHIC, where they allow studying strong long-range beam-beam effects as well as the compensation of a single long-range interaction. The tests provide benchmark data for simulations and analytical treatments. Electron lenses were proposed for both RHIC and the LHC to reduce the head-on beam-beam effect. We present the experimental long-range beam-beam program at RHIC and report on head-on compensation studies based on simulations.
Date: January 12, 2009
Creator: Fischer, W.; Luo, Y.; Abreu, N.; Calaga, R.; Montag, C.; Robert-Demolaize, G. et al.
Partner: UNT Libraries Government Documents Department

What Scientific Applications can Benefit from Hardware Transactional Memory?

Description: Achieving efficient and correct synchronization of multiple threads is a difficult and error-prone task at small scale and, as we march towards extreme-scale computing, will be even more challenging when the resulting application is supposed to utilize millions of cores efficiently. Transactional Memory (TM) is a promising technique to ease the burden on the programmer, but it has only recently become available on commercial hardware in the new Blue Gene/Q system, and hence its real benefit for realistic applications has not yet been studied. This paper presents the first performance results of TM embedded into OpenMP on a prototype BG/Q system and characterizes code properties that will likely lead to benefits when augmented with TM primitives. First, we study the influence of thread count, environment variables, and memory layout on TM performance and identify code properties that will yield performance gains with TM. Second, we evaluate the combination of OpenMP with multiple synchronization primitives on top of MPI to determine suitable task-to-thread ratios per node. Finally, we condense our findings into a set of best practices. These are applied to a Monte Carlo Benchmark (MCB) and a Smoothed Particle Hydrodynamics method. In both cases an optimized TM version, executed with 64 threads on one node, outperforms a simple TM implementation; MCB with optimized TM yields a speedup of 27.45 over baseline.
Date: June 4, 2012
Creator: Schindewolf, M; Bihari, B; Gyllenhaal, J; Schulz, M; Wang, A & Karl, W
Partner: UNT Libraries Government Documents Department

AutomaDeD: Automata-Based Debugging for Dissimilar Parallel Tasks

Description: Today's largest systems have over 100,000 cores, with million-core systems expected over the next few years. This growing scale makes debugging the applications that run on them a daunting challenge. Few debugging tools perform well at this scale and most provide an overload of information about the entire job. Developers need tools that quickly direct them to the root cause of the problem. This paper presents AutomaDeD, a tool that identifies which tasks of a large-scale application first manifest a bug at a specific code region at a specific point during program execution. AutomaDeD creates a statistical model of the application's control-flow and timing behavior that organizes tasks into groups and identifies deviations from normal execution, thus significantly reducing debugging effort. In addition to a case study in which AutomaDeD locates a bug that occurred during development of MVAPICH, we evaluate AutomaDeD on a range of bugs injected into the NAS parallel benchmarks. Our results demonstrate that AutomaDeD detects the time period when a bug first manifested itself with 90% accuracy for stalls and hangs and 70% accuracy for interference faults. It identifies the subset of processes first affected by the fault with 80% and 70% accuracy, respectively, and the code region where the fault first manifested with 90% and 50% accuracy, respectively.
Date: March 23, 2010
Creator: Bronevetsky, G; Laguna, I; Bagchi, S; de Supinski, B R; Ahn, D & Schulz, M
Partner: UNT Libraries Government Documents Department
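The grouping-and-deviation idea behind AutomaDeD can be caricatured as flagging the task whose timing profile lies farthest from the group mean. AutomaDeD actually fits semi-Markov models of control flow and timing; this distance-based stand-in is only a sketch with made-up numbers.

```python
import math

def flag_outlier_task(profiles):
    """Given per-task timing profiles (time spent in each code region),
    return the index of the task farthest from the mean profile -- a
    simplified stand-in for AutomaDeD's deviation detection."""
    n, dims = len(profiles), len(profiles[0])
    mean = [sum(p[d] for p in profiles) / n for d in range(dims)]
    def dist(p):
        return math.sqrt(sum((p[d] - mean[d]) ** 2 for d in range(dims)))
    return max(range(n), key=lambda i: dist(profiles[i]))

# Three healthy tasks and one stalled in code region 1 (hypothetical data):
profiles = [[1.0, 2.0, 1.1],
            [1.1, 2.1, 1.0],
            [0.9, 9.5, 1.0],   # task 2 hangs in region 1
            [1.0, 1.9, 1.2]]
assert flag_outlier_task(profiles) == 2
```

In the real tool, reporting which task deviated first, and in which code region, is what narrows a job-wide failure down to a debuggable root cause.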

A New Stabilized Nodal Integration Approach

Description: A new stabilized nodal integration scheme is proposed and implemented. In this work, focus is on the natural neighbor meshless interpolation schemes. The approach is a modification of the stabilized conforming nodal integration (SCNI) scheme and is shown to perform well in several benchmark problems.
Date: February 8, 2006
Creator: Puso, M; Zywicz, E & Chen, J S
Partner: UNT Libraries Government Documents Department

GMG: A Guaranteed, Efficient Global Optimization Algorithm for Remote Sensing.

Description: The monocular passive ranging (MPR) problem in remote sensing consists of identifying the precise range of an airborne target (missile, plane, etc.) from its observed radiance. This inverse problem may be set as a global optimization problem (GOP) whereby the difference between the observed and model predicted radiances is minimized over the possible ranges and atmospheric conditions. Using additional information about the error function between the predicted and observed radiances of the target, we developed GMG, a new algorithm to find the Global Minimum with a Guarantee. The new algorithm transforms the original continuous GOP into a discrete search problem, thereby guaranteeing to find the position of the global minimum in a reasonably short time. The algorithm is first applied to the golf course problem, which serves as a litmus test for its performance in the presence of both complete and degraded additional information. GMG is further assessed on a set of standard benchmark functions and then applied to various realizations of the MPR problem.
Date: August 18, 2004
Creator: D'Helon, CD
Partner: UNT Libraries Government Documents Department
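One standard way to turn a continuous global optimization problem into a discrete search with a guarantee, as the abstract describes GMG doing, is via a Lipschitz bound: if the objective has Lipschitz constant L, a grid of spacing h = 2*tol/L contains a point within tol of the global minimum value. The sketch below is a generic 1-D illustration under that assumption, not the GMG algorithm itself.

```python
import math

def grid_global_min(f, lo, hi, lipschitz, tol):
    """Guaranteed global minimization of a Lipschitz-continuous 1-D
    function: evaluating f on a grid of spacing h = 2*tol/L ensures
    the best grid point's value is within `tol` of the true global
    minimum, because f cannot vary by more than L*h/2 between a grid
    point and the nearest continuous point."""
    h = 2 * tol / lipschitz
    n = int((hi - lo) / h) + 2
    xs = [lo + i * (hi - lo) / (n - 1) for i in range(n)]
    return min(xs, key=f)

# Example: multimodal objective with |f'(x)| <= 5 + 2 = 7 <= 8 on [0, 4].
f = lambda x: math.sin(5 * x) + 0.5 * (x - 2) ** 2
x_star = grid_global_min(f, 0.0, 4.0, lipschitz=8.0, tol=1e-3)
```

The cost of the guarantee is the grid size, which grows with L and shrinks with tol; making such discrete searches affordable on realistic error surfaces is where additional problem information, as in GMG, earns its keep.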

An Improved Linear Tetrahedral Element for Plasticity

Description: A stabilized, nodally integrated linear tetrahedral is formulated and analyzed. It is well known that linear tetrahedral elements perform poorly in problems with plasticity, nearly incompressible materials, and acute bending. For a variety of reasons, linear tetrahedral elements are preferable to quadratic tetrahedral elements in most nonlinear problems. Whereas mixed methods work well for linear hexahedral elements, they do not for linear tetrahedra. On the other hand, automatic mesh generation is typically not feasible for building many 3D hexahedral meshes. A stabilized, nodally integrated linear tetrahedral is developed and shown to perform very well in problems with plasticity, nearly incompressible materials, and acute bending. Furthermore, the formulation is analytically and numerically shown to be stable and optimally convergent. The element is demonstrated to perform well in several standard linear and nonlinear benchmarks.
Date: April 25, 2005
Creator: Puso, M
Partner: UNT Libraries Government Documents Department