
Hospital Energy Benchmarking Guidance - Version 1.0

Description: This document describes an energy benchmarking framework for hospitals. The document is organized as follows. The introduction provides a brief primer on benchmarking and its application to hospitals. The next two sections discuss special considerations including the identification of normalizing factors. The presentation of metrics is preceded by a description of the overall framework and the rationale for the grouping of metrics. Following the presentation of metrics, a high-level protocol is provided. The next section presents draft benchmarks for some metrics; benchmarks are not available for many metrics owing to a lack of data. This document ends with a list of research needs for further development.
Date: September 8, 2009
Creator: Singer, Brett C.
Partner: UNT Libraries Government Documents Department

Holographic Protection of Chronology in Universes of the Gödel Type

Description: We analyze the structure of supersymmetric Gödel-like cosmological solutions of string theory. Just as the original four-dimensional Gödel universe, these solutions represent rotating, topologically trivial cosmologies with a homogeneous metric and closed timelike curves. First we focus on "phenomenological" aspects of holography, and identify the preferred holographic screens associated with inertial comoving observers in Gödel universes. We find that holography can serve as a chronology protection agency: The closed timelike curves are either hidden behind the holographic screen, or broken by it into causal pieces. In fact, holography in Gödel universes has many features in common with de Sitter space, suggesting that Gödel universes could represent a supersymmetric laboratory for addressing the conceptual puzzles of de Sitter holography. Then we initiate the investigation of "microscopic" aspects of holography of Gödel universes in string theory. We show that Gödel universes are T-dual to pp-waves, and use this fact to generate new Gödel-like solutions of string and M-theory by T-dualizing known supersymmetric pp-wave solutions.
Date: December 7, 2002
Creator: Boyda, Edward; Ganguli, Surya; Horava, Petr & Varadarajan, Uday
Partner: UNT Libraries Government Documents Department

Topological Landscapes: A Terrain Metaphor for Scientific Data

Description: Scientific visualization and illustration tools are designed to help people understand the structure and complexity of scientific data with images that are as informative and intuitive as possible. In this context, the use of metaphors plays an important role, since they make complex information easily accessible by using commonly known concepts. In this paper we propose a new metaphor, called 'Topological Landscapes', which facilitates understanding the topological structure of scalar functions. The basic idea is to construct a terrain with the same topology as a given dataset and to display the terrain as an easily understood representation of the actual input data. In this projection from an n-dimensional scalar function to a two-dimensional (2D) model we preserve function values of critical points, the persistence (function span) of topological features, and one possible additional metric property (in our examples volume). By displaying this topologically equivalent landscape together with the original data we harness the natural human proficiency in understanding terrain topography and make complex topological information easily accessible.
Date: August 1, 2007
Creator: Weber, Gunther H.; Bremer, Peer-Timo & Pascucci, Valerio
Partner: UNT Libraries Government Documents Department

Metrics Evolution in an Energy Research & Development Program

Description: All technology programs progress through three phases: Discovery, Definition, and Deployment. The form and application of program metrics needs to evolve with each phase. During the discovery phase, the program determines what is achievable. A set of tools is needed to define program goals, to analyze credible technical options, and to ensure that the options are compatible and meet the program objectives. A metrics system that scores the potential performance of technical options is part of this system of tools, supporting screening of concepts and aiding in the overall definition of objectives. During the definition phase, the program defines what specifically is wanted. What is achievable is translated into specific systems and specific technical options are selected and optimized. A metrics system can help with the identification of options for optimization and the selection of the option for deployment. During the deployment phase, the program shows that the selected system works. Demonstration projects are established and classical systems engineering is employed. During this phase, the metrics communicate system performance. This paper discusses an approach to metrics evolution within the Department of Energy's Nuclear Fuel Cycle R&D Program, which is working to improve the sustainability of nuclear energy.
Date: August 1, 2011
Creator: Dixon, Brent
Partner: UNT Libraries Government Documents Department

Probing the Geometry of Warped String Compactifications at the LHC

Description: Warped string compactifications, characterized by the nonsingular behavior of the metric in the infrared (IR), feature departures from the usual anti-de Sitter warped extra dimensions. We study the implications of the smooth IR cutoff for Randall-Sundrum (RS)-type models. We find that the phenomenology of the Kaluza-Klein gravitons (including their masses and couplings) depends sensitively on the precise shape of the warp factor in the IR. In particular, we analyze the warped deformed conifold, find that the spectrum differs significantly from that of RS, and present a simple prescription (a mass-gap ansatz) that can be used to study the phenomenology of IR modifications to 5D warped extra dimensions.
Date: May 28, 2007
Creator: Walker, Devin; Shiu, Gary; Underwood, Bret; Zurek, Kathryn M. & Walker, Devin G. E.
Partner: UNT Libraries Government Documents Department

Fast marching methods for the continuous traveling salesman problem

Description: We consider a problem in which we are given a domain, a cost function which depends on position at each point in the domain, and a subset of points ('cities') in the domain. The goal is to determine the cheapest closed path that visits each city in the domain once. This can be thought of as a version of the Traveling Salesman Problem, in which an underlying known metric determines the cost of moving through each point of the domain, but in which the actual shortest path between cities is unknown at the outset. We describe algorithms for both a heuristic and an optimal solution to this problem. The heuristic algorithm runs in O(M N log N) time in the worst case, where M is the number of cities and N is the size of the computational mesh used to approximate the solutions of the shortest-path problems. The average runtime of the heuristic algorithm is linear in the number of cities and O(N log N) in the size N of the mesh.
Date: December 1, 2008
Creator: Andrews, J. & Sethian, J.A.
Partner: UNT Libraries Government Documents Department
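The two-stage structure described in the abstract (solve a shortest-path problem per city over the cost field, then order the cities) can be sketched as follows. This is a minimal illustration, not the authors' code: it substitutes Dijkstra's algorithm on a 4-connected grid for the fast marching solver, and a nearest-neighbor greedy tour for the paper's heuristic; all function names are ours.

```python
import heapq

def grid_shortest_costs(cost, src):
    """Dijkstra on a 4-connected grid as a stand-in for the fast marching
    solver: returns the cheapest travel cost from src to every cell."""
    rows, cols = len(cost), len(cost[0])
    dist = [[float("inf")] * cols for _ in range(rows)]
    dist[src[0]][src[1]] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # edge cost: average of the position-dependent costs
                nd = d + 0.5 * (cost[r][c] + cost[nr][nc])
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return dist

def heuristic_tour(cost, cities):
    """Greedy nearest-neighbor tour over the metric induced by the cost
    field: M single-source solves, then repeated selection of the cheapest
    unvisited city, matching the O(M N log N) worst-case bound."""
    dist_from = {c: grid_shortest_costs(cost, c) for c in cities}
    tour, remaining = [cities[0]], set(cities[1:])
    while remaining:
        cur = tour[-1]
        nxt = min(remaining, key=lambda c: dist_from[cur][c[0]][c[1]])
        tour.append(nxt)
        remaining.remove(nxt)
    tour.append(cities[0])  # close the path
    return tour
```

The key point the abstract makes survives in the sketch: the inter-city metric is not given up front but recovered by M grid solves, each O(N log N) in the mesh size N.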

Experimental and model-based study of the robustness of line-edge roughness metric extraction in the presence of noise

Description: As critical dimensions shrink, line edge and width roughness (LER and LWR) become of increasing concern. Crucial to the goal of reducing LER is its accurate characterization. LER has traditionally been represented as a single rms value. More recently the use of power spectral density (PSD), height-height correlation (HHCF), and σ versus length plots has been proposed in order to extract the additional spatial descriptors of correlation length and roughness exponent. Here we perform a modeling-based noise-sensitivity study on the extraction of spatial descriptors from line-edge data as well as an experimental study of the robustness of these various descriptors using a large dataset of recent extreme-ultraviolet exposure data. The results show that in the presence of noise and in the large dataset limit, the PSD method provides higher accuracy in the extraction of the roughness exponent, whereas the HHCF method provides higher accuracy for the correlation length. On the other hand, when considering precision, the HHCF method is superior for both metrics.
Date: June 1, 2007
Creator: Naulleau, Patrick P. & Cain, Jason P.
Partner: UNT Libraries Government Documents Department
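The three descriptors named in the abstract (rms roughness, PSD, and HHCF) are standard and easy to compute from a digitized edge profile. The sketch below shows one conventional formulation under our own assumptions about units and normalization; it is illustrative, not the paper's implementation.

```python
import numpy as np

def ler_metrics(edge, pixel=1.0):
    """Given a line-edge profile (edge-position samples, one per pixel),
    return the rms roughness, the one-sided power spectral density, and
    the height-height correlation function H(r) = <(z(x+r) - z(x))^2>."""
    edge = np.asarray(edge, dtype=float)
    edge = edge - edge.mean()          # deviations about the mean edge
    n = len(edge)
    # Traditional single-number descriptor (LER is often quoted as 3x rms)
    rms = np.sqrt(np.mean(edge ** 2))
    # Power spectral density via FFT of the real-valued profile
    fft = np.fft.rfft(edge)
    psd = (np.abs(fft) ** 2) * pixel / n
    freqs = np.fft.rfftfreq(n, d=pixel)
    # Height-height correlation as a function of lag r
    lags = np.arange(1, n // 2)
    hhcf = np.array([np.mean((edge[r:] - edge[:-r]) ** 2) for r in lags])
    return rms, (freqs, psd), (lags * pixel, hhcf)
```

The correlation length and roughness exponent the abstract refers to are then extracted by fitting these curves: the HHCF grows from zero at small lags and saturates near twice the variance beyond the correlation length, and the roughness exponent sets the power-law falloff of the PSD at high frequency.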

R-LODs: Fast LOD-Based Ray Tracing of Massive Models

Description: We present a novel LOD (level-of-detail) algorithm to accelerate ray tracing of massive models. Our approach computes drastic simplifications of the model and the LODs are well integrated with the kd-tree data structure. We introduce a simple and efficient LOD metric to bound the error for primary and secondary rays. The LOD representation has small runtime overhead and our algorithm can be combined with ray coherence techniques and cache-coherent layouts to improve the performance. In practice, the use of LODs can alleviate aliasing artifacts and improve memory coherence. We implement our algorithm on both 32-bit and 64-bit machines and are able to achieve up to a 2.20 times improvement in frame rate when rendering models consisting of tens or hundreds of millions of triangles, with little loss in image quality.
Date: February 14, 2006
Creator: Yoon, S; Lauterbach, C & Manocha, D
Partner: UNT Libraries Government Documents Department

Sensitivity study of reliable, high-throughput resolution metrics for photoresists

Description: The resolution of chemically amplified resists is becoming an increasing concern, especially for lithography in the extreme ultraviolet (EUV) regime. Large-scale screening and performance-based down-selection is currently underway to identify resist platforms that can support shrinking feature sizes. Resist screening efforts, however, are hampered by the absence of reliable resolution metrics that can objectively quantify resist resolution in a high-throughput fashion. Here we examine two high-throughput metrics for resist resolution determination. After summarizing their details and justifying their utility, we characterize the sensitivity of both metrics to two of the main experimental uncertainties associated with lithographic exposure tools, namely: limited focus control and limited knowledge of optical aberrations. For an implementation at EUV wavelengths, we report aberration- and focus-limited error bars in extracted resolution of approximately 1.25 nm RMS for both metrics, making them attractive candidates for future screening and down-selection efforts.
Date: July 30, 2007
Creator: Anderson, Christopher N. & Naulleau, Patrick P.
Partner: UNT Libraries Government Documents Department

Identification of random variation in structures and their parameter estimates.

Description: Structures that are members of an ensemble of nominally identical systems actually differ due to variations in details among individuals. Furthermore, there are variations in the system response of an individual structure that can be attributed to unmeasured conditions (such as temperature and humidity) that are present during experiments. Finally, noise is present in all measurements of structural excitations and responses. For these reasons, there is always random variation associated with the characterizations of structural dynamic systems, and descriptions of results must be in statistical or probabilistic terms. This study identifies and assesses the sources and the degrees of randomness in a metric of structural dynamics of a given system through experiments and analysis.
Date: January 1, 2002
Creator: Farrar, C. R. (Charles R.); Aumann, R. J. (Richard J.); McCarty, A. A. (Amanda A.) & Olson, C. C. (Colin C.)
Partner: UNT Libraries Government Documents Department

Performance-based assessment of reconstructed images

Description: During the early 90s, I engaged in a productive and enjoyable collaboration with Robert Wagner and his colleague, Kyle Myers. We explored the ramifications of the principle that the quality of an image should be assessed on the basis of how well it facilitates the performance of appropriate visual tasks. We applied this principle to algorithms used to reconstruct scenes from incomplete and/or noisy projection data. For binary visual tasks, we used both the conventional disk detection and a new challenging task, inspired by the Rayleigh resolution criterion, of deciding whether an object was a blurred version of two dots or a bar. The results of human and machine observer tests were summarized with the detectability index based on the area under the ROC curve. We investigated a variety of reconstruction algorithms, including ART, with and without a nonnegativity constraint, and the MEMSYS3 algorithm. We concluded that the performance of the Rayleigh task was optimized when the strength of the prior was near MEMSYS's default 'classic' value for both human and machine observers. A notable result was that the most-often-used metric of rms error in the reconstruction was not necessarily indicative of the value of a reconstructed image for the purpose of performing visual tasks.
Date: January 1, 2009
Creator: Hanson, Kenneth
Partner: UNT Libraries Government Documents Department
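The detectability index based on the area under the ROC curve, mentioned in the abstract, can be computed nonparametrically from observer scores. The sketch below uses the standard Mann-Whitney estimate of the AUC and the usual equal-variance Gaussian conversion to a detectability index; function names are ours, and the paper's exact estimator may differ.

```python
from statistics import NormalDist

def auc_from_scores(signal, noise):
    """Nonparametric area under the ROC curve (Mann-Whitney statistic):
    the probability that a signal-present score exceeds a signal-absent
    score, with ties counted as half."""
    wins = sum((s > n) + 0.5 * (s == n) for s in signal for n in noise)
    return wins / (len(signal) * len(noise))

def detectability_index(auc):
    """Detectability d_A from the AUC under the equal-variance Gaussian
    observer model: AUC = Phi(d_A / sqrt(2)), so d_A = sqrt(2) Phi^-1(AUC)."""
    return 2 ** 0.5 * NormalDist().inv_cdf(auc)
```

An AUC of 0.5 (chance performance) maps to a detectability of zero, which is why the index is a convenient common scale for comparing human and machine observers across reconstruction algorithms.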

Automated measurement of quality of mucosa inspection for colonoscopy

Description: This paper from the International Conference on Computational Science proceedings presents new methods that derive a quality metric for automated scoring of the quality of mucosa inspection performed by the endoscopist.
Date: May 31, 2010
Creator: Liu, Xuemin; Tavanapong, Wallapak; Wong, Johnny; Oh, JungHwan & de Groen, Piet C.
Partner: UNT College of Engineering

Modeling the Office of Science Ten Year Facilities Plan: The PERI Architecture Tiger Team

Description: The Performance Engineering Institute (PERI) originally proposed a tiger team activity as a mechanism to target significant effort optimizing key Office of Science applications, a model that was successfully realized with the assistance of two JOULE metric teams. However, the Office of Science requested a new focus beginning in 2008: assistance in forming its ten year facilities plan. To meet this request, PERI formed the Architecture Tiger Team, which is modeling the performance of key science applications on future architectures, with S3D, FLASH and GTC chosen as the first application targets. In this activity, we have measured the performance of these applications on current systems in order to understand their baseline performance and to ensure that our modeling activity focuses on the right versions and inputs of the applications. We have applied a variety of modeling techniques to anticipate the performance of these applications on a range of anticipated systems. While our initial findings predict that Office of Science applications will continue to perform well on future machines from major hardware vendors, we have also encountered several areas in which we must extend our modeling techniques in order to fulfill our mission accurately and completely. In addition, we anticipate that models of a wider range of applications will reveal critical differences between expected future systems, thus providing guidance for future Office of Science procurement decisions, and will enable DOE applications to exploit machines in future facilities fully.
Date: June 26, 2009
Creator: de Supinski, Bronis R.; Alam, Sadaf; Bailey, David H.; Carrington, Laura; Daley, Chris; Dubey, Anshu et al.
Partner: UNT Libraries Government Documents Department

A comparison of water vapor quantities from model short-range forecasts and ARM observations

Description: Model evolution and improvement is complicated by the lack of high-quality observational data. The Atmospheric Radiation Measurement (ARM) program was formed to address this major limitation. For the second-quarter ARM metric we make use of newly available water vapor data, the 'Merged-sounding' value-added product (referred to as OBS within the text), at three sites: the North Slope of Alaska (NSA), Darwin, Australia (DAR), and the Southern Great Plains (SGP), and compare these observations to model forecast data. Two time periods are analyzed: March 2000 for the SGP, and October 2004 for both DAR and NSA. The merged-sounding data have been interpolated to 37 pressure levels (from 1000 hPa to 100 hPa in 25 hPa increments) and time-averaged to 3-hourly values for direct comparison with our model output.
Date: March 17, 2006
Creator: Hnilo, J J
Partner: UNT Libraries Government Documents Department
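The regridding step the abstract describes, putting sounding data on 37 fixed pressure levels so it can be compared point-for-point with model output, can be sketched as a simple linear interpolation. This is an illustration under our own assumptions (linear interpolation in pressure, our variable names), not the value-added product's actual processing.

```python
import numpy as np

def to_standard_levels(p_obs, q_obs):
    """Interpolate one water-vapor sounding (pressures p_obs in hPa,
    mixing ratios q_obs) onto the 37 fixed levels from 1000 hPa to
    100 hPa in 25 hPa steps used for the model comparison."""
    levels = np.arange(1000.0, 99.0, -25.0)   # 1000, 975, ..., 100 hPa
    # np.interp requires increasing sample points, so sort by pressure
    order = np.argsort(p_obs)
    q_levels = np.interp(levels, np.asarray(p_obs)[order],
                         np.asarray(q_obs)[order])
    return levels, q_levels
```

After this step, each observed profile and each model forecast live on the same vertical grid, so differences can be accumulated level by level before the 3-hourly time averaging.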

Engineering index : a metric for assessing margin in engineered systems

Description: Inherent in most engineered products is some measure of margin or overdesign. Engineers often do not retain design and performance knowledge, so they cannot quantify uncertainties and estimate how much margin their product possesses. When knowledge capture and quantification are not possible or permissible, engineers rely on cultural lore and institutionalized practices to assign nominal conditions and tolerances. Often what gets lost along the way is design intent, product requirements, and their relationship with the product's intended application. The Engineering Index was developed to assess the goodness or quality of a product.
Date: January 1, 2002
Creator: Dolin, Ronald M.
Partner: UNT Libraries Government Documents Department

Extreme ultraviolet mask substrate surface roughness effects on lithography patterning

Description: In extreme ultraviolet lithography exposure systems, scatter induced by mask substrate roughness contributes to LER at the image plane. In this paper, the impact of mask substrate roughness on image plane speckle is explicitly evaluated. A programmed roughness mask was used to study the correlation between mask roughness metrics and wafer plane aerial image inspection. We find that roughness measurements based on the top-surface topography profile do not provide complete information on the scatter-related speckle that leads to LER at the image plane. We suggest at-wavelength characterization by imaging and/or scatter measurements into different frequencies as an alternative for a more comprehensive metrology of the mask substrate/multilayer roughness effects.
Date: June 21, 2010
Creator: George, Simi; Naulleau, Patrick; Salmassi, Farhad; Mochi, Iacopo; Gullikson, Eric; Goldberg, Kenneth et al.
Partner: UNT Libraries Government Documents Department

FY 2010 Second Quarter Report Evaluation of the Liu-Daum-McGraw (LDM) Drizzle Threshold Parameterization using Measurements from the VAMOS Ocean-Cloud-Atmosphere Land Study (VOCALS) Field Campaign

Description: Metric for Quarter 2: Evaluate LDM (Liu, Daum, McGraw) drizzle threshold parameterization for a range of cloud conditions by comparing the threshold function computed using measurements of cloud droplet number concentration and cloud liquid water content to measurements of drizzle droplet number concentrations and/or drizzle water content.
Date: April 4, 2011
Creator: McGraw, R; Kleinman, LI; Springston, SR; Daum, PH; Senum, G & Wang, J
Partner: UNT Libraries Government Documents Department

Performance and scaling of locally-structured grid methods for partial differential equations

Description: In this paper, we discuss some of the issues in obtaining high performance for block-structured adaptive mesh refinement software for partial differential equations. We show examples in which AMR scales to thousands of processors. We also discuss a number of metrics for performance and scalability that can provide a basis for understanding the advantages and disadvantages of this approach.
Date: July 19, 2007
Creator: Colella, Phillip; Bell, John; Keen, Noel; Ligocki, Terry; Lijewski, Michael & Van Straalen, Brian
Partner: UNT Libraries Government Documents Department

Influence of base and PAG on deprotection blur in EUV photoresists and some thoughts on shot noise

Description: A contact-hole deprotection blur metric has been used to monitor the deprotection blur of an experimental open platform resist (EH27) as the weight percent of base and photo acid generator (PAG) were varied. A 6x increase in base weight percent is shown to reduce the size of successfully patterned 1:1 line-space features from 52 nm to 39 nm without changing deprotection blur. Corresponding isolated line-edge-roughness is reduced from 6.9 nm to 4.1 nm. A 2x increase in PAG weight percent is shown to improve 1:1 line-space patterning from 47 nm to 40 nm without changing deprotection blur or isolated LER. A discussion of improved patterning performance as related to shot noise and deprotection blur concludes with a speculation that the spatial distribution of PAG molecules has been playing some role, perhaps a dominant one, in determining the uniformity of photo generated acids in the resists that have been studied.
Date: June 1, 2008
Creator: Anderson, Christopher N.; Naulleau, Patrick P.; Niakoula, Dimitra; Hassanein, Elsayed; Brainard, Robert; Gallatin, Gregg et al.
Partner: UNT Libraries Government Documents Department

Scheduling in Heterogeneous Grid Environments: The Effects of Data Migration

Description: Computational grids have the potential for solving large-scale scientific problems using heterogeneous and geographically distributed resources. However, a number of major technical hurdles must be overcome before this goal can be fully realized. One problem critical to the effective utilization of computational grids is efficient job scheduling. Our prior work addressed this challenge by defining a grid scheduling architecture and several job migration strategies. The focus of this study is to explore the impact of data migration under a variety of demanding grid conditions. We evaluate our grid scheduling algorithms by simulating compute servers, various groupings of servers into sites, and inter-server networks, using real workloads obtained from leading supercomputing centers. Several key performance metrics are used to compare the behavior of our algorithms against reference local and centralized scheduling schemes. Results show the tremendous benefits of grid scheduling, even in the presence of input/output data migration, while highlighting the importance of utilizing communication-aware scheduling schemes.
Date: January 1, 2004
Creator: Oliker, Leonid; Biswas, Rupak; Shan, Hongzhang & Smith, Warren
Partner: UNT Libraries Government Documents Department

Transitive closure and metric inequality of weighted graphs: detecting protein interaction modules using cliques

Description: We study transitivity properties of edge weights in complex networks. We show that enforcing transitivity leads to a transitivity inequality which is equivalent to the ultra-metric inequality. This can be used to define transitive closure on weighted undirected graphs, which can be computed using a modified Floyd-Warshall algorithm. We outline several applications and present results of detecting protein functional modules in a protein interaction network.
Date: June 2, 2006
Creator: Ding, Chris; He, Xiaofeng; Xiong, Hui; Peng, Hanchuan & Holbrook, Stephen R.
Partner: UNT Libraries Government Documents Department
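The modified Floyd-Warshall computation the abstract mentions can be sketched directly: instead of relaxing path sums, each step relaxes the bottleneck (minimum) edge weight along a path, so the closed weight between two nodes is the best minimum-similarity achievable over any connecting path. This is an illustrative sketch of that idea under our own conventions (similarity weights, symmetric matrix), not the paper's code.

```python
def transitive_closure(w):
    """Min-max transitive closure of a weighted undirected graph via a
    modified Floyd-Warshall algorithm: the closed weight w*[i][j] is the
    maximum over all i-j paths of the minimum edge weight on the path.
    The result satisfies the ultra-metric-style transitivity
    w*[i][j] >= min(w*[i][k], w*[k][j]) for all k."""
    n = len(w)
    w = [row[:] for row in w]  # work on a copy
    for k in range(n):
        for i in range(n):
            for j in range(n):
                through_k = min(w[i][k], w[k][j])
                if through_k > w[i][j]:
                    w[i][j] = through_k
    return w
```

On a similarity matrix, for example, two proteins weakly linked directly but strongly linked through a common partner end up with the strength of that indirect connection, which is what makes dense modules stand out as near-cliques in the closed graph.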