Search Results

Historical background: Why is it important to improve automated particle selection methods?

Description: A current trend in single-particle electron microscopy is to compute three-dimensional reconstructions with ever-increasing numbers of particles in the data sets. Since manual, or even semi-automated, selection of particles represents a major bottleneck when a data set exceeds several thousand particles, there is growing interest in developing automatic methods for selecting images of individual particles. Except in special cases, however, it has proven difficult to achieve the degree of efficiency and reliability that would make fully automated particle selection a useful tool. The simplest methods, such as cross-correlation (i.e., matched filtering), do not perform well enough to be used for fully automated particle selection. Geometric properties (area, perimeter-to-area ratio, etc.) and the integrated "mass" of candidate particles are additional factors that could improve automated particle selection if suitable methods of contouring particles could be developed. Another suggestion is that data always be collected as pairs of images, the first taken at low defocus (to capture information at the highest possible resolution) and the second at very high defocus (to improve the visibility of the particle). Finally, it is emphasized that well-annotated, open-access data sets need to be established in order to encourage the further development and validation of methods for automated particle selection.
Date: August 14, 2003
Creator: Glaeser, Robert M.
Partner: UNT Libraries Government Documents Department
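
The abstract above names cross-correlation (matched filtering) as the simplest baseline for particle selection. The following is a minimal illustrative sketch only, not code from the report; the function name, threshold, and synthetic data are hypothetical. It scores every placement of a particle template against a micrograph with the normalized cross-correlation and keeps high-scoring positions as candidates.

    import numpy as np

    def normalized_cross_correlation(image, template):
        """Normalized cross-correlation (matched filter) of a template
        against every position in a 2-D image; returns a score map."""
        th, tw = template.shape
        t = template - template.mean()
        t_norm = np.sqrt((t ** 2).sum())
        out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                patch = image[i:i + th, j:j + tw]
                p = patch - patch.mean()
                denom = np.sqrt((p ** 2).sum()) * t_norm
                out[i, j] = (p * t).sum() / denom if denom > 0 else 0.0
        return out

    # Synthetic example: candidate particle positions are score-map entries
    # above a threshold (a real picker would also enforce local maxima and
    # a minimum spacing between picks).
    rng = np.random.default_rng(0)
    micrograph = rng.normal(size=(128, 128))
    template = rng.normal(size=(16, 16))
    scores = normalized_cross_correlation(micrograph, template)
    candidates = np.argwhere(scores > 0.5)
    print(scores.shape, len(candidates))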

The mean evolution and variability of the Asian summer monsoon: comparison of ECMWF and NCEP/NCAR reanalyses

Description: The behavior of the Asian Summer Monsoon is compared using the European Centre for Medium-Range Weather Forecasts Reanalysis (ERA) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis (Kalnay et al. 1996). The goals of this paper are to identify common features between the reanalyses, to assess their robustness for model validation, and especially to use the reanalyses to develop an understanding of the mean evolution of the Asian Summer Monsoon and the characteristics of its interannual and intraseasonal variability (Annamalai et al. 1999).
Date: April 21, 1999
Creator: Annamalai, H.; Hodges, K.; Slingo, J.M. & Sperber, K.R.
Partner: UNT Libraries Government Documents Department

Spent Nuclear Fuel (SNF) Process Validation Technical Support Plan

Description: The purpose of Process Validation is to confirm that nominal process operations are consistent with the expected process envelope. The Process Validation activities described in this document are not part of the safety basis, but are expected to demonstrate that the process operates well within the safety basis. Some adjustments to the process may be made as a result of information gathered in Process Validation.
Date: March 13, 2000
Creator: SEXTON, R.A.
Partner: UNT Libraries Government Documents Department

Weatherford Inclined Wellbore Construction

Description: The Rocky Mountain Oilfield Testing Center (RMOTC) has recently completed construction of an inclined wellbore with seven (7) inch, twenty-three (23) pound casing at a total depth of 1,296 feet. The inclined wellbore is near vertical to 180 feet, with a build angle of approximately 4.5 degrees per hundred feet thereafter. The inclined wellbore was utilized for further proprietary testing after construction and validation. The wellbore is available to other companies requiring a cased-hole environment with known deviation out to fifty (50) degrees from vertical. The wellbore may also be used by RMOTC for further deepening into the fractured shales of the Steele and Niobrara formations.
Date: August 19, 2002
Creator: Schulte, R.
Partner: UNT Libraries Government Documents Department
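
As a rough consistency check on the figures above (assuming, as an illustration only, that the stated 4.5 degrees per hundred feet build rate is held from the 180-ft kickoff all the way to total depth), the inclination at total depth works out to about the quoted fifty degrees:

    \[
    \theta_{\mathrm{TD}} \approx \frac{(1296 - 180)\,\mathrm{ft} \times 4.5^{\circ}}{100\,\mathrm{ft}} \approx 50^{\circ}.
    \]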

Validation Report for FY 1997--Final Report

Description: The report, issued according to "Work Release 02. P. 99-8", presents a comparison of results for the VVER calculational benchmarks computed with various codes: the design code TVS-M and the precision code MCU-REA developed at RRC KI; the IPPE codes WIMS-ABBN, TRIANG-PWR, and CONKEMO; and the 2-D fuel assembly analysis code HELIOS developed by Studsvik Scandpower.
Date: September 28, 2001
Creator: Pavlovichev, A.M.
Partner: UNT Libraries Government Documents Department

Validation and Evaluation of Emergency Response Plans through Agent-Based Modeling and Simulation

Description: Biological emergency response planning plays a critical role in protecting the public from the possibly devastating results of sudden disease outbreaks. These plans describe the distribution of medical countermeasures across a region using limited resources within a restricted time window. Thus, the ability to determine that such a plan will be feasible, i.e., successfully provide service to affected populations within the time limit, is crucial. Many of the current efforts to validate plans are in the form of live drills and training, but those may not test plan activation at the appropriate scale or with sufficient numbers of participants. This necessitates the use of computational resources to aid emergency managers and planners in developing and evaluating plans before they must be used. Current emergency response plan generation software packages, such as RE-PLAN or RealOpt, provide rate-based validation analyses. However, these types of analysis may neglect details of real-world traffic dynamics. Therefore, this dissertation presents Validating Emergency Response Plan Execution Through Simulation (VERPETS), a novel computational system for the agent-based simulation of biological emergency response plan activation. This system converts raw road network, population distribution, and emergency response plan data into a format suitable for simulation, and then performs these simulations using SUMO (Simulation of Urban MObility) to simulate realistic traffic dynamics. Additionally, high-performance computing methodologies were utilized to decrease agent load on simulations and improve performance. Further strategies, such as the use of agent scaling and a time limit on simulation execution, were also examined. Experimental results indicate that the time to plan completion, i.e., the time when all individuals of the population have received medication, as determined by VERPETS aligned well with current alternate methodologies. It was determined that the dynamic of traffic congestion at the POD itself was one of the major factors affecting the completion time of ...
Date: May 2018
Creator: Helsing, Joseph
Partner: UNT Libraries
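
The abstract above describes stepping agent-based SUMO traffic simulations until every member of the affected population has been served, subject to a time limit. The following is a generic sketch only, not the VERPETS code; the configuration file name and time limit are hypothetical. It drives a SUMO run from Python through the standard TraCI interface and records when the last vehicle finishes.

    # Sketch: advance a SUMO scenario one simulated second at a time until no
    # vehicles remain (plan completion) or the time limit is hit.
    import traci

    TIME_LIMIT_S = 24 * 3600  # illustrative cap on simulated time

    traci.start(["sumo", "-c", "plan_activation.sumocfg"])  # hypothetical config
    step = 0
    while traci.simulation.getMinExpectedNumber() > 0 and step < TIME_LIMIT_S:
        traci.simulationStep()  # advance realistic traffic dynamics by one step
        step += 1
    traci.close()

    print(f"time to plan completion: {step} simulated seconds")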

Benchmarking Heavy Ion Transport Codes FLUKA, HETC-HEDS, MARS15, MCNPX, and PHITS

Description: Powerful accelerators such as spallation neutron sources, muon-collider/neutrino facilities, and rare isotope beam facilities must be designed with the consideration that they handle the beam power reliably and safely, and they must be optimized to yield maximum performance relative to their design requirements. The simulation codes used for design purposes must produce reliable results. If not, component and facility designs can become costly, have limited lifetime and usefulness, and could even be unsafe. The objective of this proposal is to assess the performance of the currently available codes (PHITS, FLUKA, MARS15, MCNPX, and HETC-HEDS) that could be used for design simulations involving heavy ion transport. We plan to assess their performance by performing simulations and comparing results against experimental data of benchmark quality. Quantitative knowledge of the biases and uncertainties of the simulations is essential, as this potentially impacts the safe, reliable, and cost-effective design of any future radioactive ion beam facility. Further benchmarking of heavy-ion transport codes was one of the actions recommended in the "Report of the 2003 RIA R&D Workshop".
Date: June 7, 2013
Creator: Ronningen, Reginald Martin; Remec, Igor & Heilbronn, Lawrence H.
Partner: UNT Libraries Government Documents Department

CdTe Feedstock Development and Validation: Cooperative Research and Development Final Report, CRADA Number CRD-08-00280

Description: The goal of this work was to evaluate different CdTe feedstock formulations (feedstock provided by Redlen) to determine whether they would significantly improve CdTe performance, and, as an ancillary question, whether changes in feedstock would affect CdTe cell processing and possibly cell reliability. The feedstock work also included attempts to intentionally dope the CdTe with pre-selected elements.
Date: May 1, 2011
Creator: Albin, D.
Partner: UNT Libraries Government Documents Department

Post-processing V&V Level II ASC Milestone (2843) results.

Description: The 9/30/2008 ASC Level 2 Post-Processing V&V Milestone (Milestone 2843) contains functionality required by the user community for certain verification and validation tasks. These capabilities include fragment detection from CTH simulation data, fragment characterization and analysis, and fragment sorting and display operations. The capabilities were tested extensively both on sample and actual simulations. In addition, a number of stretch criteria were met including a comparison between simulated and test data, and the ability to output each fragment as an individual geometric file.
Date: October 1, 2008
Creator: Karelitz, David B.; Ice, Lisa G.; Wilke, Jason; Moreland, Kenneth D. & Attaway, Stephen W.
Partner: UNT Libraries Government Documents Department

Relation of validation experiments to applications.

Description: Computational and mathematical models are developed in engineering to represent the behavior of physical systems in response to various system inputs and conditions. These models are often used to predict behavior at other conditions, rather than just to reproduce the behavior observed at the experimental conditions. For example, the boundary or initial conditions, time of prediction, geometry, material properties, and other model parameters can differ between the test conditions and an anticipated application of a model. Situations for which the conditions may differ include those for which (1) one is in the design phase and a prototype of the system has not been constructed and tested under the anticipated conditions, (2) only one version of a final system can be built and destructive testing is not feasible, or (3) the anticipated design conditions are variable and one cannot easily reproduce the range of conditions with a limited number of carefully controlled experiments. Because data from these supporting experiments have value in model validation, even if the model was tested at conditions different from those of an anticipated application, methodology is required to evaluate the ability of the validation experiments to resolve the critical behavior for the anticipated application. The methodology presented uses models for the validation experiments and a model for the application to address how well the validation experiments resolve the application. More specifically, the methodology investigates the tradeoff that exists between the uncertainty (variability) in the behavior of the resolved critical variables for the anticipated application and the ability of the validation experiments to resolve this behavior. The important features of this approach are demonstrated through simple linear and non-linear heat conduction examples.
Date: February 1, 2009
Creator: Hamilton, James R. (New Mexico State University, Las Cruces, NM) & Hills, Richard Guy
Partner: UNT Libraries Government Documents Department

Refinements to the Boolean approach to automatic data editing

Description: Automatic data editing consists of three components: identification of erroneous records, identification of the most likely erroneous fields within an erroneous record (the fields to impute), and assignment of acceptable values to failing records. Moreover, the types of data considered naturally fall into three categories: coded (categorical) data, continuous data, and mixed data (both coded and continuous). For the case of coded data, a natural way to approach automatic data editing is commonly referred to as the Boolean approach, first developed by Fellegi and Holt. For the fields-to-impute problem, central to the operation of the Fellegi-Holt approach is the explicit recognition of certain implied edits; Fellegi and Holt originally required a complete set of edits, and their algorithm to generate this complete set has occasionally had the distinct disadvantage of failing to converge within reasonable time. The primary result of this paper is an algorithm that significantly prunes the Fellegi-Holt edit generation process, yet nonetheless generates a collection of implied edits sufficient for the solution of the fields-to-impute problem. 3 figures.
Date: September 1, 1980
Creator: Liepins, G.E.
Partner: UNT Libraries Government Documents Department
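
To make the fields-to-impute terminology above concrete, here is a small illustrative sketch, not the paper's pruned edit-generation algorithm and not its notation. In the Fellegi-Holt setting for coded data, an edit names a disallowed combination of categories; a record fails an edit when every field named in the edit takes one of the listed values, and the fields to impute are chosen so that every failed edit involves at least one chosen field (here, by a simple greedy cover).

    def failed_edits(record, edits):
        """Edits are dicts mapping field -> set of disallowed values; a record
        fails an edit if it matches every field in the edit."""
        return [e for e in edits if all(record[f] in vals for f, vals in e.items())]

    def fields_to_impute(record, edits):
        """Greedily pick fields until every failed edit involves a picked field."""
        failing = failed_edits(record, edits)
        chosen = set()
        while failing:
            counts = {}
            for e in failing:
                for f in e:
                    counts[f] = counts.get(f, 0) + 1
            best = max(counts, key=counts.get)  # field in the most failed edits
            chosen.add(best)
            failing = [e for e in failing if best not in e]
        return chosen

    # Hypothetical toy edits: a "child" may not be married or employed full time.
    edits = [
        {"age_group": {"child"}, "marital_status": {"married", "widowed"}},
        {"age_group": {"child"}, "employment": {"full_time"}},
    ]
    record = {"age_group": "child", "marital_status": "married", "employment": "full_time"}
    print(fields_to_impute(record, edits))  # {'age_group'} covers both failed edits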

Rigorous, systematic approach to automatic data editing and its statistical basis

Description: Automated data editing is the computerized identification and (optionally) correction of data errors. These techniques can provide error statistics that indicate the frequency of various types of data errors, diagnostic information that aids in identifying inadequacies in the data collection system, and a clean data base appropriate for use in further decision making, in modeling, and for inferential purposes. However, before these numerous benefits can be fully realized, certain research problems need to be resolved, and the linkage between statistical error analysis and extreme-value programming needs to be carefully determined. That linkage is provided here for the special case in which certain independence and symmetry conditions obtain; also provided are rigorous proofs of results central to the functioning of the Boolean approach to automatic data editing of coded (categorical) data. In particular, sufficient collections of edits are defined, and it is shown that, for a fixed objective function, the solution to the fields-to-impute problem is obtainable simply from knowing which edits of the sufficient collection are failed, and that this solution is invariant to the particular sufficient collection of edits identified. Similarly, disjoint-sufficient collections of edits are defined, and it is shown that, if the objective function of the fields-to-impute problem is determined by what Freund and Hartley call the number of involvements in unsatisfied consistency checks, then the objective function will be independent of the disjoint-sufficient collection of edits used.
Date: January 1, 1981
Creator: Liepins, G.E.
Partner: UNT Libraries Government Documents Department

Demonstration and Validation Assets: User Manual Development

Description: This report documents the development of a database-supported user manual for DEMVAL assets in the NSTI area of operations and focuses on providing comprehensive user information on DEMVAL assets serving businesses with national security technology applications in southern New Mexico. The DEMVAL asset program is being developed as part of the NSPP, funded by both the Department of Energy (DOE) and NNSA. This report describes the development of a comprehensive user manual system for delivering indexed DEMVAL asset information, to be used in marketing and visibility materials and provided to NSTI clients, prospective clients, stakeholders, and any person or organization seeking it. The data about area DEMVAL asset providers are organized in an SQL database with an updateable application structure that optimizes ease of access and customizes searchability for the user.
Date: June 30, 2008
Partner: UNT Libraries Government Documents Department

Using Patterns for Multivariate Monitoring and Feedback Control of Linear Accelerator Performance: Proof-of-Concept Research

Description: The report discusses preliminary proof-of-concept research for using the Advanced Data Validation and Verification System (ADVVS), a new INEEL software package, to add validation and verification and multivariate feedback control to the operation of non-destructive analysis (NDA) equipment. The software is based on human cognition, the recognition of patterns and changes in patterns in time-related data. The first project applied ADVVS to monitor operations of a selectable energy linear electron accelerator, and showed how the software recognizes in real time any deviations from the optimal tune of the machine. The second project extended the software method to provide model-based multivariate feedback control for the same linear electron accelerator. The projects successfully demonstrated proof-of-concept for the applications and focused attention on the common application of intelligent information processing techniques.
Date: April 1, 2002
Creator: Cordes, Gail Adele; Van Ausdeln, Leo Anthony & Velasquez, Maria Elena
Partner: UNT Libraries Government Documents Department

Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project: Fall 2006 Progress Update (Presentation)

Description: This presentation, given by NREL's Keith Wipke at EVS-22, provides an update on the Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project.
Date: October 26, 2006
Creator: Wipke, K.; Welch, C.; Thomas, H.; Sprik, S.; Gronich, S. & Garbak, J.
Partner: UNT Libraries Government Documents Department

Real-World Hydrogen Technology Validation: Preprint

Description: The Department of Energy, the Department of Defense's Defense Logistics Agency, and the Department of Transportation's Federal Transit Administration have funded learning demonstrations and early market deployments to provide insight into applications of hydrogen technologies on the road, in the warehouse, and as stationary power. NREL's analyses validate the technology in real-world applications, reveal the status of the technology, and facilitate the development of hydrogen and fuel cell technologies, manufacturing, and operations. This paper presents maintenance, safety, and operation data for fuel cells in multiple applications, along with reported incidents, near misses, and their frequencies. NREL has analyzed records of more than 225,000 kilograms of hydrogen dispensed through more than 108,000 hydrogen fills, with an excellent safety record.
Date: March 1, 2012
Creator: Sprik, S.; Kurtz, J.; Wipke, K.; Ramsden, T.; Ainscough, C.; Eudy, L. et al.
Partner: UNT Libraries Government Documents Department
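
Taken at face value (both totals above are quoted as lower bounds, so this is only a rough figure, not one stated in the paper), the reported quantities imply an average dispensed amount per fill of roughly

    \[
    \frac{225{,}000\ \mathrm{kg}}{108{,}000\ \mathrm{fills}} \approx 2.1\ \mathrm{kg\ per\ fill}.
    \]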

Post-processing V&V level II ASC milestone (2360) results.

Description: The 9/30/2007 ASC Level 2 Post-Processing V&V Milestone (Milestone 2360) contains functionality required by the user community for certain verification and validation tasks. These capabilities include loading of edge and face data on an Exodus mesh, run-time computation of an exact solution to a verification problem, delivery of results data from the server to the client, computation of an integral-based error metric, simultaneous loading of simulation and test data, and comparison of that data using visual and quantitative methods. The capabilities were tested extensively by performing a typical ALEGRA HEDP verification task. In addition, a number of stretch criteria were met including completion of a verification task on a 13 million element mesh.
Date: September 1, 2007
Creator: Chavez, Elmer; Karelitz, David B.; Brunner, Thomas A.; Trucano, Timothy Guy; Moreland, Kenneth D.; Weirs, V. Gregory et al.
Partner: UNT Libraries Government Documents Department

CFD INVESTIGATION OF EXPERIMENTAL DATA PROPOSED TO BE A VALIDATION DATA SET

Description: The U.S. Department of Energy (DOE) is currently supporting the development of a next generation nuclear plant (NGNP). The NGNP is based on the very high temperature reactor (VHTR), which is a Gen. IV gas-cooled reactor concept that will use helium as the coolant. Computational fluid dynamics (CFD) calculations are to be employed to estimate the details of the flow and heat transfer in the lower plenum, where the heated coolant empties before exiting the reactor vessel. While it is expected that CFD will be able to provide detailed information about the flow, it must be validated using experimental data. Detailed experimental data have been taken in the INL's matched index of refraction (MIR) facility on a scaled model of a section of the prismatic VHTR lower plenum. The present article examines those data to determine their suitability as a validation data set for CFD calculations. CFD calculations were made for comparison with the experimental data to explore potential issues and make recommendations regarding the MIR data.
Date: July 1, 2009
Creator: Johnson, Richard W.
Partner: UNT Libraries Government Documents Department

Validation experiments to determine radiation partitioning of heat flux to an object in a fully turbulent fire.

Description: It is necessary to improve understanding of, and to develop validation data for, the heat flux incident to an object located within the fire plume, in support of the validation of SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE. One key aspect of the validation data sets is the determination of the relative contributions of the radiative and convective heat fluxes. To meet this objective, a cylindrical calorimeter with sufficient instrumentation to measure total and radiative heat flux has been designed and fabricated. This calorimeter will be tested both in the controlled radiative environment of the Penlight facility and in a fire environment in the FLAME/Radiant Heat (FRH) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. A significant question of interest in modeling the heat flux incident to an object in or near a fire is the contribution of the radiation and convection modes of heat transfer. The series of experiments documented in this test plan is designed to provide data on the radiation partitioning, defined as the fraction of the total heat flux that is due to radiation.
Date: June 1, 2006
Creator: Ricks, Allen; Blanchat, Thomas K. & Jernigan, Dann A.
Partner: UNT Libraries Government Documents Department
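
In symbols (notation chosen here for illustration, not necessarily that of the test plan), the radiation partitioning defined in the last sentence above is

    \[
    f_{\mathrm{rad}} \;=\; \frac{q''_{\mathrm{rad}}}{q''_{\mathrm{total}}} \;=\; \frac{q''_{\mathrm{rad}}}{q''_{\mathrm{rad}} + q''_{\mathrm{conv}}},
    \]

so paired total and radiative heat-flux measurements from the calorimeter determine the convective contribution by difference.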

Visualization of Instrumental Verification Information Details (VIVID): code development, description, and usage.

Description: The formulation, implementation, and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and the error of a computational program's solution. It evaluates multiple solutions from numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured- and unstructured-grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady-state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
Date: March 1, 2005
Creator: Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G. & Black, Amalia Rebecca
Partner: UNT Libraries Government Documents Department
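
For readers unfamiliar with the procedure named above, the following is a minimal sketch of the Richardson-extrapolation bookkeeping that a verification code of this kind automates; the function and variable names are illustrative and are not VIVID's interface. Given a solution quantity computed on fine, medium, and coarse grids with a constant refinement ratio r, it estimates the observed order of accuracy and the discretization error on the fine grid.

    import math

    def richardson(f_fine, f_medium, f_coarse, r):
        """Observed order of accuracy, extrapolated value, and fine-grid
        error estimate from three systematically refined grid solutions."""
        p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
        f_extrapolated = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
        error_estimate = abs(f_extrapolated - f_fine)
        return p, f_extrapolated, error_estimate

    # Example: a quantity converging at second order on grids refined by a factor of 2.
    p, f_star, err = richardson(f_fine=1.0100, f_medium=1.0400, f_coarse=1.1600, r=2)
    print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_star:.4f}, error ~ {err:.2e}")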

Development and Validation of WECC Variable Speed Wind Turbine Dynamic Models for Grid Integration Studies

Description: This paper describes reduced-order, simplified wind turbine models for analyzing the stability impact of large arrays of wind turbines with a single point of network interconnection.
Date: September 1, 2007
Creator: Behnke, M.; Ellis, A.; Kazachkov, Y.; McCoy, T.; Muljadi, E.; Price, W. et al.
Partner: UNT Libraries Government Documents Department