Search Results

Finite element modeling for validation of structural damage identification experimentation.

Description: The project described in this report was performed to couple experimental and analytical techniques in the field of structural health monitoring and damage identification. To do this, a finite element model was constructed of a simulated three-story building used for damage identification experiments. The model was used in conjunction with data from the physical structure to research damage identification algorithms. Of particular interest was modeling slip in joints as a function of bolt torque and predicting the smallest change of torque that could be detected experimentally. After being validated with results from the physical structure, the model was used to produce data to test the capabilities of damage identification algorithms. This report describes the finite element model constructed, the results obtained, and proposed future use of the model.
Date: January 1, 2001
Creator: Stinemates, D. W. (Daniel W.) & Bennett, J. G. (Joel G.)
Partner: UNT Libraries Government Documents Department

A crust and upper mantle model of Eurasia and North Africa for Pn travel time calculation

Description: We develop a Regional Seismic Travel Time (RSTT) model and methods to account for the first-order effect of the three-dimensional crust and upper mantle on travel times. The model parameterization is a global tessellation of nodes with a velocity profile at each node. Interpolation of the velocity profiles generates a three-dimensional crust and laterally variable upper mantle velocity. The upper mantle velocity profile at each node is represented as a linear velocity gradient, which enables travel time computation in approximately 1 millisecond. This computational speed allows the model to be used in routine analyses in operational monitoring systems. We refine the model using a tomographic formulation that adjusts the average crustal velocity, mantle velocity at the Moho, and the mantle velocity gradient at each node. While the RSTT model is inherently global and our ultimate goal is to produce a model that provides accurate travel time predictions over the globe, our first RSTT tomography effort covers Eurasia and North Africa, where we have compiled a data set of approximately 600,000 Pn arrivals that provide path coverage over this vast area. Ten percent of the tomography data are randomly selected and set aside for testing purposes. Travel time residual variance for the validation data is reduced by 32%. Based on a geographically distributed set of validation events with epicenter accuracy of 5 km or better, epicenter error using 16 Pn arrivals is reduced by 46% from 17.3 km (ak135 model) to 9.3 km after tomography. Relative to the ak135 model, the median uncertainty ellipse area is reduced by 68% from 3070 km² to 994 km², and the number of ellipses with area less than 1000 km², which is the area allowed for onsite inspection under the Comprehensive Nuclear-Test-Ban Treaty, is increased from 0% to 51%.
Date: March 19, 2009
Creator: Myers, S; Begnaud, M; Ballard, S; Pasyanos, M; Phillips, W S; Ramirez, A et al.
Partner: UNT Libraries Government Documents Department
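The 32% figure in this abstract is a hold-out validation metric: residual variance on the 10% of arrivals set aside for testing, before and after tomography. A minimal sketch of that computation on synthetic residuals (the function name and data are illustrative, not from the report):

```python
import numpy as np

def variance_reduction(before, after):
    """Percent reduction in travel-time residual variance
    (e.g. residuals under ak135 vs. after RSTT tomography)."""
    return 100.0 * (1.0 - np.var(after) / np.var(before))

# Synthetic example: scaling residuals by sqrt(0.68) cuts variance by
# exactly 32%, matching the reduction reported for the validation set.
rng = np.random.default_rng(0)
before = rng.normal(0.0, 1.5, size=10_000)        # synthetic Pn residuals, seconds
after = before * np.sqrt(0.68)
print(round(variance_reduction(before, after)))   # 32
```

Because the synthetic "after" residuals are an exact rescaling, the reduction is exact; real tomography residuals would of course change shape as well as scale.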

Assessment of Tidal Energy Removal Impacts on Physical Systems: Development of MHK Module and Analysis of Effects on Hydrodynamics

Description: In this report we describe (1) the development, test, and validation of the marine hydrokinetic energy scheme in a three-dimensional coastal ocean model (FVCOM); and (2) the sensitivity analysis of effects of marine hydrokinetic energy configurations on power extraction and volume flux in a coastal bay. Submittal of this report completes the work on Task 2.1.2, Effects of Physical Systems, Subtask 2.1.2.1, Hydrodynamics and Subtask 2.1.2.3, Screening Analysis, for fiscal year 2011 of the Environmental Effects of Marine and Hydrokinetic Energy project.
Date: September 1, 2011
Creator: Yang, Zhaoqing & Wang, Taiping
Partner: UNT Libraries Government Documents Department

Validation of International Atomic Energy Agency Equipment Performance Requirements

Description: Performance requirements and testing protocols are needed to ensure that equipment used by the International Atomic Energy Agency (IAEA) is reliable. Oak Ridge National Laboratory (ORNL), through the US Support Program, tested equipment to validate performance requirements protocols used by the IAEA for the subject equipment categories. Performance protocol validation tests were performed in the Environmental Effects Laboratory in the categories for battery, DC power supply, and uninterruptible power supply (UPS). Specific test results for each piece of equipment used in the validation process are included in this report.
Date: February 17, 2004
Creator: Chiaro, PJ
Partner: UNT Libraries Government Documents Department

Vehicle to Grid Communication Standards Development, Testing and Validation - Status Report

Description: In the US, more than 10,000 electric vehicles (EVs) have been delivered to consumers during the first three quarters of 2011. A large majority of these vehicles are battery electric, often requiring 220-volt charging. Though vehicle manufacturers and charging station manufacturers have provided consumers with options for charging preferences, there are no existing communications between consumers and utilities to manage the charging demand. There is also wide variation among manufacturers in their approach to supporting vehicle charging. There are in-vehicle networks, charging station networks, and utility networks, each using cellular, Wi-Fi, ZigBee, or other proprietary communication technology, with no standards currently available for interoperability. The current situation of ad hoc solutions is a major barrier to the wide adoption of electric vehicles. SAE, the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC), ANSI, the National Institute of Standards and Technology (NIST), and several industry organizations are working toward the development of interoperability standards. PNNL has participated in the development and testing of these standards in an effort to accelerate the adoption and development of communication modules.
Date: September 1, 2011
Creator: Gowri, Krishnan; Pratt, Richard M.; Tuffner, Francis K. & Kintner-Meyer, Michael CW
Partner: UNT Libraries Government Documents Department

On Attaching a Wire to a Triangulated Surface

Description: There have been many papers that have focused on the attachment of wires to surfaces. The focus of this paper will be on wires connected to arbitrarily shaped surfaces, a body that may be modeled with triangles as described in [1]. The basis function for the wire-to-surface junction is constructed by building the 1/r variation of the surface current near the junction into the surface current. In the following we summarize junction bases as currently used. In the presentation we consider their numerical implementation, examine alternative formulations, and review validation studies that prove the approach is robust with respect to wire orientation and surface geometry at the junction.
Date: January 28, 2002
Creator: Champagne, N J; Johnson, W A & Wilton, D R
Partner: UNT Libraries Government Documents Department

Some Examples of the Application and Validation of the NUFT Subsurface Flow and Transport Code

Description: This report was written as partial fulfillment of a subcontract from DOD/DOE Strategic Environmental Research and Development Program (SERDP) as part of a project directed by the U.S. Army Engineer Research and Development Center, Waterways Experiment Station (WES), Vicksburg, Mississippi. The report documents examples of field validation of the Non-isothermal Unsaturated-saturated Flow and Transport model (NUFT) code for environmental remediation, with emphasis on soil vapor extraction, and describes some of the modifications needed to integrate the code into the DOD Groundwater Modeling System (GMS, 2000). Note that this report highlights only a subset of the full capabilities of the NUFT code.
Date: August 1, 2001
Creator: Nitao, J J
Partner: UNT Libraries Government Documents Department

Validation, Uncertainty, and Quantitative Reliability at Confidence (QRC)

Description: This paper presents a summary of our methodology for Verification and Validation and Uncertainty Quantification. A graded scale methodology is presented and related to other concepts in the literature. We describe the critical nature of quantified Verification and Validation with Uncertainty Quantification at specified Confidence levels in evaluating system certification status. Only after Verification and Validation has contributed to Uncertainty Quantification at specified confidence can rational tradeoffs of various scenarios be made. Verification and Validation methods for various scenarios and issues are applied in assessments of Quantified Reliability at Confidence, and we summarize briefly how this can lead to a Value Engineering methodology for investment strategy.
Date: December 6, 2002
Creator: Logan, R W & Nitta, C K
Partner: UNT Libraries Government Documents Department

Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project: Progress Update

Description: Presentation outlining the progress of DOE's Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project, prepared for the 2006 National Hydrogen Association Meeting.
Date: January 1, 2006
Creator: Wipke, K.; Welch, C.; Thomas, H.; Sprik, S.; Gronich, S.; Garbak, J. et al.
Partner: UNT Libraries Government Documents Department

Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project: Project Overview and Fall 2006 Results

Description: This presentation on NREL's Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project was given by Keith Wipke at the ZEV Technology Symposium on September 15, 2006.
Date: September 1, 2006
Creator: Wipke, K.; Welch, C.; Thomas, H.; Sprik, S.; Gronich, S.; Garbak, J. et al.
Partner: UNT Libraries Government Documents Department

Refinements to the Boolean approach to automatic data editing

Description: Automatic data editing consists of three components: identification of erroneous records, identification of the most likely erroneous fields within an erroneous record (the fields to impute), and assignment of acceptable values to failing records. Moreover, the types of data considered naturally fall into three categories: coded (categorical) data, continuous data, and mixed data (both coded and continuous). For the case of coded data, a natural way to approach automatic data editing is commonly referred to as the Boolean approach, first developed by Fellegi and Holt. For the fields to impute problem, central to the operation of the Fellegi-Holt approach is the explicit recognition of certain implied edits; Fellegi and Holt originally required a complete set of edits, and their algorithm to generate this complete set has occasionally had the distinct disadvantage of failing to converge within reasonable time. The primary result of this paper is an algorithm that significantly prunes the Fellegi-Holt edit generation process yet nonetheless generates a sufficient collection of implied edits adequate for the solution of the fields to impute problem. 3 figures.
Date: September 1, 1980
Creator: Liepins, G.E.
Partner: UNT Libraries Government Documents Department
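The fields-to-impute step this abstract refers to can be sketched for coded data as a small set-cover search: find the fewest fields whose revision can clear every failed edit. A minimal illustration with a hypothetical edit encoding (dicts mapping each field to its jointly failing value set); note that without the complete set of implied edits Fellegi and Holt require, the chosen fields are not guaranteed to admit a consistent imputation:

```python
from itertools import combinations

# An "edit" names a forbidden combination: a dict mapping field -> set of
# values. A record FAILS the edit when every listed field takes a value
# in the edit's set (illustrative encoding, not from the paper).
def failed_edits(record, edits):
    return [e for e in edits if all(record[f] in vals for f, vals in e.items())]

def fields_to_impute(record, edits):
    """Smallest set of fields whose revision touches every failed edit
    (brute-force set cover; fine for the handful of fields sketched here)."""
    failing = failed_edits(record, edits)
    if not failing:
        return set()
    involved = sorted({f for e in failing for f in e})
    for k in range(1, len(involved) + 1):
        for combo in combinations(involved, k):
            if all(any(f in combo for f in e) for e in failing):
                return set(combo)

# Classic Fellegi-Holt example: a child cannot be married.
edits = [{"age": {"child"}, "marital": {"married"}}]
print(fields_to_impute({"age": "child", "marital": "married"}, edits))  # {'age'}
```

The brute-force search is exponential in the number of involved fields; the pruned edit-generation algorithm the paper describes is what makes the approach practical on real questionnaires.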

Rigorous, systematic approach to automatic data editing and its statistical basis

Description: Automatic data editing is the computerized identification and (optional) correction of data errors. These techniques can provide error statistics that indicate the frequency of various types of data errors, diagnostic information that aids in identifying inadequacies in the data collection system, and a clean data base appropriate for use in further decision making, in modeling, and for inferential purposes. However, before these numerous benefits can be fully realized, certain research problems need to be resolved, and the linkage between statistical error analysis and extreme-value programming needs to be carefully determined. The linkage is provided here for the special case that certain independence and symmetry conditions obtain; also provided are rigorous proofs of results central to the functioning of the Boolean approach to automatic data editing of coded (categorical) data. In particular, sufficient collections of edits are defined, and it is shown that for a fixed objective function the solution to the fields to impute problem is obtainable simply from knowing which edits of the sufficient collection are failed, and this solution is invariant of the particular sufficient collection of edits identified. Similarly, disjoint-sufficient collections of edits are defined, and it is shown that, if the objective function of the fields to impute problem is determined by what Freund and Hartley call the number of involvements in unsatisfied consistency checks, then the objective function will be independent of the disjoint-sufficient collection of edits used.
Date: January 1, 1981
Creator: Liepins, G.E.
Partner: UNT Libraries Government Documents Department

Benchmarking Heavy Ion Transport Codes FLUKA, HETC-HEDS, MARS15, MCNPX, and PHITS

Description: Powerful accelerators such as spallation neutron sources, muon-collider/neutrino facilities, and rare isotope beam facilities must be designed with the consideration that they handle the beam power reliably and safely, and they must be optimized to yield maximum performance relative to their design requirements. The simulation codes used for design purposes must produce reliable results. If not, component and facility designs can become costly, have limited lifetime and usefulness, and could even be unsafe. The objective of this proposal is to assess the performance of the currently available codes – PHITS, FLUKA, MARS15, MCNPX, and HETC-HEDS – that could be used for design simulations involving heavy ion transport. We plan to assess their performance by performing simulations and comparing results against experimental data of benchmark quality. Quantitative knowledge of the biases and the uncertainties of the simulations is essential, as this potentially impacts the safe, reliable, and cost-effective design of any future radioactive ion beam facility. Further benchmarking of heavy-ion transport codes was one of the actions recommended in the “Report of the 2003 RIA R&D Workshop”.
Date: June 7, 2013
Creator: Ronningen, Reginald Martin; Remec, Igor & Heilbronn, Lawrence H.
Partner: UNT Libraries Government Documents Department

CdTe Feedstock Development and Validation: Cooperative Research and Development Final Report, CRADA Number CRD-08-00280

Description: The goal of this work was to evaluate different CdTe feedstock formulations (feedstock provided by Redlen) to determine whether they would significantly improve CdTe performance, with the ancillary benefit of learning whether changes in feedstock would affect CdTe cell processing and possibly cell reliability. The feedstock work also included attempts to intentionally dope the CdTe with pre-selected elements.
Date: May 1, 2011
Creator: Albin, D.
Partner: UNT Libraries Government Documents Department

Post-processing V&V Level II ASC Milestone (2843) results.

Description: The 9/30/2008 ASC Level 2 Post-Processing V&V Milestone (Milestone 2843) contains functionality required by the user community for certain verification and validation tasks. These capabilities include fragment detection from CTH simulation data, fragment characterization and analysis, and fragment sorting and display operations. The capabilities were tested extensively both on sample and actual simulations. In addition, a number of stretch criteria were met including a comparison between simulated and test data, and the ability to output each fragment as an individual geometric file.
Date: October 1, 2008
Creator: Karelitz, David B.; Ice, Lisa G.; Wilke, Jason; Moreland, Kenneth D. & Attaway, Stephen W.
Partner: UNT Libraries Government Documents Department

Relation of validation experiments to applications.

Description: Computational and mathematical models are developed in engineering to represent the behavior of physical systems to various system inputs and conditions. These models are often used to predict at other conditions, rather than to just reproduce the behavior of data obtained at the experimental conditions. For example, the boundary or initial conditions, time of prediction, geometry, material properties, and other model parameters can be different at test conditions than those for an anticipated application of a model. Situations for which the conditions may differ include those for which (1) one is in the design phase and a prototype of the system has not been constructed and tested under the anticipated conditions, (2) only one version of a final system can be built and destructive testing is not feasible, or (3) the anticipated design conditions are variable and one cannot easily reproduce the range of conditions with a limited number of carefully controlled experiments. Because data from these supporting experiments have value in model validation, even if the model was tested at different conditions than an anticipated application, methodology is required to evaluate the ability of the validation experiments to resolve the critical behavior for the anticipated application. The methodology presented uses models for the validation experiments and a model for the application to address how well the validation experiments resolve the application. More specifically, the methodology investigates the tradeoff that exists between the uncertainty (variability) in the behavior of the resolved critical variables for the anticipated application and the ability of the validation experiments to resolve this behavior. The important features of this approach are demonstrated through simple linear and non-linear heat conduction examples.
Date: February 1, 2009
Creator: Hamilton, James R. (New Mexico State University, Las Cruces, NM) & Hills, Richard Guy
Partner: UNT Libraries Government Documents Department

Demonstration and Validation Assets: User Manual Development

Description: This report documents the development of a database-supported user manual for DEMVAL assets in the NSTI area of operations and focuses on providing comprehensive user information on DEMVAL assets serving businesses with national security technology applications in southern New Mexico. The DEMVAL asset program is being developed as part of the NSPP, funded by both the Department of Energy (DOE) and NNSA. This report describes the development of a comprehensive user manual system for delivering indexed DEMVAL asset information, to be used in marketing and visibility materials and by NSTI clients, prospective clients, stakeholders, and any person or organization seeking it. The data about area DEMVAL asset providers are organized in an SQL database with an updatable application structure that optimizes ease of access and customizes search capability for the user.
Date: June 30, 2008
Partner: UNT Libraries Government Documents Department

Using Patterns for Multivariate Monitoring and Feedback Control of Linear Accelerator Performance: Proof-of-Concept Research

Description: The report discusses preliminary proof-of-concept research for using the Advanced Data Validation and Verification System (ADVVS), a new INEEL software package, to add validation and verification and multivariate feedback control to the operation of non-destructive analysis (NDA) equipment. The software is based on human cognition, the recognition of patterns and changes in patterns in time-related data. The first project applied ADVVS to monitor operations of a selectable energy linear electron accelerator, and showed how the software recognizes in real time any deviations from the optimal tune of the machine. The second project extended the software method to provide model-based multivariate feedback control for the same linear electron accelerator. The projects successfully demonstrated proof-of-concept for the applications and focused attention on the common application of intelligent information processing techniques.
Date: April 1, 2002
Creator: Cordes, Gail Adele; Van Ausdeln, Leo Anthony & Velasquez, Maria Elena
Partner: UNT Libraries Government Documents Department

Real-World Hydrogen Technology Validation: Preprint

Description: The Department of Energy, the Department of Defense's Defense Logistics Agency, and the Department of Transportation's Federal Transit Administration have funded learning demonstrations and early market deployments to provide insight into applications of hydrogen technologies on the road, in the warehouse, and as stationary power. NREL's analyses validate the technology in real-world applications, reveal the status of the technology, and facilitate the development of hydrogen and fuel cell technologies, manufacturing, and operations. This paper presents the maintenance, safety, and operation data of fuel cells in multiple applications with the reported incidents, near misses, and frequencies. NREL has analyzed records of more than 225,000 kilograms of hydrogen that have been dispensed through more than 108,000 hydrogen fills with an excellent safety record.
Date: March 1, 2012
Creator: Sprik, S.; Kurtz, J.; Wipke, K.; Ramsden, T.; Ainscough, C.; Eudy, L. et al.
Partner: UNT Libraries Government Documents Department

Post-processing V&V level II ASC milestone (2360) results.

Description: The 9/30/2007 ASC Level 2 Post-Processing V&V Milestone (Milestone 2360) contains functionality required by the user community for certain verification and validation tasks. These capabilities include loading of edge and face data on an Exodus mesh, run-time computation of an exact solution to a verification problem, delivery of results data from the server to the client, computation of an integral-based error metric, simultaneous loading of simulation and test data, and comparison of that data using visual and quantitative methods. The capabilities were tested extensively by performing a typical ALEGRA HEDP verification task. In addition, a number of stretch criteria were met including completion of a verification task on a 13 million element mesh.
Date: September 1, 2007
Creator: Chavez, Elmer; Karelitz, David B.; Brunner, Thomas A.; Trucano, Timothy Guy; Moreland, Kenneth D.; Weirs, V. Gregory et al.
Partner: UNT Libraries Government Documents Department

CFD INVESTIGATION OF EXPERIMENTAL DATA PROPOSED TO BE A VALIDATION DATA SET

Description: The U. S. Department of Energy (DOE) is currently supporting the development of a next generation nuclear plant (NGNP). The NGNP is based on the very high temperature reactor (VHTR), which is a Gen. IV gas-cooled reactor concept that will use helium as the coolant. Computational fluid dynamics (CFD) calculations are to be employed to estimate the details of the flow and heat transfer in the lower plenum where the heated coolant empties before exiting the reactor vessel. While it is expected that CFD will be able to provide detailed information about the flow, it must be validated using experimental data. Detailed experimental data have been taken in the INL’s matched index of refraction (MIR) facility of a scaled model of a section of the prismatic VHTR lower plenum. The present article examines the data that were taken to determine the suitability of such data to be a validation data set for CFD calculations. CFD calculations were made to compare with the experimental data to explore potential issues and make recommendations regarding the MIR data.
Date: July 1, 2009
Creator: Johnson, Richard W.
Partner: UNT Libraries Government Documents Department