
A User's Guide for RAPID, Reduction Algorithms for the Presentation of Incremental Fire Data

Description: The voluminous amount of data that can be collected by automatic data acquisition systems during large-scale fire tests requires the use of a digital computer for data reduction. RAPID is a stand-alone program specifically designed to convert raw instrument voltages collected during such tests into meaningful units. The reduced data can also be used alone or in combination to obtain quantities that require more than minimal data reduction. The program accepts data from a user-defined data acquisition system and can check the correctness of the data included. Through input data provided by the user, the data are converted into meaningful scientific units and can then be presented in tabular or printer-plot form, or stored for further processing. This user's guide provides detailed instructions for the use of the program.
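The conversion the abstract describes can be sketched as follows. This is a minimal illustration, not RAPID code: it assumes a simple linear per-channel calibration (slope and offset supplied by the user), which is one common way raw voltages are mapped to scientific units.

```python
# Illustrative sketch (not RAPID itself): convert raw instrument
# voltages to scientific units using user-supplied linear
# calibration coefficients for one channel.

def convert_channel(voltages, slope, offset):
    """Apply a linear calibration: unit = slope * volts + offset."""
    return [slope * v + offset for v in voltages]

# Example: a hypothetical temperature channel calibrated to Celsius.
raw = [0.0, 1.0, 2.0]  # volts
temps = convert_channel(raw, slope=250.0, offset=20.0)
print(temps)  # [20.0, 270.0, 520.0]
```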
Date: August 1986
Creator: Breese, J. Newton & Peacock, Richard D.
Partner: UNT Libraries Government Documents Department

Gemini Series Experiment Data Reduction and Storage Techniques

Description: The presentation covers data formats expected from Gemini experiments; data quick look vs. in-depth analysis; iPDV object-oriented data storage; iPDV's traceability of analysis results; optimizing object memory usage in iPDV; and long-term archival of data objects by iPDV.
Date: November 1, 2011
Creator: Berglin, R. A.
Partner: UNT Libraries Government Documents Department

A Generalized Computer Program for Flowsheet Calculation and Process Data Reduction

Description: Report issued by the Argonne National Laboratory discussing the PACER-65 computer program. As stated in the summary, the program "has been developed and utilized for flow sheet calculations and process-data reduction. PACER-65 is an executive program in which material- and energy-balance equations, conversion factors, etc., for each processing step are described by separate subroutines" (p. 5). This report includes tables and illustrations.
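The executive-plus-subroutines structure the summary quotes can be sketched as below. This is a hypothetical illustration, not PACER-65 code: each processing step is a separate routine, and an executive routine calls them in flowsheet order (mass balance only, for brevity).

```python
# Illustrative sketch (not PACER-65): an executive routine drives
# separate per-step subroutines, each enforcing its own balance.

def mixer(streams):
    """Combine inlet streams into one outlet (mass balance only)."""
    return {"mass": sum(s["mass"] for s in streams)}

def splitter(stream, fraction):
    """Split a stream into two by a fixed mass fraction."""
    a = {"mass": stream["mass"] * fraction}
    b = {"mass": stream["mass"] - a["mass"]}
    return a, b

def executive():
    """Executive: call each processing-step subroutine in order."""
    feed_a = {"mass": 100.0}
    feed_b = {"mass": 50.0}
    mixed = mixer([feed_a, feed_b])
    product, recycle = splitter(mixed, fraction=0.8)
    return product, recycle

product, recycle = executive()
print(product["mass"], recycle["mass"])  # 120.0 30.0
```

Keeping each step in its own routine, as the report describes, lets a flowsheet be rearranged by editing only the executive.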
Date: April 1966
Creator: Koppel, L. B.; Alfredson, P. G.; Anastasia, L. J.; Knudsen, I. E. & Vogel, G. J.
Partner: UNT Libraries Government Documents Department

Preliminary analysis of the International Data Centre pipeline.

Description: The International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization relies on automatic data processing as the first step in identifying seismic events from seismic waveform data. However, more than half of the automatically identified seismic events are eliminated by IDC analysts. Here, an IDC dataset is analyzed to determine if the number of automatically generated false positives could be reduced. Data that could be used to distinguish false positives from analyst-accepted seismic events include the number of stations, the number of phases, the signal-to-noise ratio, and the pick error. An empirical method is devised to determine whether an automatically identified seismic event is acceptable, and the method is found to identify a significant number of the false positives in IDC data. This work could help reduce seismic analyst workload and could help improve the calibration of seismic monitoring stations. This work could also be extended to address identification of seismic events missed by automatic processing.
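A screen of the kind the abstract describes could look like the sketch below. It is hypothetical: the features match those listed (station count, phase count, signal-to-noise ratio, pick error), but the threshold values are illustrative, not those derived in the report.

```python
# Hypothetical rule-based screen over the features named in the
# abstract; thresholds are illustrative placeholders.

def likely_false_positive(event):
    """Flag an automatically built event as a probable false positive."""
    return (event["n_stations"] < 3
            or event["n_phases"] < 4
            or event["snr"] < 2.0
            or event["pick_error"] > 1.5)

events = [
    {"n_stations": 5, "n_phases": 8, "snr": 6.0, "pick_error": 0.4},
    {"n_stations": 2, "n_phases": 3, "snr": 1.2, "pick_error": 2.1},
]
flags = [likely_false_positive(e) for e in events]
print(flags)  # [False, True]
```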
Date: July 1, 2009
Creator: Gauthier, John Henry
Partner: UNT Libraries Government Documents Department