Search Results

Analysis of V-G records from the SNJ-4 airplane

Description: Report discusses an attempt to adapt a method of analysis of V-G data to predict the occurrences of large values of airspeed and acceleration from an SNJ-4 airplane; the data are presented as "flight envelopes". The flight envelopes "predict that, on average, in a stated number of flight hours, one airspeed, one positive acceleration, and one negative acceleration will exceed the envelope with equal probability of the accelerations being experienced at any airspeed" (from Discussion). The analysis method shows promise for predicting flight loads and speeds for airplanes on which loads due to maneuvers predominate.
Date: December 1945
Creator: Wilkerson, M. & Bennett, S. A.
Partner: UNT Libraries Government Documents Department
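
The envelope method described above is, at its core, an extreme-value prediction. As a rough modern illustration of that idea only, and not the report's actual 1945 procedure, a Gumbel distribution can be fitted to per-interval maxima to estimate the level expected to be exceeded once, on average, in a stated number of flight hours; all numbers below are synthetic.

    # Illustrative sketch only: estimate the acceleration level exceeded on
    # average once in T flight hours, assuming per-interval maxima follow a
    # Gumbel law. This is NOT the report's exact method.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical maxima of positive acceleration (g), one per record interval.
    max_accel = rng.gumbel(loc=2.5, scale=0.4, size=200)

    loc, scale = stats.gumbel_r.fit(max_accel)

    T = 1000  # stated number of flight hours
    # Level exceeded on average once in T hours: the (1 - 1/T) quantile.
    level = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc, scale)
    print(f"Acceleration expected to be exceeded once in {T} h: {level:.2f} g")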

Occam's Razor and Petascale Visual Data Analysis

Description: One of the central challenges facing visualization research is how to effectively enable knowledge discovery. An effective approach will likely combine application architectures that are capable of running on today's largest platforms to address the challenges posed by large data with visual data analysis techniques that help find, represent, and effectively convey scientifically interesting features and phenomena.
Date: June 12, 2009
Creator: Bethel, E. Wes; Johnson, Chris; Ahern, Sean; Bell, John; Bremer, Peer-Timo; Childs, Hank et al.
Partner: UNT Libraries Government Documents Department

Long-Term X-Ray Variability of Typical Active Galactic Nuclei in the Distant Universe

Description: This article discusses the long-term (≈15 years, observed frame) X-ray variability analyses of the 68 brightest radio-quiet active galactic nuclei (AGNs) in the 6 Ms Chandra Deep Field-South survey.
Date: November 3, 2016
Creator: Yang, G.; Brandt, William Nielsen; Luo, Bin; Xue, Yongquan; Bauer, Franz E.; Sun, M. et al.
Partner: UNT College of Arts and Sciences

ATLAS Metadata Task Force

Description: This document provides an overview of the metadata, which are needed to characterize ATLAS event data at different levels (a complete run, data streams within a run, luminosity blocks within a run, individual events).
Date: April 4, 2007
Creator: Collaboration, ATLAS; Costanzo, D.; Cranshaw, J.; Gadomski, S.; Jezequel, S.; Klimentov, A. et al.
Partner: UNT Libraries Government Documents Department

SQL Data Analysis Procedures to Create Aggregate and Candidate Record Groups on Sample of Decomposed MARC Records Phase 1 Testing

Description: This document describes the data analysis procedures developed to create the Aggregate and Candidate Record Groups using SQL statements. This is the preliminary version of these procedures, tested and validated on a sample of decomposed MARC records. (For a description of how the MARC records were decomposed, see the Z-Interop document, Decomposing MARC 21 Records for Analysis.) A subsequent version may be necessary as the authors move to the procedures for the entire file of decomposed records.
Date: October 14, 2001
Creator: Yoon, JungWon & Moen, William E.
Partner: UNT College of Information
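
A minimal sketch of the grouping idea this record describes, written for Python's sqlite3; the table and column names are invented for illustration and do not reflect the actual Z-Interop schema. Decomposed field values are normalized into a match key, and keys shared by more than one record yield candidate groups.

    # Hedged sketch: grouping decomposed MARC records on a normalized match
    # key. Table and column names are hypothetical, not the Z-Interop schema.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE marc_fields (record_id TEXT, tag TEXT, value TEXT)")
    con.executemany(
        "INSERT INTO marc_fields VALUES (?, ?, ?)",
        [("r1", "245", "data analysis"), ("r2", "245", "Data Analysis "),
         ("r3", "245", "something else")])

    # Candidate groups: records whose normalized title field coincides.
    rows = con.execute("""
        SELECT LOWER(TRIM(value)) AS match_key,
               COUNT(*) AS members,
               GROUP_CONCAT(record_id) AS record_ids
        FROM marc_fields
        WHERE tag = '245'
        GROUP BY match_key
        HAVING COUNT(*) > 1""").fetchall()
    print(rows)  # e.g. [('data analysis', 2, 'r1,r2')]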

Representing regional P/S discriminants for event identification: a comparison of distance corrections, path parameter regressions, cap-averaging and kriging

Description: Short-period regional P/S amplitude ratios hold much promise for discriminating low magnitude explosions from earthquakes in a Comprehensive Test Ban Treaty monitoring context. However, propagation effects lead to variability in regional phase amplitudes that, if not accounted for, can reduce or eliminate the ability of P/S ratios to discriminate the seismic source. In this study, several representations of short-period regional P/S amplitude ratios are compared in order to determine which methodology best accounts for the effect of heterogeneous structure on P/S amplitudes. These methodologies are: 1) distance corrections, including azimuthal subdivision of the data; 2) path-specific crustal waveguide parameter regressions; 3) cap-averaging (running-mean smoothing); and 4) kriging. The "predictability" of each method is established by cross-validation (leave-one-out) analysis. We apply these techniques to represent Pn/Lg, Pg/Lg and Pn/Sn observations in three frequency bands (0.75-6.0 Hz) at station ABKT (Alibek, Turkmenistan), site of a primary seismic station of the International Monitoring System (IMS). Paths to ABKT sample diverse crustal structures (e.g. various topographic, sedimentary and geologic structures), leading to great variability in the observed P/S amplitude ratios. Subdivision of the data by back-azimuth leads to stronger distance trends than that for the entire data set. This observation alone indicates that path propagation effects due to laterally varying structure are important for the P/S ratios recorded at ABKT. For these data to be useful for isolating source characteristics, the scatter needs to be reduced by accounting for the path effects, and the resulting P/S ratio distribution needs to be Gaussian for spatial interpolation and discrimination strategies to be most effective. Each method reduces the scatter of the P/S ratios with varying degrees of success; however, kriging has the distinct advantages of providing the greatest variance reduction and a continuous correction surface with an estimate of the model uncertainty. The largest scatter ...
Date: June 18, 1998
Creator: Myers, S C; Rodgers, A J; Schultz, C A & Walter, W R
Partner: UNT Libraries Government Documents Department
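
A minimal sketch of the leave-one-out scoring this abstract uses to rank methods, run here on synthetic data: hold out each path, predict its log P/S ratio from the remaining observations, and compare the scatter of the held-out residuals. The simple running-mean predictor below stands in for the paper's distance-correction, regression, cap-averaging, and kriging models.

    # Hedged sketch of cross-validation (leave-one-out) "predictability"
    # scoring; the predictor is a cap-averaging-style running mean, used
    # only as a stand-in for the four methods compared in the paper.
    import numpy as np

    rng = np.random.default_rng(1)
    dist = rng.uniform(200, 1500, size=80)          # epicentral distance (km)
    log_ps = 0.001 * dist + rng.normal(0, 0.2, 80)  # synthetic log10 P/S ratios

    def predict_running_mean(train_x, train_y, x0, half_width=150.0):
        """Mean of training ratios within a distance window around x0."""
        mask = np.abs(train_x - x0) < half_width
        return train_y[mask].mean() if mask.any() else train_y.mean()

    residuals = []
    for i in range(len(dist)):
        keep = np.arange(len(dist)) != i
        pred = predict_running_mean(dist[keep], log_ps[keep], dist[i])
        residuals.append(log_ps[i] - pred)

    # Lower held-out scatter means better predictability for the method.
    print("LOO residual std:", np.std(residuals))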

Type Ia Supernova Intrinsic Magnitude Dispersion and the Fitting of Cosmological Parameters

Description: I present an analysis for fitting cosmological parameters from a Hubble Diagram of a standard candle with unknown intrinsic magnitude dispersion. The dispersion is determined from the data themselves, simultaneously with the cosmological parameters. This contrasts with the strategies used to date. The advantages of the presented analysis are that it is done in a single fit (it is not iterative), it provides a statistically founded and unbiased estimate of the intrinsic dispersion, and its cosmological-parameter uncertainties account for the intrinsic dispersion uncertainty. Applied to Type Ia supernovae, my strategy provides a statistical measure to test for sub-types and assess the significance of any magnitude corrections applied to the calibrated candle. Parameter bias and differences between likelihood distributions produced by the presented and currently-used fitters are negligibly small for existing and projected supernova data sets.
Date: December 10, 2010
Creator: Kim, Alex G
Partner: UNT Libraries Government Documents Department
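
A minimal sketch of the central idea above: the intrinsic dispersion enters the Gaussian likelihood as a free parameter, in both the chi-square denominator and the log-variance normalization, so the data constrain it simultaneously with the model parameters. The one-parameter "cosmology" here, a constant calibrated magnitude M, is a stand-in for the real distance-modulus model.

    # Hedged sketch: fit sigma_int and the model parameter M in one
    # likelihood. The log(var) normalization term is what keeps sigma_int
    # from running away, making the estimate statistically founded.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(2)
    sigma_meas = np.full(50, 0.10)                           # measurement errors
    mags = 19.3 + rng.normal(0, np.hypot(sigma_meas, 0.12))  # true sigma_int=0.12

    def neg_log_like(params):
        M, log_sig_int = params
        var = sigma_meas**2 + np.exp(log_sig_int) ** 2
        return 0.5 * np.sum((mags - M) ** 2 / var + np.log(2 * np.pi * var))

    res = minimize(neg_log_like, x0=[19.0, np.log(0.1)])
    print("M =", res.x[0], " sigma_int =", np.exp(res.x[1]))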

FastBit: Interactively Searching Massive Data

Description: As scientific instruments and computer simulations produce more and more data, the task of locating the essential information to gain insight becomes increasingly difficult. FastBit is an efficient software tool to address this challenge. In this article, we present a summary of the key underlying technologies, namely bitmap compression, encoding, and binning. Together these techniques enable FastBit to answer structured (SQL) queries orders of magnitude faster than popular database systems. To illustrate how FastBit is used in applications, we present three examples involving a high-energy physics experiment, a combustion simulation, and an accelerator simulation. In each case, FastBit significantly reduces the response time and enables interactive exploration on terabytes of data.
Date: June 23, 2009
Creator: Wu, Kesheng; Ahern, Sean; Bethel, E. Wes; Chen, Jacqueline; Childs, Hank; Cormier-Michel, Estelle et al.
Partner: UNT Libraries Government Documents Department
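
A toy sketch of the binned bitmap-indexing idea behind FastBit, not FastBit's actual implementation: one bitmap per value bin lets a range query be answered by OR-ing a few precomputed bitmaps instead of scanning the raw data. Real systems compress the bitmaps, e.g. with word-aligned run-length encoding.

    # Hedged sketch of binned bitmap indexing for a range query.
    import numpy as np

    values = np.random.default_rng(3).uniform(0, 100, size=1_000_000)
    edges = np.linspace(0, 100, 11)            # 10 equal-width bins
    bin_of = np.digitize(values, edges) - 1    # bin index per row

    # The "index": one boolean bitmap per bin (uncompressed here).
    bitmaps = [bin_of == b for b in range(10)]

    # Query 30 <= value < 60: union of bins 3, 4 and 5, no raw-data scan.
    hits = bitmaps[3] | bitmaps[4] | bitmaps[5]
    print("matching rows:", hits.sum())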

Visualization and Analysis of 3D Gene Expression Data

Description: Recent methods for extracting precise measurements of spatial gene expression patterns from three-dimensional (3D) image data open the way for new analysis of the complex gene regulatory networks controlling animal development. To support analysis of this novel and highly complex data we developed PointCloudXplore (PCX), an integrated visualization framework that supports dedicated multi-modal, physical and information visualization views along with algorithms to aid in analyzing the relationships between gene expression levels. Using PCX, we helped our science stakeholders to address many questions in 3D gene expression research, e.g., to objectively define spatial pattern boundaries and temporal profiles of genes and to analyze how mRNA patterns are controlled by their regulatory transcription factors.
Date: October 25, 2007
Creator: Bethel, E. Wes; Rubel, Oliver; Weber, Gunther H.; Hamann, Bernd & Hagen, Hans
Partner: UNT Libraries Government Documents Department

Planning for a program design for energy environmental analysis. Progress report, April 1, 1975--June 30, 1975

Description: The work reported in this second quarterly progress report has focused on completing the first generation approach to the use of operational gaming in a regional assessment study program, writing an interim report on the work accomplished, and making progress on the second generation approach, which attempts to quantify the mechanisms identified in the first generation approach. The interim report, provided as an addendum to this report, gives detailed descriptions of the first generation work. This report provides a brief summary of the initial work on the second generation approach as well as of the first generation approach. (auth)
Date: January 1, 1975
Creator: Denton, J C
Partner: UNT Libraries Government Documents Department

Data Analysis of Early Fuel Cell Market Demonstrations (Presentation)

Description: Presentation about early fuel cell markets, the National Renewable Energy Laboratory's Hydrogen Secure Data Center and its role in data analysis and demonstrations, composite data products, and results reported to multiple stakeholders.
Date: November 17, 2009
Creator: Kurtz, J.; Ramsden, T.; Wipke, K. & Sprik, S.
Partner: UNT Libraries Government Documents Department

NGNP Data Management and Analysis System Analysis and Web Delivery Capabilities

Description: Projects for the Very High Temperature Reactor Technology Development Office provide data in support of Nuclear Regulatory Commission licensing of the very high temperature reactor. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high-temperature and high-fluence environments. In addition, thermal-hydraulic experiments are conducted to validate codes used to assess reactor safety. The Very High Temperature Reactor Technology Development Office has established the NGNP Data Management and Analysis System (NDMAS) at the Idaho National Laboratory to ensure that very high temperature reactor data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and for data analysis to identify useful relationships among the measured quantities.
Date: September 1, 2010
Creator: Gentillon, Cynthia D.
Partner: UNT Libraries Government Documents Department

Interfacing interactive data analysis tools with the grid: The PPDG CS-11 activity

Description: For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's "remote access" technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics DataGrid project (www.ppdg.net) has recently embarked on an effort to "Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services." The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments (ALICE, ATLAS, BaBar, D0, CMS, JLab, STAR, others welcome), and to determine whether existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end-user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk summarizes what we know of requirements for analysis tools and grid services, and describes the identified areas where more development work is needed.
Date: October 9, 2002
Creator: Olson, Douglas L. & Perl, Joseph
Partner: UNT Libraries Government Documents Department

STANDARDIZATION OF CEBAF 12 GEV UPGRADE CAVITY TESTING

Description: The CEBAF 12 GeV upgrade project includes 80 new 7-cell cavities that will form 10 cryomodules. Each cavity underwent RF qualification at 2.07 K, consisting of a high-power accelerating-gradient test and an HOM survey, in Jefferson Lab's Vertical Testing Area (VTA) before cavity string assembly. In order to ensure consistently high quality data, updated cavity testing procedures and analyses were implemented and used by a group of VTA operators. For high-power tests, a cavity testing procedure was developed and used in conjunction with a LabVIEW program to collect the test data. Additionally, while the cavity was at 2.07 K, an HOM survey was performed using a network analyzer and a combination of Excel and Mathematica programs. Data analysis was standardized, and an online logbook, Pansophy, was used for data storage and mining. The Pansophy system allowed test results to be easily summarized and searched across all cavity tests. In this presentation, the CEBAF 12 GeV upgrade cavity testing procedure, the method for data analysis, and results reporting will be discussed.
Date: July 1, 2012
Creator: Bass, Tiffany; Davis, G.; Wilson, Christiana & Stirbet, Mircea
Partner: UNT Libraries Government Documents Department

Quantitative Visualization of ChIP-chip Data by Using Linked Views

Description: Most analyses of ChIP-chip in vivo DNA binding have focused on qualitative descriptions of whether genomic regions are bound or not. There is increasing evidence, however, that factors bind in a highly overlapping manner to the same genomic regions and that it is quantitative differences in occupancy on these commonly bound regions that are the critical determinants of the different biological specificity of factors. As a result, it is critical to have a tool to facilitate the quantitative visualization of differences between transcription factors and the genomic regions they bind to understand each factor's unique roles in the network. We have developed a framework which combines several visualizations via brushing-and-linking to allow the user to interactively analyze and explore in vivo DNA binding data of multiple transcription factors. We describe these visualization types and also provide a discussion of biological examples in this paper.
Date: November 5, 2010
Creator: Huang, Min-Yu; Weber, Gunther; Li, Xiao-Yong; Biggin, Mark & Hamann, Bernd
Partner: UNT Libraries Government Documents Department
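
A minimal sketch of the brushing-and-linking pattern this framework builds on; the class names are invented and this is not the authors' code. Views register with a shared selection model, and a brush performed in any one view refreshes every linked view.

    # Hedged sketch of brushing-and-linking as an observer pattern.
    class SelectionModel:
        def __init__(self):
            self.selected = set()
            self._views = []

        def register(self, view):
            self._views.append(view)

        def brush(self, ids):
            """Called by whichever view the user brushed in."""
            self.selected = set(ids)
            for view in self._views:
                view.refresh(self.selected)

    class View:
        def __init__(self, name):
            self.name = name

        def refresh(self, selected):
            print(f"{self.name}: highlighting {sorted(selected)}")

    model = SelectionModel()
    for v in (View("scatterplot"), View("parallel coords"), View("genome track")):
        model.register(v)

    model.brush({"region_12", "region_40"})  # one brush updates all views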

A Variation of the F-Test for Determining Statistical Relevance of Particular Parameters in EXAFS Fits

Description: A general problem when fitting EXAFS data is determining whether particular parameters are statistically significant. The F-test is an excellent way of determining relevancy in EXAFS because it only relies on the ratio of the fit residual of two possible models, and therefore the data errors approximately cancel. Although this test is widely used in crystallography (there, it is often called a 'Hamilton test') and has been properly applied to EXAFS data in the past, it is very rarely applied in EXAFS analysis. We have implemented a variation of the F-test adapted for EXAFS data analysis in the RSXAP analysis package, and demonstrate its applicability with a few examples, including determining whether a particular scattering shell is warranted, and differentiating between two possible species or two possible structures in a given shell.
Date: July 25, 2006
Creator: Downward, L.; Booth, C.H.; Lukens, W.W. & Bridges, F.
Partner: UNT Libraries Government Documents Department
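
The statistic itself is compact. The sketch below shows the general nested-model F-test this entry describes, not the RSXAP implementation: compare the fit residuals without and with the extra parameters, and convert the improvement into a p-value.

    # Hedged sketch of the F-test on two nested fits. chi2_0/nu_0 are the
    # residual and degrees of freedom without the extra scattering shell,
    # chi2_1/nu_1 with it.
    from scipy import stats

    def f_test(chi2_0, nu_0, chi2_1, nu_1):
        """p-value that the chi^2 improvement arose by chance."""
        f_stat = ((chi2_0 - chi2_1) / (nu_0 - nu_1)) / (chi2_1 / nu_1)
        return stats.f.sf(f_stat, nu_0 - nu_1, nu_1)

    # Hypothetical numbers: adding a shell (3 extra parameters) drops the
    # fit residual from 42.0 (20 d.o.f.) to 21.0 (17 d.o.f.).
    p = f_test(42.0, 20, 21.0, 17)
    print(f"p = {p:.3f}  (small p => the extra shell is warranted)")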

Phase II Corrective Action Investigation Plan for Corrective Action Units 101 and 102: Central and Western Pahute Mesa, Nevada Test Site, Nye County, Nevada, Revision 2

Description: This Phase II CAIP describes new work needed to potentially reduce uncertainty and achieve increased confidence in modeling results. This work includes data collection and data analysis to refine model assumptions, improve conceptual models of flow and transport in a complex hydrogeologic setting, and reduce parametric and structural uncertainty. The work was prioritized based on the potential to reduce model uncertainty and achieve an acceptable level of confidence in the model predictions for flow and transport, leading to model acceptance by NDEP and completion of the Phase II CAI stage of the UGTA strategy.
Date: July 1, 2009
Creator: Wurtz, Jeff
Partner: UNT Libraries Government Documents Department

FY08 LDRD Final Report LOCAL: Locality-Optimizing Caching Algorithms and Layouts

Description: This project investigated layout and compression techniques for large, unstructured simulation data to reduce bandwidth requirements and latency in simulation I/O and subsequent post-processing, e.g. data analysis and visualization. The main goal was to eliminate the data-transfer bottleneck - for example, from disk to memory and from central processing unit to graphics processing unit - through coherent data access and by trading underutilized compute power for effective bandwidth and storage. This was accomplished by (1) designing algorithms that both enforce and exploit compactness and locality in unstructured data, and (2) adapting offline computations to a novel stream processing framework that supports pipelining and low-latency sequential access to compressed data. This report summarizes the techniques developed and results achieved, and includes references to publications that elaborate on the technical details of these methods.
Date: February 27, 2009
Creator: Lindstrom, P
Partner: UNT Libraries Government Documents Department
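
A minimal sketch of one standard locality-preserving layout of the general kind investigated here, Morton (Z-order) indexing; this is a generic example, not necessarily the project's own algorithm. Interleaving the bits of the (x, y) cell coordinates keeps spatially nearby cells nearby in memory, which is what makes coherent access cheap.

    # Hedged sketch: Morton (Z-order) keys for a 2D grid layout.
    def morton2d(x: int, y: int, bits: int = 16) -> int:
        """Interleave the low `bits` bits of x and y into one Z-order key."""
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (2 * i)
            key |= ((y >> i) & 1) << (2 * i + 1)
        return key

    cells = [(x, y) for y in range(4) for x in range(4)]
    cells.sort(key=lambda c: morton2d(*c))
    print(cells)  # Z-order: (0,0) (1,0) (0,1) (1,1) (2,0) (3,0) ...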

Analysis of the January 2006 Pepper-Pot Experiments

Description: Between January 9 and 12, 2006, a series of experiments was performed on the DARHT-II injector to measure the beam's emittance. Among these experiments were pepper-pot measurements. This note describes the analysis of the data and our conclusions from the experiments.
Date: March 22, 2006
Creator: Westenskow, G; Chambers, F; Bieniosek, F & Henestroza, E
Partner: UNT Libraries Government Documents Department
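
A minimal sketch of the standard rms-emittance estimate that pepper-pot data supports, not necessarily this note's exact analysis chain: each hole yields beamlet positions x and divergences x', and the emittance follows from their second moments, eps_rms = sqrt(<x^2><x'^2> - <x x'>^2).

    # Hedged sketch: rms emittance from (position, divergence) samples.
    # The synthetic data stand in for beamlet centroids extracted from
    # pepper-pot images.
    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.normal(0, 2e-3, 500)             # position (m)
    xp = 0.1 * x + rng.normal(0, 1e-4, 500)  # divergence (rad), correlated

    def rms_emittance(x, xp):
        x, xp = x - x.mean(), xp - xp.mean()
        return np.sqrt(np.mean(x * x) * np.mean(xp * xp) - np.mean(x * xp) ** 2)

    print(f"rms emittance: {rms_emittance(x, xp):.3e} m rad")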

Barrier Immune Radio Communications for Demand Response

Description: Various wireless technologies were field-tested in a six-story laboratory building to identify wireless technologies that can scale for future demand response (DR) applications through very low node density, power consumption, and unit cost. Data analysis included analysis of the signal-to-noise ratio (SNR), packet loss, and link quality at varying power levels and node densities. The narrowband technologies performed well, penetrating the floors of the building with little loss and exhibiting better range than the wideband technology. 900 MHz provided full coverage at 1 W and substantially complete coverage at 500 mW at the test site. 900 MHz was able to provide full coverage at 100 mW with only one additional relay transmitter, and was the highest-performing technology in the study. 2.4 GHz could not provide full coverage with only a single transmitter at the highest power level tested (63 mW). However, substantially complete coverage was provided at 2.4 GHz at 63 mW with the addition of one repeater node.
Date: February 1, 2009
Creator: Rubinstein, Francis; Ghatikar, Girish; Granderson, Jessica; Haugen, Paul; Romero, Carlos & Watson, David
Partner: UNT Libraries Government Documents Department
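
A minimal sketch of the kind of per-power-level aggregation such a field test calls for; the record format (node, power level, SNR, received flag) is invented for illustration and is not taken from the report.

    # Hedged sketch: packet loss and mean SNR per transmit power level.
    from statistics import mean

    records = [
        {"node": "A", "power_mw": 1000, "snr_db": 22.5, "received": True},
        {"node": "A", "power_mw": 1000, "snr_db": 20.1, "received": True},
        {"node": "A", "power_mw": 63,   "snr_db": 6.0,  "received": False},
        {"node": "A", "power_mw": 63,   "snr_db": 9.4,  "received": True},
    ]

    for power in sorted({r["power_mw"] for r in records}, reverse=True):
        batch = [r for r in records if r["power_mw"] == power]
        loss = 100 * sum(not r["received"] for r in batch) / len(batch)
        snr = mean(r["snr_db"] for r in batch)
        print(f"{power} mW: mean SNR {snr:.1f} dB, packet loss {loss:.0f}%")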

Schematic model of nuclear spin excitations

Description: A simple model to estimate the strength of spin and nonspin collective states is presented. The model was inspired by early schematic models based on energy-weighted sum rules and is a useful tool for interpreting experimental data without the complexities of realistic microscopic calculations. The strength of collective states is calculated by assuming that a single collective state completely exhausts the energy-weighted sum rule.
Date: September 1, 1990
Creator: Boucher, Patrick
Partner: UNT Libraries Government Documents Department
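
A minimal statement of the schematic assumption in standard energy-weighted sum rule (EWSR) notation; the symbols are generic and not necessarily the paper's own. For a transition operator \hat{O}, the EWSR is

    m_1 = \sum_n (E_n - E_0)\,\bigl|\langle n|\hat{O}|0\rangle\bigr|^2 ,

and if a single collective state |c\rangle at excitation energy E_c exhausts m_1, its strength is fixed by the sum rule alone:

    B(\hat{O}) = \bigl|\langle c|\hat{O}|0\rangle\bigr|^2 = \frac{m_1}{E_c} .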