100 Matching Results

Search Results


1987 wet deposition temporal and spatial patterns in North America

Description: The focus of this report is on North American wet deposition temporal patterns from 1979 to 1987 and spatial patterns for 1987. The report investigates the patterns of annual precipitation-weighted average concentration and annual deposition for nine ion species: hydrogen, sulfate, nitrate, ammonium, calcium, chloride, sodium, potassium, and magnesium. Data are from the Acid Deposition System (ADS) for the statistical reporting of North American deposition data which includes the National Atmospheric Deposition Program/National Trends Network (NADP/NTN), the MAP3S precipitation chemistry network, the Utility Acid Precipitation Study Program (UAPSP), the Canadian Precipitation Monitoring Network (CAPMoN), and the daily and 4-weekly Acidic Precipitation in Ontario Study (APIOS-D and APIOS-C). Mosaic maps, based on surface estimation using kriging, display concentration and deposition spatial patterns of pH, hydrogen, sulfate, nitrate, ammonium, and calcium ion species for 1987 annual, winter, and summer periods. The temporal pattern analyses use a subset of 39 sites over a 9-year (1979--1987) period and an expanded subset of 140 sites with greater spatial coverage over a 6-year (1982--1987) period. 68 refs., 15 figs., 15 tabs.
Date: March 1, 1990
Creator: Simpson, J.C. & Olsen, A.R.
Partner: UNT Libraries Government Documents Department
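
The report's central quantity, the annual precipitation-weighted average concentration, is a standard calculation that can be sketched in a few lines of Python. The sample values and names below are illustrative, not taken from the report:

```python
def precip_weighted_mean(concentrations, precip_depths):
    """Precipitation-weighted average concentration: sum(c_i * p_i) / sum(p_i)."""
    total = sum(precip_depths)
    return sum(c * p for c, p in zip(concentrations, precip_depths)) / total

def annual_deposition(weighted_mean_conc, annual_precip):
    """Annual wet deposition = weighted-mean concentration x total precipitation."""
    return weighted_mean_conc * annual_precip

# Hypothetical sulfate concentrations (mg/L) and precipitation depths (mm)
# for four sampling periods; rain-heavy periods dominate the weighted mean.
conc = [2.0, 1.0, 3.0, 2.0]
precip = [10.0, 40.0, 10.0, 40.0]
mean_c = precip_weighted_mean(conc, precip)
dep = annual_deposition(mean_c, sum(precip))
```

Weighting by precipitation depth keeps brief, highly concentrated events from dominating the annual average the way a simple mean of sample concentrations would.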

3D optical sectioning with a new hyperspectral confocal fluorescence imaging system.

Description: A novel hyperspectral fluorescence microscope for high-resolution 3D optical sectioning of cells and other structures has been designed, constructed, and used to investigate a number of different problems. We have significantly extended new multivariate curve resolution (MCR) data analysis methods to deconvolve the hyperspectral image data and to rapidly extract quantitative 3D concentration distribution maps of all emitting species. The imaging system has many advantages over current confocal imaging systems including simultaneous monitoring of numerous highly overlapped fluorophores, immunity to autofluorescence or impurity fluorescence, enhanced sensitivity, and dramatically improved accuracy, reliability, and dynamic range. Efficient data compression in the spectral dimension has allowed personal computers to perform quantitative analysis of hyperspectral images of large size without loss of image quality. We have also developed and tested software to perform analysis of time resolved hyperspectral images using trilinear multivariate analysis methods. The new imaging system is an enabling technology for numerous applications including (1) 3D composition mapping analysis of multicomponent processes occurring during host-pathogen interactions, (2) monitoring microfluidic processes, (3) imaging of molecular motors and (4) understanding photosynthetic processes in wild type and mutant Synechocystis cyanobacteria.
Date: February 1, 2007
Creator: Nieman, Linda T.; Sinclair, Michael B.; Davidson, George S.; Van Benthem, Mark Hilary; Haaland, David Michael; Timlin, Jerilyn Ann et al.
Partner: UNT Libraries Government Documents Department

Adaptable Multivariate Calibration Models for Spectral Applications

Description: Multivariate calibration techniques have been used in a wide variety of spectroscopic situations. In many of these situations spectral variation can be partitioned into meaningful classes. For example, suppose that multiple spectra are obtained from each of a number of different objects wherein the level of the analyte of interest varies within each object over time. In such situations the total spectral variation observed across all measurements has two distinct general sources of variation: intra-object and inter-object. One might want to develop a global multivariate calibration model that predicts the analyte of interest accurately both within and across objects, including new objects not involved in developing the calibration model. However, this goal might be hard to realize if the inter-object spectral variation is complex and difficult to model. If the intra-object spectral variation is consistent across objects, an effective alternative approach might be to develop a generic intra-object model that can be adapted to each object separately. This paper contains recommendations for experimental protocols and data analysis in such situations. The approach is illustrated with an example involving the noninvasive measurement of glucose using near-infrared reflectance spectroscopy. Extensions to calibration maintenance and calibration transfer are discussed.
Date: December 20, 1999
Partner: UNT Libraries Government Documents Department
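
The generic intra-object model described above can be illustrated under a deliberately simplified assumption: a regression slope shared across objects plus an intercept adapted per object. The function and data are hypothetical, and the actual paper works with full multivariate spectra rather than a single predictor:

```python
def fit_shared_slope(objects):
    """Generic intra-object model: one slope estimated from pooled
    within-object (centered) data, plus an intercept per object.
    objects: list of (xs, ys) measurement pairs, one entry per object."""
    num = den = 0.0
    for xs, ys in objects:
        mx = sum(xs) / len(xs)
        my = sum(ys) / len(ys)
        num += sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den += sum((x - mx) ** 2 for x in xs)
    slope = num / den
    # Adapting the generic model to an object means estimating only its intercept
    intercepts = [sum(ys) / len(ys) - slope * sum(xs) / len(xs)
                  for xs, ys in objects]
    return slope, intercepts

# Two hypothetical objects with the same intra-object response but offset baselines
objects = [([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]),
           ([1.0, 2.0, 3.0], [12.0, 14.0, 16.0])]
slope, intercepts = fit_shared_slope(objects)
```

Centering within each object before pooling removes the inter-object variation, which is exactly what makes this approach attractive when that variation is complex and hard to model.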

Adaptive scanning probe microscopies

Description: This work is comprised of two major sections. In the first section the authors develop multivariate image classification techniques to distinguish and identify surface electronic species directly from multiple-bias scanning tunneling microscope (STM) images. Multiple measurements at each site are used to distinguish and categorize inequivalent electronic or atomic species on the surface via a computerized classification algorithm. Then, comparison with theory or other suitably chosen experimental data enables the identification of each class. They demonstrate the technique by analyzing dual-polarity constant-current topographs of the Ge(111) surface. Just two measurements, negative- and positive-bias topography height, permit pixels to be separated into seven different classes. Labeling four of the classes as adatoms, first-layer atoms, and two inequivalent rest-atom sites, they find excellent agreement with the c(2 x 8) structure. The remaining classes are associated with structural defects and contaminants. This work represents a first step toward developing a general electronic/chemical classification and identification tool for multivariate scanning probe microscopy imagery. In the second section they report measurements of the diffusion of Si dimers on the Si(001) surface at temperatures between room temperature and 128 C using a novel atom-tracking technique that can resolve every diffusion event. The atom tracker employs lateral-positioning feedback to lock the STM probe tip into position above selected atoms with sub-Angstrom precision. Once locked the STM tracks the position of the atoms as they migrate over the crystal surface. By tracking individual atoms directly, the ability of the instrument to measure dynamic events is increased by a factor of {approximately} 1,000 over conventional STM imaging techniques.
Date: February 1, 1997
Creator: Swartzentruber, B.S.; Bouchard, A.M. & Osbourn, G.C.
Partner: UNT Libraries Government Documents Department

An analysis of the impacts of economic incentive programs on commercial nuclear power plant operations and maintenance costs

Description: Operations and Maintenance (O and M) expenditures by nuclear power plant owner/operators represent a logical and vital link in considerations relating to plant safety and reliability. Since the determinants of O and M outlays are considerable and varied, the potential linkages to plant safety, both direct and indirect, can likewise be substantial. One significant issue before the US Nuclear Regulatory Commission is the impact, if any, on O and M spending from state programs that attempt to improve plant operating performance, and how and to what extent these programs may affect plant safety and pose public health risks. The purpose of this study is to examine the role and degree of impacts from state-promulgated economic incentive programs (EIPs) on plant O and M spending. A multivariate regression framework is specified, and the model is estimated on industry data over a five-year period, 1986--1990. Explanatory variables for the O and M spending model include plant characteristics, regulatory effects, financial strength factors, replacement power costs, and the performance incentive programs. EIPs are found to have statistically significant effects on plant O and M outlays, albeit small in relation to other factors. Moreover, the results indicate that the relatively financially weaker firms are more sensitive in their O and M spending to the presence of such programs. Formulations for linking spending behavior and EIPs with plant safety performance remain for future analysis.
Date: February 1, 1996
Creator: Kavanaugh, D.C.; Monroe, W.H. & Wood, R.S.
Partner: UNT Libraries Government Documents Department

An Application of Multivariate Statistical Analysis for Query-Driven Visualization

Description: Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.
Date: March 1, 2010
Creator: Gosink, Luke J.; Garth, Christoph; Anderson, John C.; Bethel, E. Wes & Joy, Kenneth I.
Partner: UNT Libraries Government Documents Department
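
A minimal sketch of the non-parametric distribution estimation step: a 1-D Gaussian kernel density estimate over the records returned by a query highlights the regions of the solution space where hits concentrate. This is a generic illustration, not the authors' implementation:

```python
import math

def gaussian_kde(samples, bandwidth):
    """Return a 1-D Gaussian kernel density estimate as a callable."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    def density(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return density

# Values of one constrained variable for records matching a hypothetical query
hits = [1.0, 1.2, 1.1, 3.0, 3.1]
f = gaussian_kde(hits, bandwidth=0.3)
# The estimate peaks near the clusters around 1.1 and 3.05 and is low in the
# gap between them, pointing the user at the statistically important regions.
```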

Assessing Measurement Equivalence of the English and Spanish Versions on an Employee Attitude Survey Using Multigroup Analysis in Structural Equation Modeling.

Description: The study utilized the covariance structure comparison methodology - Multigroup Analysis in Structural Equation Modeling - to evaluate measurement equivalence of the English and Spanish versions of an employee opinion survey. The concept of measurement equivalence was defined as consisting of four components: sample equivalence, semantic equivalence, conceptual equivalence, and scalar equivalence. The results revealed that the two language versions of the survey exhibited acceptable measurement equivalence across five survey dimensions: Communications, Supervision, Leadership, Job Content & Satisfaction, and Company Image & Commitment. Contrary to the study's second hypothesis, there was no meaningful difference in opinion scores between English-speaking and Spanish-speaking respondents on the latent construct of Job Content & Satisfaction.
Date: August 2003
Creator: Koulikov, Mikhail
Partner: UNT Libraries

Automated detection and analysis of particle beams in laser-plasma accelerator simulations

Description: Numerical simulations of laser-plasma wakefield (particle) accelerators model the acceleration of electrons trapped in plasma oscillations (wakes) left behind when an intense laser pulse propagates through the plasma. The goal of these simulations is to better understand the process involved in plasma wake generation and how electrons are trapped and accelerated by the wake. Such accelerators offer high accelerating gradients, potentially reducing the size and cost of new accelerators. One operating regime of interest is where a trapped subset of electrons loads the wake and forms an isolated group of accelerated particles with low spread in momentum and position, desirable characteristics for many applications. The electrons trapped in the wake may be accelerated to high energies, with the gradient in the wake reaching up to a gigaelectronvolt per centimeter. High-energy electron accelerators power radiation sources ranging from intense X-rays to terahertz, and are used in many applications including medical radiotherapy and imaging. To extract information from the simulation about the quality of the beam, a typical approach is to examine plots of the entire dataset, visually determining the adequate parameters necessary to select a subset of particles, which is then further analyzed. This procedure requires laborious examination of massive data sets over many time steps using several plots, a routine that is unfeasible for large data collections. Demand for automated analysis is growing along with the volume and size of simulations. Current 2D LWFA simulation datasets are typically between 1GB and 100GB in size, but simulations in 3D are of the order of TBs. The increase in the number of datasets and dataset sizes leads to a need for automatic routines to recognize particle patterns as particle bunches (beam of electrons) for subsequent analysis. 
Because of the growth in dataset size, the application of machine learning techniques for ...
Date: May 21, 2010
Creator: Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.; Bethel, E. Wes; Jacobsen, J.; Prabhat et al.
Partner: UNT Libraries Government Documents Department

Bias and Precision of the Squared Canonical Correlation Coefficient under Nonnormal Data Conditions

Description: This dissertation: (a) investigated the degree to which the squared canonical correlation coefficient is biased in multivariate nonnormal distributions and (b) identified formulae that adjust the squared canonical correlation coefficient (Rc2) such that it most closely approximates the true population effect under normal and nonnormal data conditions. Five conditions were manipulated in a fully-crossed design to determine the degree of bias associated with Rc2: distribution shape, variable sets, sample size to variable ratios, and within- and between-set correlations. Very few of the condition combinations produced acceptable amounts of bias in Rc2, but those that did were all found with first function results. The sample size to variable ratio (n:v) was determined to have the greatest impact on the bias associated with Rc2 for the first, second, and third functions. The variable set condition also affected the accuracy of Rc2, but for the second and third functions only. The kurtosis levels of the marginal distributions (b2), and the between- and within-set correlations demonstrated little or no impact on the bias associated with Rc2. Therefore, it is recommended that researchers use n:v ratios of at least 10:1 in canonical analyses, although greater n:v ratios have the potential to produce even less bias. Furthermore, because it was determined that b2 did not impact the accuracy of Rc2, one can be somewhat confident that, with marginal distributions possessing homogenous kurtosis levels ranging anywhere from -1 to 8, Rc2 will likely be as accurate as that resulting from a normal distribution. Because the majority of Rc2 estimates were extremely biased, it is recommended that all Rc2 effects, regardless of the function from which they result, be adjusted using an appropriate adjustment formula. If no rationale exists for the use of another formula, the Rozeboom-2 would likely be a safe choice given that it produced the greatest ...
Date: August 2006
Creator: Leach, Lesley Ann Freeny
Partner: UNT Libraries
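
The dissertation recommends adjustment formulas such as Rozeboom-2, which is not reproduced here. As a generic illustration of the form such shrinkage corrections take, the Wherry/Ezekiel adjustment familiar from multiple regression can be sketched; treating it as a stand-in for the canonical-correlation case is an assumption for illustration only:

```python
def wherry_adjusted_rsq(rsq, n, p):
    """Wherry/Ezekiel shrinkage adjustment for a squared correlation:
    adj = 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    return 1.0 - (1.0 - rsq) * (n - 1) / (n - p - 1)

# n = 100 observations and p = 10 variables: the 10:1 n:v ratio the
# dissertation recommends as a minimum.
adj = wherry_adjusted_rsq(0.30, n=100, p=10)
# The adjusted estimate shrinks the observed 0.30 toward the population value.
```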

Bipartite graph partitioning and data clustering

Description: Many data types arising from data mining applications can be modeled as bipartite graphs; examples include terms and documents in a text corpus, customers and purchased items in market basket analysis, and reviewers and movies in a movie recommender system. In this paper, the authors propose a new data clustering method based on partitioning the underlying bipartite graph. The partition is constructed by minimizing a normalized sum of edge weights between unmatched pairs of vertices of the bipartite graph. They show that an approximate solution to the minimization problem can be obtained by computing a partial singular value decomposition (SVD) of the associated edge weight matrix of the bipartite graph. They point out the connection of their clustering algorithm to correspondence analysis used in multivariate analysis. They also briefly discuss the issue of assigning data objects to multiple clusters. In the experimental results, they apply their clustering algorithm to the problem of document clustering to illustrate its effectiveness and efficiency.
Date: May 7, 2001
Creator: Zha, Hongyuan; He, Xiaofeng; Ding, Chris; Gu, Ming & Simon, Horst D.
Partner: UNT Libraries Government Documents Department
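
A pure-Python sketch of the paper's idea: normalize the bipartite edge-weight matrix, take the second singular vector pair (here obtained by deflating the known leading pair and power-iterating), and split rows and columns by sign. The example matrix is made up for illustration and the numerics are simplified relative to the paper:

```python
import math

def bipartite_partition(A, iters=200):
    """Split rows and columns of a bipartite edge-weight matrix A into two
    co-clusters using the signs of the second singular vectors of the
    normalized matrix D1^(-1/2) A D2^(-1/2)."""
    m, n = len(A), len(A[0])
    r = [sum(row) for row in A]                               # row weights
    c = [sum(A[i][j] for i in range(m)) for j in range(n)]    # column weights
    An = [[A[i][j] / math.sqrt(r[i] * c[j]) for j in range(n)] for i in range(m)]
    # The leading singular pair of An is known in closed form:
    # u1 ~ sqrt(r), v1 ~ sqrt(c), sigma1 = 1. Deflate it away.
    u1 = [math.sqrt(x) for x in r]
    v1 = [math.sqrt(x) for x in c]
    nu = math.sqrt(sum(x * x for x in u1))
    nv = math.sqrt(sum(x * x for x in v1))
    u1 = [x / nu for x in u1]
    v1 = [x / nv for x in v1]
    B = [[An[i][j] - u1[i] * v1[j] for j in range(n)] for i in range(m)]
    # Power iteration for the top singular vectors of the deflated matrix,
    # i.e. the second singular vectors of An.
    v = [float(j + 1) for j in range(n)]
    for _ in range(iters):
        u = [sum(B[i][j] * v[j] for j in range(n)) for i in range(m)]
        v = [sum(B[i][j] * u[i] for i in range(m)) for j in range(n)]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    u = [sum(B[i][j] * v[j] for j in range(n)) for i in range(m)]
    return [int(x >= 0) for x in u], [int(x >= 0) for x in v]

# Terms x documents: docs 0-1 share terms 0-1, docs 2-3 share terms 2-3,
# with two weak cross edges linking the groups.
A = [[1.0, 1.0, 0.0, 0.0],
     [1.0, 1.0, 0.2, 0.0],
     [0.0, 0.0, 1.0, 1.0],
     [0.0, 0.2, 1.0, 1.0]]
rows, cols = bipartite_partition(A)
```

The sign split recovers the two term groups and the two document groups simultaneously, which is the co-clustering behavior the paper exploits.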

CATDAT: A Program for Parametric and Nonparametric Categorical Data Analysis: User's Manual Version 1.0, 1998-1999 Progress Report.

Description: Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical response data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes four analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network.
Date: December 1, 1999
Creator: Peterson, James T.
Partner: UNT Libraries Government Documents Department

Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples

Description: Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes that will lead to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharide, pure protein, and protein mixture samples, LDA, PLSDA, and SIMCA all produced excellent classification provided a sufficient number of compound variables was calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. The decision tree analysis was found to be the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data that can be followed by either LDA or PLSDA for effective classification analysis. 
This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples. Applying these techniques to information-rich mass spectral data sets opens the possibilities for new applications including classification of subtly different biological samples that ...
Date: October 24, 2007
Creator: Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S & Wu, K J
Partner: UNT Libraries Government Documents Department
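
The role of PCA as a first step can be sketched as follows: center the data, take the leading eigenvector of the covariance matrix (here via power iteration), and inspect the sample scores for groupings. The toy "spectra" below are invented for illustration:

```python
def first_principal_component(X, iters=300):
    """Leading principal component of X (rows = samples, e.g. spectra;
    columns = variables, e.g. peak intensities) via power iteration."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]
    # Sample covariance matrix of the centered data
    C = [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # Scores: projection of each centered sample onto the component
    pc_scores = [sum(Xc[i][j] * v[j] for j in range(d)) for i in range(n)]
    return v, pc_scores

# Two hypothetical sample groups measured on two variables; the score on the
# first component separates the groups.
X = [[1.0, 0.0], [1.1, 0.1], [5.0, 4.0], [5.2, 4.1]]
pc1, pc_scores = first_principal_component(X)
```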

Chemometrics applied to vibrational spectroscopy: overview, challenges and pitfalls

Description: Chemometric multivariate calibration methods are rapidly impacting quantitative infrared spectroscopy in many positive ways. The combination of vibrational spectroscopy and chemometrics has been used by industry for quality control and process monitoring. The growth of these methods has been phenomenal in the past decade. Yet, as with any new technology, there are growing pains. The methods are so powerful at finding correlations in the data that, when used without great care, they can readily yield results that are not valid for the analysis of future unknown samples. In this paper, the power of the multivariate calibration methods is discussed while pointing out common pitfalls and some remaining challenges that may slow the implementation of chemometrics in research and industry.
Date: October 1, 1996
Creator: Haaland, D.M.
Partner: UNT Libraries Government Documents Department

The comparative uptake and interaction of several radionuclides in the trophic levels surrounding the Los Alamos Meson Physics Facility (LAMPF) waste water ponds

Description: A study was undertaken to examine the uptake, distribution, and interaction of five activation products (Co-57, Be-7, Cs-134, Rb-83, and Mn-54) within the biotic and abiotic components surrounding the waste treatment lagoons of the Los Alamos Meson Physics Facility (LAMPF). The study attempted to ascertain where, and what, specific interactions were taking place among the isotopes and the biotic/abiotic components. A statistical approach, utilizing Multivariate Analysis of Variance (MANOVA), was conducted testing the radioisotopic concentrations by (1) the trophic level in each position sampled on the grid (TROPLVL), (2) the location sampled on the grid (TRAN), (3) the location sampled within each grid line (PLOT), and (4) the side on which samples were taken (SIDE). This provided both the dependent and independent variables to be tested. The null hypothesis (Ho) tested the difference in the mean values of the isotopes within/between each of the four independent variables. The Rb-83 statistic indicated an accumulation within the TRAN and PLOT variables within the sampled area. The Co-57 test statistic provided a value indicating that accumulation of this isotope within TROPLVL was taking place. Mn-54 test values indicated that accumulation was also taking place at the higher trophic levels within the PLOT, TRAN, and SIDE positions. Cs-134 was found to accumulate up to the third level of the trophic structure (TROPLVL, vegetation), and then decrease from there. The Be-7 component provided no variance from known compartmental transfers. 210 refs., 17 figs., 4 tabs.
Date: August 1, 1989
Creator: Brooks, G.H. Jr.
Partner: UNT Libraries Government Documents Department

Comparing Candidate Hospital Report Cards

Description: We present graphical and analytical methods that focus on multivariate outlier detection applied to the hospital report cards data. No two methods agree on which hospitals are unusually good or bad, so we also present ways to compare the agreement between two methods. We identify factors that have a significant impact on the scoring.
Date: December 31, 1997
Creator: Burr, T.L.; Rivenburgh, R.D.; Scovel, J.C. & White, J.M.
Partner: UNT Libraries Government Documents Department
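
One common multivariate outlier detection method (not necessarily one of the methods compared in the report) flags records by their Mahalanobis distance from the multivariate mean. A 2-D pure-Python sketch with invented "hospital score" data:

```python
def mahalanobis_outliers(X, threshold=3.0):
    """Flag rows of two-column data X whose Mahalanobis distance from the
    sample mean exceeds threshold (2-D case, closed-form covariance inverse)."""
    n = len(X)
    mx = sum(r[0] for r in X) / n
    my = sum(r[1] for r in X) / n
    sxx = sum((r[0] - mx) ** 2 for r in X) / (n - 1)
    syy = sum((r[1] - my) ** 2 for r in X) / (n - 1)
    sxy = sum((r[0] - mx) * (r[1] - my) for r in X) / (n - 1)
    det = sxx * syy - sxy * sxy          # determinant of the covariance matrix
    flags = []
    for x, y in X:
        dx, dy = x - mx, y - my
        # Quadratic form d^T * S^(-1) * d written out for the 2-D case
        d2 = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
        flags.append(d2 ** 0.5 > threshold)
    return flags

# Ten plausible score pairs plus one gross outlier
hospital_scores = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5],
                   [0.2, 0.8], [0.8, 0.2], [0.3, 0.3], [0.7, 0.7], [0.5, 0.0],
                   [8.0, 8.0]]
flags = mahalanobis_outliers(hospital_scores, threshold=2.5)
```

Unlike per-variable cutoffs, the Mahalanobis distance accounts for correlation between scores, so a hospital can be flagged for an unusual combination even when each score alone looks ordinary.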

A Comparison of Multivariate Normal and Elliptical Estimation Methods in Structural Equation Models

Description: In the present study, parameter estimates, standard errors, and chi-square statistics were compared using normal and elliptical estimation methods given three research conditions: population data contamination (10%, 20%, and 30%), sample size (100, 400, and 1000), and kurtosis (kappa = 1, 10, 20).
Date: August 1999
Creator: Cheevatanarak, Suchittra
Partner: UNT Libraries

Control of DWPF (Defense Waste Processing Facility) melter feed composition

Description: The Defense Waste Processing Facility will be used to immobilize Savannah River Site high-level waste into a stable borosilicate glass for disposal in a geologic repository. Proper control of the melter feed composition in this facility is essential to the production of glass which meets product durability constraints dictated by repository regulations and facility processing constraints dictated by melter design. A technique has been developed which utilizes glass property models to determine acceptable processing regions based on the multiple constraints imposed on the glass product and to display these regions graphically. This system along with the batch simulation of the process is being used to form the basis for the statistical process control system for the facility. 13 refs., 3 figs., 1 tab.
Date: January 1, 1990
Creator: Edwards, R.E. Jr.; Brown, K.G. & Postles, R.L.
Partner: UNT Libraries Government Documents Department

Dark Photon Search at BABAR

Description: Presented is the current progress of a search for the signature of a dark photon or new particle using the BaBar data set. We search for the processes e{sup +}e{sup -} {yields} {gamma}{sub ISR}A{prime},A{prime} {yields} e{sup +}e{sup -} and e{sup +}e{sup -} {yields} {gamma}{sub ISR}{gamma}, {gamma} {yields} A{prime},A{prime} {yields} e{sup +}e{sup -}, where {gamma}{sub ISR} is an initial state radiated photon of energy E{sub {gamma}} >= 1 GeV. Twenty-five sets of Monte Carlo, simulating e{sup +}e{sup -} collisions at an energy of 10.58 GeV, were produced with different values of the A{prime} mass ranging from 100 MeV to 9.5 GeV. The mass resolution is calculated based on Monte Carlo simulations. We implement ROOT's Toolkit for Multivariate Analysis (TMVA), a machine learning tool that allows us to evaluate the signal character of events based on many discriminating variables. TMVA training is conducted with samples of Monte Carlo as signal and a small portion of Run 6 as background. The multivariate analysis produces additional cuts to separate signal and background. The signal efficiency and sensitivity are calculated. The analysis will move forward to fit the background and scan the residuals for the narrow resonance peak of a new particle.
Date: September 7, 2012
Creator: Greenwood, Ross N & /SLAC, /MIT
Partner: UNT Libraries Government Documents Department

Decreased expression of RNA interference machinery, Dicer and Drosha, is associated with poor outcome in ovarian cancer patients

Description: The clinical and functional significance of RNA interference (RNAi) machinery, Dicer and Drosha, in ovarian cancer is not known and was examined. Dicer and Drosha expression was measured in ovarian cancer cell lines (n=8) and invasive epithelial ovarian cancer specimens (n=111) and correlated with clinical outcome. Validation was performed with previously published cohorts of ovarian, breast, and lung cancer patients. Anti-Galectin-3 siRNA and shRNA transfections were used for in vitro functional studies. Dicer and Drosha mRNA and protein levels were decreased in 37% to 63% of ovarian cancer cell lines and in 60% and 51% of human ovarian cancer specimens, respectively. Low Dicer was significantly associated with advanced tumor stage (p=0.007), and low Drosha with suboptimal surgical cytoreduction (p=0.02). Tumors with both high Dicer and Drosha were associated with increased median patient survival (>11 years vs. 2.66 years for other groups; p<0.001). In multivariate analysis, high Dicer (HR=0.48; p=0.02), high-grade histology (HR=2.46; p=0.03), and poor chemoresponse (HR=3.95; p<0.001) were identified as independent predictors of disease-specific survival. Findings of poor clinical outcome with low Dicer expression were validated in separate cohorts of cancer patients. Galectin-3 silencing with siRNA transfection was superior to shRNA in cell lines with low Dicer (78-95% vs. 4-8% compared to non-targeting sequences), and similar in cell lines with high Dicer. Our findings demonstrate the clinical and functional impact of RNAi machinery alterations in ovarian carcinoma and support the use of siRNA constructs that do not require endogenous Dicer and Drosha for therapeutic applications.
Date: May 6, 2008
Creator: Merritt, William M.; Lin, Yvonne G.; Han, Liz Y.; Kamat, Aparna A.; Spannuth, Whitney A.; Schmandt, Rosemarie et al.
Partner: UNT Libraries Government Documents Department

A detailed examination of the chemical, hydrological, and geological properties influencing the mobility of {sup 222}radon and parent radionuclides in groundwater

Description: This study examines hydrological, geological and geochemical controls on {sup 222}Rn variability in groundwater in the Front Range of Colorado. Specific objectives of the study are: (1) to determine if there are any correlations or spatial relationships between {sup 222}Rn and the geological, geochemical and hydrogeological data; and (2) to determine whether it is geochemically reasonable for observed {sup 222}Rn levels to be the result of U and {sup 226}Ra accumulation by fracture filling minerals. Domestic-water wells were sampled and tested to determine the local aquifer characteristics and aqueous geochemistry. A multivariate and staged approach was used in the data analyses. Analysis of variance tests were used to test for relationships between {sup 222}Rn and the lithology of the study wells. The effects of rock-type were then removed from the chemical and hydrological variables by subtracting the mean value for each rock-type from each of the measured values within that rock-type (a residual transformation). Linear and linear multiple regression techniques were used to test for expected relationships between residual {sup 222}Rn levels and these variables, and stepwise linear regressions were used to test for any unforeseen multivariate relationships in the data. Correlograms, distance-weighted average and inverse-distance-weighted average predictions were used to look for spatial relationships in the data.
Date: December 31, 1996
Creator: Sexsmith, K.S.
Partner: UNT Libraries Government Documents Department
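
The residual transformation described above, subtracting each rock-type's mean from the measurements within that rock-type, can be sketched directly (the values below are invented):

```python
def rock_type_residuals(values, rock_types):
    """Residual transform: subtract each rock-type's mean from the
    measurements in that rock-type, removing the lithology effect
    before regressing on other variables."""
    groups = {}
    for v, g in zip(values, rock_types):
        groups.setdefault(g, []).append(v)
    means = {g: sum(vs) / len(vs) for g, vs in groups.items()}
    return [v - means[g] for v, g in zip(values, rock_types)]

# Hypothetical radon measurements grouped by lithology
radon = [100.0, 140.0, 30.0, 50.0]
rock = ["granite", "granite", "shale", "shale"]
res = rock_type_residuals(radon, rock)   # residuals about each group mean
```

After this step the residuals from different rock types are comparable, so regressions against the chemical and hydrological variables are not confounded by lithology.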

Detection and Classification of Individual Airborne Microparticles using Laser Ablation Mass Spectroscopy and Multivariate Analysis

Description: We are developing a method for the real-time analysis of airborne microparticles based on laser ablation mass spectroscopy. Airborne particles enter an ion trap mass spectrometer through a differentially-pumped inlet, are detected by light scattered from two CW laser beams, and sampled by a 10 ns excimer laser pulse at 308 nm as they pass through the center of the ion trap electrodes. After the laser pulse, the stored ions are separated by conventional ion trap methods. In this work, thousands of positive and negative ion spectra were collected for eighteen different species: six bacteria, six pollen, and six particulate samples. The data were then averaged and analyzed using the Multivariate Patch Algorithm (MPA), a variant of traditional multivariate analysis. The MPA correctly identified all of the positive ion spectra and 17 of the 18 negative ion spectra. In addition, when the average positive and negative spectra were combined the MPA correctly identified all 18 species. Finally, the MPA is also able to identify the components of computer-synthesized mixtures of the samples studied.
Date: April 27, 1999
Creator: Gieray, R.A.; Lazar, A.; Parker, E.P.; Ramsey, J. M.; Reilly, P.T.A.; Rosenthal, S.E. et al.
Partner: UNT Libraries Government Documents Department
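The classification step can be illustrated with a generic multivariate approach; this is not the Multivariate Patch Algorithm itself, just a minimal stand-in that assigns an unknown spectrum to the class-average reference spectrum it correlates with best. All data here are synthetic.

```python
import numpy as np

# Generic spectral classification sketch (NOT the MPA): score an unknown
# spectrum against class-average reference spectra by correlation.
rng = np.random.default_rng(1)
references = rng.random((3, 50))                    # 3 averaged class spectra, 50 m/z bins
unknown = references[1] + rng.normal(0, 0.05, 50)   # noisy copy of class 1

def classify(spectrum, refs):
    """Return the index of the reference spectrum most correlated with `spectrum`."""
    scores = [np.corrcoef(spectrum, r)[0, 1] for r in refs]
    return int(np.argmax(scores))

print(classify(unknown, references))
```

Averaging many single-particle spectra per class before scoring, as the study does, suppresses shot-to-shot variability in the laser ablation process.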

Development of new VOC exposure metrics and their relationship to "Sick Building Syndrome" symptoms

Description: Volatile organic compounds (VOCs) are suspected to contribute significantly to "Sick Building Syndrome" (SBS), a complex of subchronic symptoms that occur during occupancy of the building in question and generally diminish away from it. A new approach takes into account individual VOC potencies, as well as the highly correlated nature of the complex VOC mixtures found indoors. The new VOC metrics are statistically significant predictors of symptom outcomes from the California Healthy Buildings Study data. Multivariate logistic regression analyses were used to test the hypothesis that a summary measure of the VOC mixture, other risk factors, and covariates for each worker will lead to better prediction of symptom outcome. VOC metrics based on animal irritancy measures and principal component analysis had the most influence in the prediction of eye, dermal, and nasal symptoms. After adjustment, a water-based paints and solvents source was found to be associated with dermal and eye irritation. The more typical VOC exposure metrics used in prior analyses (total VOC (TVOC), or the sum of individually identified VOCs ({Sigma}VOC{sub i})) were not useful in symptom prediction in the adjusted model. Also not useful were three other VOC metrics that took potency into account but did not adjust for the highly correlated nature of the data set or for the presence of VOCs that were not measured. High TVOC values (2--7 mg m{sup {minus}3}) due to the presence of liquid-process photocopiers observed in several study spaces significantly influenced symptoms. Analyses without the high TVOC values reduced, but did not eliminate, the ability of the VOC exposure metric based on irritancy and principal component analysis to explain symptom outcome.
Date: August 1, 1995
Creator: Ten Brinke, J.
Partner: UNT Libraries Government Documents Department
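The use of principal component analysis to collapse a highly correlated VOC mixture into a single summary metric can be sketched as below; the concentrations and the number of compounds are invented, and this is only the dimension-reduction step, not the full logistic regression model of the study.

```python
import numpy as np

# Hedged sketch: summarize a highly correlated VOC mixture with PCA.
# Six hypothetical VOCs share one indoor source, plus measurement noise.
rng = np.random.default_rng(2)
common = rng.normal(size=100)                       # shared source strength (assumed)
vocs = np.column_stack([common + rng.normal(0, 0.3, 100) for _ in range(6)])

# Center the data, then take the first principal component as a single
# exposure metric per sample (one score per worker/space).
centered = vocs - vocs.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
metric = centered @ vt[0]                           # PC1 scores

explained = s[0]**2 / (s**2).sum()
print(round(explained, 2))
```

Because the compounds are strongly correlated, most of the variance lies on the first component, which is why a PC1 score can stand in for the whole mixture where a simple sum ({Sigma}VOC{sub i}) double-counts the shared source.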

Distribution selection in statistical simulation studies

Description: The statistics profession has been remiss in exploiting the numerous advances in simulation methodology. The purpose of this article is to outline progress in variate generation relevant to the conduct of statistical simulation studies. The emphasis is on multivariate distributions, a thriving area of research. 11 refs.
Date: January 1, 1986
Creator: Johnson, M.E.
Partner: UNT Libraries Government Documents Department
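As a concrete instance of multivariate variate generation of the kind the article surveys, a correlated multivariate normal can be sampled via the Cholesky factor of its covariance matrix; the mean and covariance below are arbitrary examples.

```python
import numpy as np

# Multivariate variate generation sketch: draw correlated normal variates
# by transforming independent standard normals with the Cholesky factor.
rng = np.random.default_rng(3)
mean = np.array([0.0, 5.0])
cov = np.array([[2.0, 1.2],
                [1.2, 1.0]])

L = np.linalg.cholesky(cov)              # cov == L @ L.T
z = rng.standard_normal((10000, 2))      # independent standard normals
samples = mean + z @ L.T                 # correlated draws

print(np.cov(samples, rowvar=False).round(1))
```

The sample covariance of the draws converges to `cov` as the sample size grows, which makes this transform a standard building block in statistical simulation studies.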

Dynamic modeling of physical phenomena for PRAs using neural networks

Description: In most probabilistic risk assessments, there is a set of accident scenarios that involves the physical responses of a system to environmental challenges. Examples include the effects of earthquakes and fires on the operability of a nuclear reactor safety system, the effects of fires and impacts on the safety integrity of a nuclear weapon, and the effects of human intrusions on the transport of radionuclides from an underground waste facility. The physical responses of the system to these challenges can be quite complex, and their evaluation may require the use of detailed computer codes that are very time consuming to execute. Yet, to perform meaningful probabilistic analyses, it is necessary to evaluate the responses for a large number of variations in the input parameters that describe the initial state of the system, the environments to which it is exposed, and the effects of human interaction. Because the uncertainties of the system response may be very large, it may also be necessary to perform these evaluations for various values of modeling parameters that have high uncertainties, such as material stiffnesses, surface emissivities, and ground permeabilities. The authors have been exploring the use of artificial neural networks (ANNs) as a means for estimating the physical responses of complex systems to phenomenological events such as those cited above. These networks are designed as mathematical constructs with adjustable parameters that can be trained so that the results obtained from the networks will simulate the results obtained from the detailed computer codes. The intent is for the networks to provide an adequate simulation of the detailed codes over a significant range of variables while requiring only a small fraction of the computer processing time required by the detailed codes. This enables the authors to integrate the physical response analyses into the probabilistic models in order ...
Date: April 1, 1998
Creator: Benjamin, A. S.; Brown, N. N. & Paez, T. L.
Partner: UNT Libraries Government Documents Department
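The surrogate idea in the abstract (train a network on a limited number of runs of an expensive code, then query the network instead) can be sketched with a toy stand-in for the physics code; the network size, learning rate, and target function here are all illustrative assumptions, not the authors' configuration.

```python
import numpy as np

# Hedged sketch of an ANN surrogate: fit a tiny one-hidden-layer network
# to samples from an "expensive" model (here a cheap stand-in function),
# then evaluate the network in place of the model.
rng = np.random.default_rng(4)

def expensive_code(x):
    """Stand-in for a time-consuming detailed computer code."""
    return np.sin(x)

x_train = np.linspace(-3, 3, 200).reshape(-1, 1)
y_train = expensive_code(x_train)

# One hidden layer of tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(10000):
    h = np.tanh(x_train @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y_train
    # Backpropagate the mean-squared-error gradient.
    gW2 = h.T @ err / len(x_train); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x_train.T @ dh / len(x_train); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

x_test = np.array([[0.5]])
approx = np.tanh(x_test @ W1 + b1) @ W2 + b2
print(float(approx))  # compare against expensive_code(0.5)
```

Once trained, each surrogate evaluation is a handful of matrix products, which is what allows the physical-response estimate to be embedded inside a probabilistic analysis requiring many thousands of input variations.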