Search Results

Measurement issues in assessing employee performance: A generalizability theory approach

Description: Increasingly, organizations are assessing employee performance through rating instruments employed in the context of varied data collection strategies. For example, the focus may be on obtaining multiple perspectives regarding employee performance (360-degree evaluation). From the standpoint of evaluating managers, upward assessments and "peer-to-peer" evaluations are perhaps the two most common examples of such a multiple-perspective approach. Unfortunately, the increased interest in and use of such data collection strategies have not been accompanied by a corresponding interest in addressing the validity and reliability concerns traditionally associated with other forms of employee assessment (e.g., testing, assessment centers, structured interviews). As a consequence, many organizations may be basing decisions upon information collected under less than ideal measurement conditions. To the extent that such conditions produce unreliable measurements, the process may be both dysfunctional to the organization and/or unfair to the individual(s) being evaluated. Conversely, the establishment of reliable and valid measurement processes may in itself support the use of results in pursuit of organizational goals and enhance the credibility of the measurement process (see McEvoy (1990), who found the acceptance of subordinate ratings to be related to perceived accuracy and fairness of the measurement process). The present paper discusses a recent peer-to-peer evaluation conducted in our organization. The intent is to focus on the design of the study and present a Generalizability Theory (GT) approach to assessing the overall quality of the data collection strategy, along with suggestions for improving future designs. 9 refs., 3 tabs.
Date: August 1, 1996
Creator: Stephenson, B.O.
Partner: UNT Libraries Government Documents Department
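
The generalizability-theory (GT) analysis referred to above turns on partitioning rating variance into person, rater, and interaction components. As a minimal illustration of the idea only (with invented data and design, not the paper's study), the following sketch estimates variance components for a fully crossed persons × raters design from ANOVA mean squares and forms a generalizability coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ratings: 30 employees ("persons") each rated by 5 peers ("raters"),
# fully crossed design with one observation per cell.
n_p, n_r = 30, 5
x = rng.normal(5.0, 1.0, (n_p, 1)) + rng.normal(0.0, 0.5, (1, n_r)) \
    + rng.normal(0.0, 0.7, (n_p, n_r))

grand = x.mean()
p_mean = x.mean(axis=1)          # person means
r_mean = x.mean(axis=0)          # rater means

# ANOVA mean squares for the p x r design (interaction confounded with error).
ms_p = n_r * np.sum((p_mean - grand) ** 2) / (n_p - 1)
ms_r = n_p * np.sum((r_mean - grand) ** 2) / (n_r - 1)
resid = x - p_mean[:, None] - r_mean[None, :] + grand
ms_pr = np.sum(resid ** 2) / ((n_p - 1) * (n_r - 1))

# Variance components (negative estimates truncated to zero by convention).
var_p = max((ms_p - ms_pr) / n_r, 0.0)   # true-score (person) variance
var_r = max((ms_r - ms_pr) / n_p, 0.0)   # rater leniency/severity variance
var_pr = ms_pr                           # interaction + error

# Generalizability coefficient for relative decisions with n_r raters.
g_coef = var_p / (var_p + var_pr / n_r)
print(f"var(person)={var_p:.3f}  var(rater)={var_r:.3f}  var(p x r,e)={var_pr:.3f}")
print(f"G coefficient ({n_r} raters): {g_coef:.3f}")
```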

Partial least squares, conjugate gradient and the Fisher discriminant

Description: The theory of multivariate regression has been extensively studied and is commonly used in many diverse scientific areas. A wide variety of techniques are currently available for solving the problem of multivariate calibration, and the literature on the subject is so extensive that choosing which technique to apply can be confusing. Iterative methods are a common class of techniques for solving linear systems, and consequently for applying linear systems to multivariate analysis. While common linear system solvers typically factor the coefficient matrix A in solving the system Ax = b, factorization can be impractical if A is large and sparse. Iterative methods such as Gauss-Seidel, SOR, Chebyshev semi-iterative, and related methods also often depend on parameters that require calibration and that are sometimes hard to choose properly. An iterative method that surmounts many of these difficulties is the conjugate gradient method. Algorithms of this type find solutions iteratively, optimally calculating the next approximation from the residuals.
Date: December 1996
Creator: Faber, V.
Partner: UNT Libraries Government Documents Department
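
The conjugate gradient method highlighted in this abstract computes each new approximation optimally from the current residual, with no factorization of A. A minimal textbook implementation for a symmetric positive-definite system (an illustration, not the paper's code):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve Ax = b for symmetric positive-definite A without factoring A."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # new direction, conjugate to previous ones
        rs = rs_new
    return x

# Example: a random well-conditioned SPD system.
rng = np.random.default_rng(1)
M = rng.normal(size=(50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.normal(size=50)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))  # residual norm, should be ~1e-10
```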

Dynamical system modeling via signal reduction and neural network simulation

Description: Many dynamical systems tested in the field and the laboratory display significant nonlinear behavior. Accurate characterization of such systems requires modeling in a nonlinear framework. One construct forming a basis for nonlinear modeling is that of the artificial neural network (ANN). However, when system behavior is complex, the amount of data required to perform training can become unreasonable. The authors reduce the complexity of information present in system response measurements using decomposition via canonical variate analysis. They describe a method for decomposing system responses, then modeling the components with ANNs. A numerical example is presented, along with conclusions and recommendations.
Date: November 1, 1997
Creator: Paez, T.L. & Hunter, N.F.
Partner: UNT Libraries Government Documents Department
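
Canonical variate analysis of a measured response can be phrased as an SVD of the whitened cross-covariance between past and future windows of the signal; the leading canonical states then serve as reduced inputs for ANN modeling. The sketch below illustrates that reading on a synthetic signal; the window length, test signal, and variable names are all invented, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical nonlinear system response: two incommensurate tones plus noise.
t = np.arange(5000) * 0.01
y = np.sin(t) + 0.3 * np.sin(3.1 * t) ** 3 + 0.05 * rng.normal(size=t.size)

# Past/future windows at each time step (Hankel-style embedding).
lag = 10
N = y.size - 2 * lag
past = np.stack([y[i : i + lag][::-1] for i in range(N)])        # y[t-1..t-lag]
future = np.stack([y[i + lag : i + 2 * lag] for i in range(N)])  # y[t..t+lag-1]
past -= past.mean(0)
future -= future.mean(0)

# Whitened cross-covariance; its SVD gives the canonical variates.
Cpp = past.T @ past / N
Cff = future.T @ future / N
Cfp = future.T @ past / N

def inv_sqrt(C):
    w, V = np.linalg.eigh(C)
    return V @ np.diag(1 / np.sqrt(np.clip(w, 1e-12, None))) @ V.T

U, s, Vt = np.linalg.svd(inv_sqrt(Cff) @ Cfp @ inv_sqrt(Cpp))
print("canonical correlations:", np.round(s[:5], 3))

# Keep the strongest canonical states as reduced coordinates for modeling
# (e.g., as inputs to a small neural network, as in the abstract).
k = 3
states = past @ (inv_sqrt(Cpp) @ Vt[:k].T)
print("reduced state matrix:", states.shape)
```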

Multivariate statistical analysis of spectrum lines from Si₃N₄ grain boundaries

Description: It is well known that the high-temperature properties of polycrystalline Si₃N₄ ceramics are strongly influenced by the nanometer-scale glassy phase at the grain boundaries. The authors have recently analyzed the variation of the energy-loss near-edge fine structure (ELNES) of the Si-L₂,₃ edges using a combination of TEM spectrum-line acquisition with an imaging filter and multivariate statistical analysis. The glassy phase at the Si₃N₄ grain boundaries is easily damaged by the fine probes usually used in scanning transmission electron microscopy to acquire ELNES data, so an alternative method using a Gatan imaging filter (GIF), called TEM spectrum-line analysis, was used instead. This technique will be used to correlate variations in grain boundary chemistry and bonding with the observed performance of Si₃N₄ ceramics.
Date: April 1, 1997
Creator: Rice, P.M.; Alexander, K.B. & Anderson, I.M.
Partner: UNT Libraries Government Documents Department

A detailed examination of the chemical, hydrological, and geological properties influencing the mobility of ²²²Rn and parent radionuclides in groundwater

Description: This study examines hydrological, geological, and geochemical controls on ²²²Rn variability in groundwater in the Front Range of Colorado. Specific objectives of the study are: (1) to determine whether there are any correlations or spatial relationships between ²²²Rn and the geological, geochemical, and hydrogeological data; and (2) to determine whether it is geochemically reasonable for observed ²²²Rn levels to be the result of U and ²²⁶Ra accumulation by fracture-filling minerals. Domestic water wells were sampled and tested to determine the local aquifer characteristics and aqueous geochemistry. A multivariate and staged approach was used in the data analyses. Analysis of variance tests were used to test for relationships between ²²²Rn and the lithology of the study wells. The effects of rock type were then removed from the chemical and hydrological variables by subtracting the mean value for each rock type from each of the measured values within that rock type (a residual transformation). Simple and multiple linear regression techniques were used to test for expected relationships between residual ²²²Rn levels and these variables, and stepwise linear regressions were used to test for any unforeseen multivariate relationships in the data. Correlograms, distance-weighted average, and inverse-distance-weighted average predictions were used to look for spatial relationships in the data.
Date: December 31, 1996
Creator: Sexsmith, K.S.
Partner: UNT Libraries Government Documents Department
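
The residual transformation described in this abstract (subtracting each rock-type mean from the observations within that rock type before regression) is easy to state concretely. A hypothetical sketch with made-up column names and data:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# Hypothetical well data: lithology class, radon, and two covariates.
n = 200
df = pd.DataFrame({
    "rock_type": rng.choice(["granite", "gneiss", "schist"], n),
    "radon": rng.lognormal(6.0, 0.8, n),
    "uranium": rng.lognormal(1.0, 0.5, n),
    "well_yield": rng.lognormal(2.0, 0.6, n),
})

# Stage 1: remove rock-type effects by subtracting within-class means
# (the "residual transformation" of the abstract).
resid = df[["radon", "uranium", "well_yield"]].copy()
for col in resid:
    resid[col] = df[col] - df.groupby("rock_type")[col].transform("mean")

# Stage 2: regress residual radon on the residual covariates (ordinary
# least squares via the normal equations / lstsq).
X = np.column_stack([np.ones(n),
                     resid["uranium"].to_numpy(),
                     resid["well_yield"].to_numpy()])
beta, *_ = np.linalg.lstsq(X, resid["radon"].to_numpy(), rcond=None)
print("intercept, uranium, well_yield coefficients:", np.round(beta, 3))
```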

Multivariate prediction: Selection of the most informative components to measure

Description: A number of interesting problems in the design of experiments, such as sensor allocation, selection of sites for observing stations, determining sampler positions in traffic monitoring, and deciding which variables to survey or measure in sampling studies, may be considered in the following setting: given the covariance matrix of a multidimensional random vector and the ratio of the number of possible observations to the observational error, select the components that must be observed to guarantee minimization of an objective function describing the quality of prediction of all, or of prescribed, components. The authors show that the problem can be considered in the framework of convex design theory and derive a simple but effective algorithm for selecting an optimal subset of components to observe.
Date: June 1, 1998
Creator: Batsell, S.; Fedorov, V. & Flanagan, D.
Partner: UNT Libraries Government Documents Department
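
In the setting of this abstract, with a known covariance matrix and observational-error variance, choosing which components to observe so that prediction of the remaining components is as good as possible is a sensor-selection problem. The greedy sketch below minimizes the trace of the Gaussian posterior covariance at each step; it is one plausible heuristic, not necessarily the authors' convex-design algorithm:

```python
import numpy as np

def greedy_select(Sigma, noise_var, k):
    """Greedily choose k components to observe so that the trace of the
    posterior covariance of the full vector (Gaussian model) is minimized."""
    n = Sigma.shape[0]
    chosen = []
    for _ in range(k):
        best, best_trace = None, np.inf
        for j in range(n):
            if j in chosen:
                continue
            S = chosen + [j]
            K = Sigma[np.ix_(S, S)] + noise_var * np.eye(len(S))
            post = Sigma - Sigma[:, S] @ np.linalg.solve(K, Sigma[S, :])
            tr = np.trace(post)
            if tr < best_trace:
                best, best_trace = j, tr
        chosen.append(best)
        print(f"pick component {best}, remaining prediction variance {best_trace:.3f}")
    return chosen

# Hypothetical covariance for a 12-dimensional random vector.
rng = np.random.default_rng(4)
B = rng.normal(size=(12, 4))
Sigma = B @ B.T + 0.1 * np.eye(12)
greedy_select(Sigma, noise_var=0.5, k=4)
```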

Multivariate statistical analysis of particle x-ray spectra

Description: Multivariate statistical analysis (MSA) is a powerful tool for the analysis of series of spectra. This paper explores an application of MSA to a series of energy-dispersive X-ray (EDX) spectra acquired in the scanning electron microscope (SEM) from a series of particles. The raw data were series of spectra previously acquired to test analytical procedures for trace element detection. This paper explores the possibility of performing the trace element detection with MSA components that have been extracted from the raw data without any a priori assumptions about the information content of the particle spectra. Particles were prepared from two analytical glasses, dispersed onto carbon substrates, and coated with carbon. The compositions of the two glasses are substantially similar, except that one glass (K-3106) contains 0.7 wt.% Fe, whereas the other glass (K-3069) does not contain Fe at a detectable level. Spectra were acquired with a 20 kV accelerating voltage from 35 different particles of each glass, with particle diameters that ranged from 1–10 µm, and with acquisition times of 15, 60, and 200 s. A test data file of 75 spectra was composed for each acquisition time by arranging the 70 acquired spectra in no particular order in a composite file and inserting 5 duplicate spectra as a test of reproducibility. MSA was performed using software developed at Oak Ridge National Laboratory.
Date: March 1, 1998
Creator: Anderson, I.M. & Small, J.A.
Partner: UNT Libraries Government Documents Department
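
The heart of the MSA step described above is extracting components from a stack of spectra with no a priori assumptions about their content. A generic PCA illustration on simulated spectra (the authors used ORNL-developed software, not scikit-learn; the peak positions, channel numbers, and Fe line here are invented, and the trace line is exaggerated so the demo separates cleanly):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
channels = np.arange(1024)

def peak(center, width, height):
    return height * np.exp(-0.5 * ((channels - center) / width) ** 2)

# Simulated EDX spectra of two glasses: identical matrix, one with a weak Fe line.
matrix = peak(200, 8, 100) + peak(450, 10, 60)
fe_line = peak(640, 6, 10)        # stand-in trace Fe peak, height exaggerated
spectra = []
for i in range(70):
    mean = matrix + (fe_line if i < 35 else 0)
    spectra.append(rng.poisson(mean + 1.0))   # counting (Poisson) noise
X = np.array(spectra, dtype=float)

# PCA with no a priori assumptions about which channels matter.
pca = PCA(n_components=5)
scores = pca.fit_transform(X)

# The component whose loading is largest at the Fe channel should separate
# the Fe-bearing particles (first 35 rows) from the Fe-free ones.
fe_comp = int(np.argmax(np.abs(pca.components_[:, 640])))
print("component most sensitive to Fe channel:", fe_comp)
print("mean score, Fe glass:   %.2f" % scores[:35, fe_comp].mean())
print("mean score, Fe-free:    %.2f" % scores[35:, fe_comp].mean())
```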

Steam generator mock-up for assessment of inservice inspection technology.

Description: A steam generator mock-up has been assembled for round-robin studies of the effectiveness of currently practiced inservice inspection (ISI) technology for detection of current-day flaws. The mock-up will also be used to evaluate emerging inspection technologies. The 3.66-m (12-ft)-tall mock-up contains 400 tube openings, each consisting of 9 test sections that can be used to simulate current-day field-induced flaws and artifacts. Included in the mock-up are simulations of tube support plate (TSP) intersections and the tube sheet (TS). Cracks are present at the TSP, the TS, and in the free-span sections of the mock-up. For initial evaluation of the round-robin results, various eddy current methods, as well as multivariate data analysis models, are being used to estimate the depth and length of defects in the mock-up. To ensure that the round-robin is carried out with procedures as close as possible to those implemented in the field, input was obtained from industry experts on the protocol and procedures to be used for the exercise. One initial assembly of the mock-up, with a limited number of flaws and artifacts, has been completed and tested. A second completed configuration with additional flaw and artifact simulations will be used for the round-robin.
Date: September 11, 1999
Creator: Bakhtiari, S.; Kupperman, D. S. & Muscara, J.
Partner: UNT Libraries Government Documents Department

Using the bootstrap in a multivariate data problem: An example

Description: The use of the bootstrap in the multivariate version of the paired t-test is considered and demonstrated through an example. The problem of interest involves comparing two different techniques for measuring the chemical constituents of a sample item. The bootstrap is used to form an empirical significance level for Hotelling's one-sample T-squared statistic. The bootstrap was selected to determine empirical significance levels because the implicit assumption of multivariate normality in the classic Hotelling's one-sample test might not hold. The results of both the classic and bootstrap tests are presented and contrasted.
Date: August 1, 1995
Creator: Glosup, J.G. & Axelrod, M.C.
Partner: UNT Libraries Government Documents Department
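
The bootstrapped Hotelling test described above can be sketched generically: compute the one-sample T² for the paired differences, then recompute it over resamples of the mean-centered differences to build an empirical null distribution. All data below are simulated, not the paper's measurements:

```python
import numpy as np

def t_squared(d):
    """One-sample Hotelling T^2 for H0: mean difference = 0."""
    n = d.shape[0]
    dbar = d.mean(axis=0)
    S = np.cov(d, rowvar=False)
    return n * dbar @ np.linalg.solve(S, dbar)

rng = np.random.default_rng(6)

# Hypothetical paired differences: two techniques, 25 items, 3 constituents.
n = 25
diffs = rng.multivariate_normal([0.1, 0.0, -0.05],
                                0.04 * np.eye(3) + 0.01, size=n)

t2_obs = t_squared(diffs)

# Bootstrap the null: center the differences so H0 holds, then resample rows.
centered = diffs - diffs.mean(axis=0)
B = 5000
t2_boot = np.array([
    t_squared(centered[rng.integers(0, n, n)]) for _ in range(B)
])
p_value = np.mean(t2_boot >= t2_obs)
print(f"T^2 = {t2_obs:.2f}, empirical p-value = {p_value:.4f}")
```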

A method for detecting changes in long time series

Description: Modern scientific activities, both physical and computational, can result in time series of many thousands or even millions of data values. Here the authors describe a statistically motivated algorithm for quickly screening very long time series for the presence of potentially interesting but arbitrary changes. The basic data model is a stationary Gaussian stochastic process, and the approach to detecting a change is the comparison of two predictions of the series at a time point or contiguous collection of time points. One prediction is a "forecast", i.e., based on data from earlier times, while the other is a "backcast", i.e., based on data from later times. The statistic is the absolute value of the log-likelihood ratio for these two predictions, evaluated at the observed data. A conservative procedure is suggested for specifying critical values for the statistic under the null hypothesis of "no change".
Date: September 1, 1995
Creator: Downing, D.J.; Lawkins, W.F.; Morris, M.D. & Ostrouchov, G.
Partner: UNT Libraries Government Documents Department
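
The forecast/backcast comparison can be illustrated with a deliberately crude stand-in: fit an AR(1) predictor to data before and after a candidate time point, predict the point from each side, and take the absolute log-likelihood ratio of the two Gaussian predictions. The authors' stationary-Gaussian-process machinery is more general; this shows only the shape of the idea:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic series with a level shift at t = 600.
x = rng.normal(0, 1, 1200)
x[600:] += 2.0

def ar1_predict(window):
    """Fit AR(1) to a window; return prediction of the next value and noise sd."""
    coef = np.polyfit(window[:-1], window[1:], 1)      # slope + intercept
    resid = window[1:] - np.polyval(coef, window[:-1])
    return np.polyval(coef, window[-1]), resid.std() + 1e-9

def change_statistic(x, t, w=100):
    """|log-likelihood ratio| of forecast vs backcast at observation x[t]."""
    fore_mu, fore_sd = ar1_predict(x[t - w : t])        # from earlier data
    back_mu, back_sd = ar1_predict(x[t + w : t : -1])   # from later data, reversed
    def loglik(mu, sd):
        return -0.5 * np.log(2 * np.pi * sd**2) - (x[t] - mu) ** 2 / (2 * sd**2)
    return abs(loglik(fore_mu, fore_sd) - loglik(back_mu, back_sd))

stat_vals = [change_statistic(x, t) for t in range(150, 1050)]
print("most suspicious index:", 150 + int(np.argmax(stat_vals)))  # near 600
```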

Investigation of an empirical probability measure based test for multivariate normality

Description: Foutz (1980) derived a goodness-of-fit test for a hypothesis specifying a continuous, p-variate distribution. The test statistic is both distribution-free and independent of p. In adapting the Foutz test for multivariate normality, we consider using χ² and rescaled beta variates in constructing statistically equivalent blocks. The Foutz test is compared to other multivariate normality tests developed by Hawkins (1981) and Malkovich and Afifi (1973). The set of alternative distributions tested includes Pearson type II and type VII, Johnson translations, Plackett, and distributions arising from Khintchine's theorem. Univariate alternatives from the general class developed by Johnson et al. (1980) were also used. An empirical study confirms that the test statistic is independent of p even when parameters are estimated. In general, the Foutz test is less conservative under the null hypothesis but has poorer power under most alternatives than the other tests.
Date: January 1, 1984
Creator: Booker, J.M.; Johnson, M.E. & Beckman, R.J.
Partner: UNT Libraries Government Documents Department
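
The Foutz statistic itself is not widely implemented, so the sketch below substitutes a standard related diagnostic rather than any of the tests compared in this paper: under multivariate normality, squared Mahalanobis distances are approximately χ² with p degrees of freedom (only approximately, since the parameters are estimated), and the distance between their empirical distribution and that reference flags non-normality:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

def mahalanobis_sq(X):
    """Squared Mahalanobis distances of each row from the sample mean."""
    Xc = X - X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    return np.einsum("ij,ij->i", Xc, np.linalg.solve(S, Xc.T).T)

def normality_check(X):
    """KS distance between d^2 and the chi-squared(p) reference."""
    d2 = mahalanobis_sq(X)
    return stats.kstest(d2, stats.chi2(df=X.shape[1]).cdf).statistic

p, n = 4, 300
gauss = rng.multivariate_normal(np.zeros(p), np.eye(p), n)
heavy = rng.standard_t(df=3, size=(n, p))   # heavy-tailed alternative

print("Gaussian sample:    ", round(normality_check(gauss), 3))
print("heavy-tailed sample:", round(normality_check(heavy), 3))
```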

From the Outside In: A Multivariate Correlational Analysis of Effectiveness in Communities of Practice

Description: Online communities of practice (CoPs) provide social spaces for people to connect, learn, and engage with one another around shared interests and passions. CoPs are innovatively employed within industry and education for their inherent knowledge management characteristics and as a means of improving professional practice. Measuring the success of a CoP is a challenge researchers are examining through various strategies. Recent literature supports measuring community effectiveness through the perceptions of its members; however, evaluating a community by means of member perception introduces complicating factors from outside the community. In order to gain insight into the importance of external factors, this quantitative study examined the influence of factors in the professional lives of educators on their perceptions of their CoP experience. Through an empirical examination of CoPs employed to connect educators and advance their professional learning, canonical correlation analysis was used to examine correlations between factors believed to be influential on the experiences of community members.
Date: August 2016
Creator: Bomar, Shannon Hulbert
Partner: UNT Libraries
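
Canonical correlation analysis, as used in this dissertation, pairs a set of external-factor variables with a set of perception measures and finds maximally correlated linear combinations of each. A generic sketch with invented variables (not the study's instrument or data):

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(9)

# Hypothetical data for 150 educators.
n = 150
latent = rng.normal(size=(n, 1))
# "External" professional factors (e.g., workload, support, tenure).
X = latent @ rng.normal(size=(1, 3)) + rng.normal(size=(n, 3))
# Perceptions of the community-of-practice experience (e.g., value, belonging).
Y = latent @ rng.normal(size=(1, 2)) + rng.normal(size=(n, 2))

cca = CCA(n_components=2)
Xc, Yc = cca.fit_transform(X, Y)

# Canonical correlations: correlation between each pair of canonical variates.
for k in range(2):
    r = np.corrcoef(Xc[:, k], Yc[:, k])[0, 1]
    print(f"canonical correlation {k + 1}: {r:.3f}")
```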

Extracting bb Higgs Decay Signals using Multivariate Techniques

Description: For low-mass Higgs boson production at ATLAS at √s = 7 TeV, the hard subprocess gg → h⁰ → bb̄ dominates but is in turn drowned out by background. We seek to exploit the intrinsic few-MeV mass width of the Higgs boson to observe it above the background in bb̄-dijet mass plots. The mass resolution of existing mass-reconstruction algorithms is insufficient for this purpose due to jet combinatorics; that is, the algorithms cannot identify every jet that results from bb̄ Higgs decay. We combine these algorithms using the neural net (NN) and boosted regression tree (BDT) multivariate methods in an attempt to improve the mass resolution. Events involving gg → h⁰ → bb̄ are generated using Monte Carlo methods with Pythia, and the Toolkit for Multivariate Analysis (TMVA) is then used to train and test NNs and BDTs. For a 120 GeV Standard Model Higgs boson, the h⁰ mass-reconstruction width is reduced from 8.6 to 6.5 GeV. Most importantly, however, the methods used here allow more advanced h⁰ mass reconstructions to be created in the future using multivariate methods.
Date: August 28, 2012
Creator: Smith, W Clarke & /SLAC, /George Washington U.
Partner: UNT Libraries Government Documents Department
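
The approach described above treats several imperfect mass reconstructions as input features and learns a combined estimate with a boosted tree. The sketch below substitutes scikit-learn's gradient boosting for TMVA and uses an entirely synthetic stand-in for jet combinatorics (all numbers invented):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(10)

# Toy events: spread the true mass so the tree learns a mapping, and give each
# "algorithm" Gaussian resolution plus occasional badly wrong values that
# mimic picking the wrong jet pairing.
n = 30000
m_true = rng.uniform(100, 140, n)

def toy_reco(scale, fail_rate):
    m = m_true + rng.normal(0, scale, n)
    bad = rng.random(n) < fail_rate          # wrong jet combination
    m[bad] = rng.uniform(60, 180, bad.sum())
    return m

X = np.column_stack([toy_reco(8.0, 0.2), toy_reco(10.0, 0.1), toy_reco(6.0, 0.3)])

train, test = slice(0, 20000), slice(20000, None)
reg = GradientBoostingRegressor(n_estimators=200, max_depth=3)
reg.fit(X[train], m_true[train])

resid_single = (X[test] - m_true[test, None]).std(axis=0)
resid_comb = (reg.predict(X[test]) - m_true[test]).std()
print("residual widths, single algorithms:", np.round(resid_single, 1), "GeV")
print("residual width, combined regression:", round(float(resid_comb), 1), "GeV")
```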

Dark Photon Search at BABAR

Description: Presented is the current progress of a search for the signature of a dark photon or new particle using the BaBar data set. We search for the processes e⁺e⁻ → γ_ISR A′, A′ → e⁺e⁻ and e⁺e⁻ → γ_ISR γ, γ → A′, A′ → e⁺e⁻, where γ_ISR is an initial-state radiated photon of energy E_γ ≥ 1 GeV. Twenty-five sets of Monte Carlo events, simulating e⁺e⁻ collisions at an energy of 10.58 GeV, were produced with different values of the A′ mass ranging from 100 MeV to 9.5 GeV. The mass resolution is calculated based on Monte Carlo simulations. We implement ROOT's Toolkit for Multivariate Analysis (TMVA), a machine learning tool that allows us to evaluate the signal character of events based on many discriminating variables. TMVA training is conducted with samples of Monte Carlo as signal and a small portion of Run 6 as background. The multivariate analysis produces additional cuts to separate signal and background. The signal efficiency and sensitivity are calculated. The analysis will move forward to fit the background and scan the residuals for the narrow resonance peak of a new particle.
Date: September 7, 2012
Creator: Greenwood, Ross N & /SLAC, /MIT
Partner: UNT Libraries Government Documents Department
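
The TMVA step in this analysis amounts to training a classifier on simulated signal versus data-like background and then cutting on its response. A scikit-learn stand-in with invented discriminating variables:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(11)

# Invented kinematic features (e.g., photon energy, pair mass, helicity angle).
n = 20000
sig = rng.normal([2.0, 0.5, 0.0], [0.5, 0.2, 1.0], (n, 3))
bkg = rng.normal([1.5, 0.8, 0.3], [0.7, 0.3, 1.0], (n, 3))
X = np.vstack([sig, bkg])
y = np.r_[np.ones(n), np.zeros(n)]
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

clf = GradientBoostingClassifier(n_estimators=200).fit(Xtr, ytr)
score = clf.predict_proba(Xte)[:, 1]

# Choose the response cut that maximizes s / sqrt(s + b) on the test sample.
cuts = np.linspace(0.05, 0.95, 91)
signif = []
for c in cuts:
    s = np.sum((score > c) & (yte == 1))
    b = np.sum((score > c) & (yte == 0))
    signif.append(s / np.sqrt(s + b) if s + b > 0 else 0.0)
best = cuts[int(np.argmax(signif))]
print(f"best response cut: {best:.2f}, significance {max(signif):.1f}")
```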

3D optical sectioning with a new hyperspectral confocal fluorescence imaging system.

Description: A novel hyperspectral fluorescence microscope for high-resolution 3D optical sectioning of cells and other structures has been designed, constructed, and used to investigate a number of different problems. We have significantly extended new multivariate curve resolution (MCR) data analysis methods to deconvolve the hyperspectral image data and to rapidly extract quantitative 3D concentration distribution maps of all emitting species. The imaging system has many advantages over current confocal imaging systems including simultaneous monitoring of numerous highly overlapped fluorophores, immunity to autofluorescence or impurity fluorescence, enhanced sensitivity, and dramatically improved accuracy, reliability, and dynamic range. Efficient data compression in the spectral dimension has allowed personal computers to perform quantitative analysis of hyperspectral images of large size without loss of image quality. We have also developed and tested software to perform analysis of time resolved hyperspectral images using trilinear multivariate analysis methods. The new imaging system is an enabling technology for numerous applications including (1) 3D composition mapping analysis of multicomponent processes occurring during host-pathogen interactions, (2) monitoring microfluidic processes, (3) imaging of molecular motors and (4) understanding photosynthetic processes in wild type and mutant Synechocystis cyanobacteria.
Date: February 1, 2007
Creator: Nieman, Linda T.; Sinclair, Michael B.; Davidson, George S.; Van Benthem, Mark Hilary; Haaland, David Michael; Timlin, Jerilyn Ann et al.
Partner: UNT Libraries Government Documents Department
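
Multivariate curve resolution, as extended by the authors, factors the hyperspectral data matrix D into non-negative concentration maps C and component spectra S (D ≈ CS) by alternating least squares. A minimal MCR-ALS sketch on synthetic two-fluorophore data; the production analysis is far more elaborate, and non-negativity here is enforced by crude clipping rather than true NNLS:

```python
import numpy as np

rng = np.random.default_rng(12)

# Synthetic data: 500 pixels x 64 wavelengths, two overlapped emitters.
wl = np.linspace(0, 1, 64)
S_true = np.stack([np.exp(-((wl - 0.40) / 0.10) ** 2),
                   np.exp(-((wl - 0.55) / 0.12) ** 2)])     # spectra (2 x 64)
C_true = rng.random((500, 2))                               # concentrations
D = C_true @ S_true + 0.01 * rng.normal(size=(500, 64))

# MCR-ALS: alternate least-squares updates for C and S, clipping to >= 0.
k = 2
S = rng.random((k, wl.size))                    # random initial spectra
for _ in range(200):
    C = np.clip(D @ np.linalg.pinv(S), 0, None)     # update concentration maps
    S = np.clip(np.linalg.pinv(C) @ D, 0, None)     # update component spectra
    S /= S.max(axis=1, keepdims=True) + 1e-12       # fix scale ambiguity

resid = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
print(f"relative residual after ALS: {resid:.3f}")
```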

FCV Learning Demonstration: Project Midpoint Status and First-Generation Vehicle Results; Preprint

Description: This paper covers the progress accomplished by the U.S. DOE's Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project since inception, including results from analysis of six months of new data.
Date: December 1, 2007
Creator: Wipke, K.; Sprik, S.; Kurtz, J.; Thomas, H. & Garbak, J.
Partner: UNT Libraries Government Documents Department

Spatially Explicit Modeling of West Nile Virus Risk Using Environmental Data

Description: West Nile virus (WNV) is an emerging infectious disease with widespread implications for public health practitioners across the world. Within a few years of its arrival in the United States the virus had spread across the North American continent. This research focuses on the development of a spatially explicit, GIS-based predictive epidemiological model based on suitable environmental factors. We examined eleven commonly mapped environmental factors using both ordinary least squares (OLS) regression and geographically weighted regression (GWR). The GWR model was utilized to ascertain the impact of environmental factors on WNV risk patterns without the confounding effects of the spatial non-stationarity that exists between place and health. It identifies the important underlying environmental factors related to suitable mosquito habitat conditions to make meaningful and spatially explicit predictions. Our model represents a multi-criteria decision analysis approach to creating disease risk maps in data-sparse situations. The best-fitting model, with an adjusted R² of 0.71, revealed a strong association between WNV infection risk and a subset of environmental risk factors including road density, stream density, and land surface temperature. This research also postulates that understanding the underlying place characteristics and population composition for the occurrence of WNV infection is important for mitigating future outbreaks. While many spatial and aspatial models have attempted to predict the risk of WNV transmission, efforts to link these factors within a GIS framework are limited. One of the major challenges for such integration is the high dimensionality and large volumes typically associated with such models and data. This research uses a spatially explicit, multivariate geovisualization framework to integrate an environmental model of mosquito habitat with human risk factors derived from socio-economic and demographic variables. Our results show that such an integrated approach facilitates the exploratory analysis of complex data and supports reasoning about the underlying spatial ...
Date: December 2015
Creator: Kala, Abhishek K.
Partner: UNT Libraries
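
Geographically weighted regression fits a separate weighted least-squares regression at every location, with weights that decay with distance, so coefficients can vary over space. A compact sketch on synthetic points (a stand-in for the study's environmental layers; all names, values, and the bandwidth are invented):

```python
import numpy as np

rng = np.random.default_rng(13)

# Synthetic study area: 400 locations, one predictor whose effect drifts east-west.
n = 400
coords = rng.uniform(0, 100, (n, 2))
road_density = rng.random(n)
beta_true = 0.5 + 0.04 * coords[:, 0]           # spatially varying coefficient
wnv_risk = beta_true * road_density + rng.normal(0, 0.1, n)

def gwr_coefficients(coords, X, y, bandwidth=20.0):
    """Fit a Gaussian-kernel weighted regression at every location."""
    Xd = np.column_stack([np.ones_like(y), X])
    betas = np.empty((len(y), Xd.shape[1]))
    for i in range(len(y)):
        d2 = np.sum((coords - coords[i]) ** 2, axis=1)
        w = np.exp(-d2 / (2 * bandwidth**2))    # distance-decay weights
        XtW = Xd.T * w
        betas[i] = np.linalg.solve(XtW @ Xd, XtW @ y)
    return betas

betas = gwr_coefficients(coords, road_density, wnv_risk)
# The local slope should increase from west to east, unlike a single OLS slope.
west = betas[coords[:, 0] < 30, 1].mean()
east = betas[coords[:, 0] > 70, 1].mean()
print(f"mean local slope, west: {west:.2f}; east: {east:.2f}")
```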

Studies of Nu-mu to Nu-e Oscillation Appearance in the MINOS Experiment

Description: The MINOS experiment uses a long-baseline neutrino beam, measured 1 km downstream from its origin in the Near Detector at Fermilab, and 734 km later in the large underground Far Detector in the Soudan mine. By comparing these two measurements, MINOS can probe the atmospheric domain of neutrino oscillation phenomenology with unprecedented precision. Besides the ability to perform a world-leading determination of the Δm₂₃² and θ₂₃ parameters via ν_μ flux disappearance, MINOS has the potential to make a leading measurement of ν_μ → ν_e oscillations in the atmospheric sector by looking for ν_e appearance at the Far Detector. The observation of ν_e appearance, tantamount to establishing a non-zero value of the θ₁₃ mixing angle, opens the way to studies of CP violation in the leptonic sector, the ordering of the neutrino mass spectrum, and neutrino oscillations in matter, the driving motivations of the next generation of neutrino experiments. In this thesis, we study the MINOS potential for measuring θ₁₃ in the context of the MINOS Mock Data Challenge using a multivariate discriminant analysis method. We show the method's validity in its application to ν_e event classification and background identification, as well as in its ability to identify a ν_e signal in a Mock Data sample generated with undisclosed parameters. An independent shower reconstruction method based on three-dimensional hit matching and clustering was developed, providing several useful discriminator variables used in the multivariate analysis method. We also demonstrate that within 2 years of running, MINOS has the potential to improve the current best limit on θ₁₃, from the CHOOZ experiment, by a factor of 2.
Date: December 1, 2005
Creator: Pereira e Sousa, Alexandre Bruno & U., /Tufts
Partner: UNT Libraries Government Documents Department

Single top quark production at D0

Description: We present the first evidence for the production of single top quarks at the Fermilab Tevatron pp̄ collider. Using a 0.9 fb⁻¹ dataset, we apply a multivariate analysis to separate signal from background and measure the cross section for single top quark production. We use the cross section measurement to directly determine the CKM matrix element that describes the Wtb coupling. We also present results of W′ and charged Higgs searches with the same final states as standard model single top quark production.
Date: July 1, 2008
Creator: Jabeen, S. & U., /Boston
Partner: UNT Libraries Government Documents Department

Search for WH associated production in 5.3 fb⁻¹ of pp̄ collisions at the Fermilab Tevatron

Description: We present a search for associated production of Higgs and W bosons in pp̄ collisions at a center-of-mass energy of √s = 1.96 TeV in 5.3 fb⁻¹ of integrated luminosity recorded by the D0 experiment. Multivariate analysis techniques are applied to events containing one lepton, an imbalance in transverse energy, and one or two b-tagged jets to discriminate a potential WH signal from standard model backgrounds. We observe good agreement between data and background, and set an upper limit of 4.5 (at 95% confidence level and for m_H = 115 GeV) on the ratio of the WH cross section multiplied by the branching fraction of H → bb̄ to its standard model prediction. A limit of 4.8 is expected from simulation.
Date: December 1, 2010
Creator: Abazov, Victor Mukhamedovich; Abbott, Braden Keim; Acharya, Bannanje Sripath; Adams, Mark Raymond; Adams, Todd; Alexeev, Guennadi D. et al.
Partner: UNT Libraries Government Documents Department

Automated detection and analysis of particle beams in laser-plasma accelerator simulations

Description: Numerical simulations of laser-plasma wakefield (particle) accelerators model the acceleration of electrons trapped in plasma oscillations (wakes) left behind when an intense laser pulse propagates through the plasma. The goal of these simulations is to better understand the processes involved in plasma wake generation and in how electrons are trapped and accelerated by the wake. Such accelerators offer high accelerating gradients, potentially reducing the size and cost of new accelerators. One operating regime of interest is where a trapped subset of electrons loads the wake and forms an isolated group of accelerated particles with low spread in momentum and position, desirable characteristics for many applications. The electrons trapped in the wake may be accelerated to high energies, the plasma gradient in the wake reaching up to a gigaelectronvolt per centimeter. High-energy electron accelerators power intense radiation sources ranging from X-ray to terahertz, and are used in many applications including medical radiotherapy and imaging. To extract information from a simulation about the quality of the beam, a typical approach is to examine plots of the entire dataset, visually determining the parameters necessary to select a subset of particles, which is then further analyzed. This procedure requires laborious examination of massive data sets over many time steps using several plots, a routine that is unfeasible for large data collections. Demand for automated analysis is growing along with the volume and size of simulations. Current 2D LWFA simulation datasets are typically between 1 GB and 100 GB in size, but simulations in 3D are of the order of terabytes. The increase in the number of datasets and dataset sizes leads to a need for automatic routines that recognize particle patterns as particle bunches (beams of electrons) for subsequent analysis. Because of the growth in dataset size, the application of machine learning techniques for ...
Date: May 21, 2010
Creator: Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.; Bethel, E. Wes; Jacobsen, J.; Prabhat, , et al.
Partner: UNT Libraries Government Documents Department
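
Automatically isolating a compact accelerated bunch in particle phase space, as motivated above, can be approached with density-based clustering. A DBSCAN sketch on synthetic (position, momentum) data; this illustrates the idea only and is not the authors' pipeline (all parameters and thresholds are invented):

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(14)

# Synthetic phase space: diffuse background plasma plus one compact bunch
# with high momentum and low spread (the beam-like feature of interest).
background = np.column_stack([rng.uniform(0, 100, 20000),     # position (arb.)
                              rng.exponential(5.0, 20000)])   # momentum (arb.)
bunch = np.column_stack([rng.normal(60, 0.5, 800),
                         rng.normal(80, 1.5, 800)])
pts = np.vstack([background, bunch])

scaled = StandardScaler().fit_transform(pts)
labels = DBSCAN(eps=0.05, min_samples=20).fit_predict(scaled)

# Report dense clusters with high mean momentum as beam candidates.
for lab in set(labels) - {-1}:
    sel = pts[labels == lab]
    if sel[:, 1].mean() > 30:        # crude "high energy" criterion
        print(f"cluster {lab}: {len(sel)} particles, "
              f"<p> = {sel[:, 1].mean():.1f}, sigma_x = {sel[:, 0].std():.2f}")
```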

Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples

Description: Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes that will lead to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharides, pure proteins and protein mixture samples, all of LDA, PLSDA, and SIMCA were found to produce excellent classification given a sufficient number of compound variables calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. The decision tree analysis was found to be the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data that can be followed by either LDA or PLSDA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples. Applying these techniques to information-rich mass spectral data sets opens the possibilities for new applications including classification of subtly different biological samples that ...
Date: October 24, 2007
Creator: Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S & Wu, K J
Partner: UNT Libraries Government Documents Department
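
The workflow recommended above (PCA first for structure, then a supervised classifier such as LDA) maps onto a standard pipeline. A sketch with simulated "spectra" whose peak structure is invented, not ToF-SIMS data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(15)

# Simulated mass spectra for three sample classes, 300 channels each.
def make_class(n, centers):
    base = np.zeros(300)
    for c in centers:
        base[c] = 1.0
    return rng.poisson(50 * base + 2, size=(n, 300)).astype(float)

X = np.vstack([make_class(40, [50, 120]),
               make_class(40, [50, 125]),      # subtly shifted peak
               make_class(40, [55, 120])])
y = np.repeat([0, 1, 2], 40)

# PCA for dimensionality reduction, then LDA for classification,
# mirroring the PCA-then-LDA strategy the abstract recommends.
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```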

Single top quark production and Vtb at the Tevatron

Description: Single top quark production via the electroweak interaction was observed by the D0 and CDF collaborations at the Tevatron proton-antiproton collider at Fermilab. Multivariate analysis techniques are employed to extract the small single top quark signal. The combined Tevatron cross section is 2.76 +0.58/−0.47 pb. This corresponds to a lower limit on the CKM matrix element |V_tb| of 0.77. Also reported are measurements of the t-channel cross section, the top quark polarization in single top quark events, and limits on gluon-quark flavor-changing neutral currents and W′ boson production.
Date: September 1, 2010
Creator: Schwienhorst, Reinhard & U., /Michigan State
Partner: UNT Libraries Government Documents Department