10 Matching Results


Investigation of novel decay B⁺ → ψ(2S)ωK⁺ at BaBar

Description: We investigate the undocumented B meson decay B⁺ → ψ(2S)ωK⁺. The data were collected with the BaBar detector at the SLAC PEP-II asymmetric-energy e⁺e⁻ collider operating at the Υ(4S) resonance, a center-of-mass energy of 10.58 GeV. The Υ(4S) resonance decays primarily to pairs of B mesons. The BaBar collaboration at the PEP-II ring was located at the SLAC National Accelerator Laboratory and was designed to study the collisions of positrons and electrons. Because the e⁺ and e⁻ beams collide at asymmetric energies, the resulting center of mass travels at relativistic speed. The resulting time dilation allows the decaying particles to travel large distances through the detector before undergoing their rapid decays, a process that in the center-of-mass frame occurs over extremely small distances. As the particles travel through the silicon vertex tracker, a drift chamber, a Cherenkov radiation detector, and finally an electromagnetic calorimeter, we measure charge, energy, momentum, and particle identification in order to reconstruct the decays that have occurred. While all well-understood mesons currently fit the quark-antiquark (qq̄) model, the quark model has no a priori exclusion of higher configurations such as four-quark (qq̄qq̄) states, which has led experimentalists and theorists alike to seek evidence supporting the existence of such states. The Particle Data Group currently catalogs hundreds of known decay modes of the B mesons, but collectively they account for only approximately 60% of the B branching fraction, and it is possible that many more exist.
Date: June 22, 2011
Creator: Schalch, Jacob & /SLAC, /Oberlin Coll.
Partner: UNT Libraries Government Documents Department

Diborane Electrode Response in 3D Silicon Sensors for the CMS and ATLAS Experiments

Description: Unusually high leakage currents have been measured in test wafers, produced by the manufacturer SINTEF, containing 3D pixel silicon sensor chips designed for the ATLAS (A Toroidal LHC ApparatuS) and CMS (Compact Muon Solenoid) experiments. Previous data have shown the CMS chips having lower leakage current after processing than the ATLAS chips. Proposed causes of the leakage currents include the dicing process and the use of copper in bump bonding, with differences in packaging and handling between the ATLAS and CMS chips suggested as the cause of the disparity between the two. Data taken at SLAC from a SINTEF wafer with electrodes doped with diborane and filled with polysilicon, measured before dicing and with indium bumps added, contradict these earlier data: here the ATLAS chips showed lower leakage current than the CMS chips. Because neither dicing nor copper bump bonding was involved on this wafer, the data also argue against those processes as the main causes of leakage current. However, the chips still display extremely high leakage currents, with the source mostly unknown. The SINTEF wafer shows completely different behavior from the others, as the FEI3 (ATLAS) chips actually performed better than the CMS chips; this argues against differences in packaging and handling, or the intrinsic geometry of the two designs, as the cause of the disparity between their leakage currents. Even though the leakage current in the FEI3s is lower overall, it is still significant enough to cause problems. To complement this information, more data will be taken on the efficiency of the individual electrodes of the ATLAS and CMS chips on this wafer. The electrodes will be ...
Date: June 22, 2011
Creator: Brown, Emily R. & /SLAC, /Reed Coll.
Partner: UNT Libraries Government Documents Department

PyDecay/GraphPhys: A Unified Language and Storage System for Particle Decay Process Descriptions

Description: To ease the tasks of Monte Carlo (MC) simulation and event reconstruction (i.e. inferring particle-decay events from experimental data) for long-term BaBar data preservation and analysis, the following software components have been designed: a language ('GraphPhys') for specifying decay processes, common to both simulation and data analysis, allowing arbitrary parameters on particles, decays, and entire processes; an automated visualization tool to show graphically what decays have been specified; and a searchable database storage mechanism for decay specifications. Unlike HepML, a proposed XML standard for HEP metadata, the specification language is designed not for data interchange between computer systems, but rather for direct manipulation by human beings as well as computers. The components are interoperable: the information parsed from files in the specification language can easily be rendered as an image by the visualization package, and conversion between decay representations was implemented. Several proof-of-concept command-line tools were built based on this framework. Applications include building easier and more efficient interfaces to existing analysis tools for current projects (e.g. BaBar/BESII), providing a framework for analyses in future experimental settings (e.g. LHC/SuperB), and outreach programs that involve giving students access to BaBar data and analysis tools to give them a hands-on feel for scientific analysis.
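The GraphPhys syntax itself is not reproduced in this abstract, but its core idea, a decay tree whose particles and decays carry arbitrary parameters and which can be rendered for humans as well as machines, can be sketched in Python. The names Decay and render below are hypothetical illustrations, not PyDecay's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Decay:
    """One node of a decay tree: a parent particle, optional parameters,
    and the list of child decays (its products)."""
    particle: str
    params: dict = field(default_factory=dict)
    products: list = field(default_factory=list)

def render(node, indent=0):
    """Return an indented, human-readable text rendering of a decay tree."""
    tag = " ".join(f"{k}={v}" for k, v in node.params.items())
    lines = ["  " * indent + node.particle + (f"  [{tag}]" if tag else "")]
    for child in node.products:
        lines.extend(render(child, indent + 1))
    return lines

# The B+ -> psi(2S) omega K+ decay from the abstract above, with an
# illustrative per-process parameter attached to the parent:
spec = Decay("B+", {"model": "PHSP"}, [
    Decay("psi(2S)"), Decay("omega"), Decay("K+"),
])
print("\n".join(render(spec)))
```

A searchable database layer would then store and query these trees by particle name or parameter, which is the interoperability the abstract describes.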
Date: June 22, 2011
Creator: Dunietz, Jesse N. & /SLAC, /MIT
Partner: UNT Libraries Government Documents Department

Parallelizing AT with MatlabMPI

Description: The Accelerator Toolbox (AT) is a high-level collection of tools and scripts oriented toward solving problems in computational accelerator physics. It is integrated into the MATLAB environment, which provides an accessible, intuitive interface for accelerator physicists, allowing researchers to focus the majority of their effort on simulations and calculations rather than on programming and debugging. Efforts to parallelize AT have been undertaken to bring its performance up to modern standards of computing. We utilized the MatlabMPI and pMatlab packages, developed by MIT Lincoln Laboratory, to set up a message-passing environment callable from within MATLAB, providing the prerequisites for multi-process execution. On local quad-core CPUs, we demonstrated processor efficiencies of roughly 95% and speed increases of nearly 380%. By exploiting modern parallel computing, we achieved highly efficient per-processor speedups in AT's beam-tracking functions. Extrapolating from these results, we expect to reduce week-long computation runtimes to less than 15 minutes, a huge performance improvement with significant implications for the future computing power of the accelerator physics group at SSRL. However, one downfall of parringpass, the parallelized tracking routine, is its current lack of transparency: the pMatlab and MatlabMPI packages must first be well understood by the user before the system can be configured to run the scripts. In addition, setting argument parameters requires internal modification of the source code. Thus parringpass cannot be run directly from the MATLAB command line, which detracts from its flexibility and user-friendliness.
Future work in AT's parallelization will focus on development of external functions and scripts that can be called from within MATLAB and configured on multiple nodes, while expending minimal communication overhead with the integrated MATLAB library.
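AT's parringpass is MATLAB code built on MatlabMPI, but the underlying scatter-track-gather pattern it uses can be sketched in Python. Here track is a toy stand-in for tracking one particle through a lattice (AT's ringpass does the real work), and parallel_track and nproc are illustrative names, not part of AT:

```python
from multiprocessing import Pool

def track(particle):
    """Toy stand-in for single-particle beam tracking: apply a simple
    linear map to the (x, x') phase-space coordinates."""
    x, xp = particle
    return (x + 0.1 * xp, xp - 0.1 * x)

def parallel_track(particles, nproc=4):
    """Scatter particles across worker processes, track each, and gather
    the results -- the divide-and-gather pattern parringpass implements
    with MatlabMPI message passing."""
    if nproc == 1:  # sequential fallback (also convenient for testing)
        return [track(p) for p in particles]
    with Pool(nproc) as pool:
        return pool.map(track, particles)

if __name__ == "__main__":
    beam = [(0.001 * i, 0.0) for i in range(8)]
    print(parallel_track(beam))
```

Because single-particle tracking is embarrassingly parallel (particles do not interact in this model), near-linear speedup with processor count is expected, consistent with the roughly 95% efficiency quoted above.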
Date: June 22, 2011
Creator: Li, Evan Y. & /SLAC, /Brown U.
Partner: UNT Libraries Government Documents Department

LSST Charge-Coupled Device Calibration

Description: The prototype charge-coupled device created at the Stanford Linear Accelerator Center for the Large Synoptic Survey Telescope must be tested to check its functionality and performance. It was installed in the Calypso telescope in Arizona in November of 2008 for this purpose and has since taken many images of various astronomical objects. By doing photometry on standard stars in these images, we can compare our magnitude results to the known magnitudes of these stars; this comparison allows us to determine the chip's performance and functional capabilities. Expected to see first light in 2016, the Large Synoptic Survey Telescope (LSST) is an extremely large ground-based telescope, currently awaiting funding, to be built in Chile. Described as 'Wide-Fast-Deep', the LSST will have an unprecedentedly wide field of view (ten square degrees for surveys), short exposures (fifteen to thirty seconds, while still seeing faint objects), and the largest digital camera in the world. One of the goals for this camera is the measurement of dark matter using strong and weak gravitational lensing. Strong gravitational lensing occurs when a large cluster of galaxies distorts the light from a galaxy behind it, forming an arc of light around the cluster. By measuring the length of this arc, one can calculate how much matter must be present in the cluster. Since that amount is vastly greater than the visible matter observed, the difference between the two numbers is attributed to dark matter. This is a direct way of measuring the amount of dark matter in the universe. Thousands of galaxy clusters will be seen with LSST, allowing precise measurements of strong lensing effects. Weak lensing is a much smaller effect, distorting ...
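The standard-star comparison described above follows the usual photometric calibration recipe: convert measured counts to an instrumental magnitude, then use stars of known catalog magnitude to solve for the zero point. A minimal sketch of that recipe; the function names and count values are illustrative, not taken from the paper:

```python
import math

def instrumental_mag(counts, exptime):
    """Instrumental magnitude from background-subtracted counts
    normalized by exposure time: m_inst = -2.5 log10(counts / t)."""
    return -2.5 * math.log10(counts / exptime)

def zero_point(standards, exptime):
    """Mean offset between catalog and instrumental magnitudes over a
    list of (counts, catalog_mag) standard-star measurements."""
    offsets = [cat - instrumental_mag(c, exptime) for c, cat in standards]
    return sum(offsets) / len(offsets)

# Hypothetical standard-star measurements: (counts, catalog magnitude)
standards = [(120000.0, 14.2), (48000.0, 15.2), (19000.0, 16.2)]
zp = zero_point(standards, exptime=15.0)

# Calibrated magnitude of a target star measured on the same chip:
target_mag = instrumental_mag(30000.0, 15.0) + zp
```

The scatter of the per-star offsets around the mean zero point is then a direct measure of the chip's photometric performance.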
Date: June 22, 2011
Creator: Stout, Tiarra Johannas & /SLAC, /Idaho State U.
Partner: UNT Libraries Government Documents Department

Analysis of Femtosecond Timing Noise and Stability in Microwave Components

Description: To probe chemical dynamics, X-ray pump-probe experiments trigger a change in a sample with an optical laser pulse, followed by an X-ray probe pulse. At the Linac Coherent Light Source (LCLS), timing differences between the optical pulse and the x-ray probe have been measured with an accuracy as fine as 50 femtoseconds. This jitter limits how finely one can order frames in time when reconstructing a 'movie' of a chemical reaction. The timing system is based on phase measurements of signals corresponding to the two laser pulses; these measurements are made using a double-balanced mixer for detection. To increase the accuracy of the system, this paper studies parameters affecting mixer-based phase detection, such as signal input power, noise levels, and temperature drift, and the effect these parameters have on components such as mixers, splitters, amplifiers, and phase shifters. Noise data taken with a spectrum analyzer show that splitters based on ferrite cores perform with less noise than strip-line splitters. The data also show that noise in specific mixers does not track the changes in sensitivity per input power level. Temperature drift is seen on a scale between 1 and 27 fs/°C for all of the components tested. Results show that components using more metallic conductor tend to exhibit both more noise and more temperature drift. The scale of these effects is large enough that specific care should be taken when choosing components and designing the housing of high-precision microwave mixing systems for detection systems such as the LCLS. With these improvements, the timing accuracy can be pushed below what is currently possible.
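The principle behind mixer-based phase detection is that an ideal double-balanced mixer multiplies its two inputs; for two tones at the same frequency, low-pass filtering (averaging) the product leaves a DC level proportional to the cosine of their relative phase. A sketch assuming ideal components; the function names are illustrative:

```python
import math

def mixer_dc(amp1, amp2, phase_rad, n=1000):
    """Averaged (low-pass filtered) output of an ideal mixer for two
    equal-frequency tones with relative phase phi: the mean over one
    period of a1*sin(wt) * a2*sin(wt + phi), which is (a1*a2/2)*cos(phi)."""
    total = 0.0
    for i in range(n):
        wt = 2 * math.pi * i / n
        total += amp1 * math.sin(wt) * amp2 * math.sin(wt + phase_rad)
    return total / n

def phase_from_dc(dc, amp1, amp2):
    """Invert the ideal mixer response to recover the relative phase."""
    return math.acos(2 * dc / (amp1 * amp2))

dc = mixer_dc(1.0, 1.0, math.pi / 3)  # ~0.25 for a 60-degree offset
phi = phase_from_dc(dc, 1.0, 1.0)
```

Because the response goes as cos(phi), sensitivity d(dc)/d(phi) vanishes at phi = 0 and peaks in quadrature (phi = pi/2), which is why input power and operating point matter so much for femtosecond-level resolution.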
Date: June 22, 2011
Creator: Whalen, Michael R. & /SLAC, /Stevens Tech.
Partner: UNT Libraries Government Documents Department

Scientific Needs for Future X-Ray Sources in the U.S.: A White Paper

Description: Many of the important challenges facing humanity, including developing alternative sources of energy and improving health, are being addressed by advances that demand the improved understanding and control of matter. While the visualization, exploration, and manipulation of macroscopic matter have long been technological goals, scientific developments in the twentieth century have focused attention on understanding matter on the atomic scale through the underlying framework of quantum mechanics. Of special interest is matter that consists of natural or artificial nanoscale building blocks defined either by atomic structural arrangements or by electron or spin formations created by collective correlation effects (Figure 1.1). The essence of the challenge to the scientific community has been expressed in five grand challenges for directing matter and energy recently formulated by the Basic Energy Sciences Advisory Committee. These challenges focus on increasing our understanding of, and ultimately control of, matter at the level of atoms, electrons, and spins, as illustrated in Figure 1.1. Meeting these challenges will require new tools that extend our reach into regions of higher spatial, temporal, and energy resolution. Since the fundamental interaction that holds matter together is of electromagnetic origin, it is intuitively clear that electromagnetic radiation is the critical tool in the study of material properties. On the level of atoms, electrons and spins, x rays have proved especially valuable.
Date: October 22, 2008
Creator: Falcone, Roger; Stohr, Joachim; Bergmann, Uwe; Corlett, John; Galayda, John; Hastings, Jerry et al.
Partner: UNT Libraries Government Documents Department

Comparing the Calibration and Simulation Data of the Cryogenic Dark Matter Search

Description: The Cryogenic Dark Matter Search (CDMS) collaboration is preparing a new experiment called SuperCDMS. CDMS uses germanium detectors to attempt the direct detection of dark matter by measuring the ionization and heat produced when a WIMP scatters off the germanium crystal lattice. To prepare for the experiment, the detectors are calibrated with various radioactive sources. The response of the detectors is also modeled by a Monte Carlo simulation, covering everything from radiation production to the raw data collected by the detector. The experimental data will be used to validate the results of the detector simulation. This research looks only at the phonons produced during events that occur very close to the detector surface. From the raw data and the simulation output, three parameters are determined: the rise time, the decay time, and the time to position independence. It was found that the simulation's rise time and time to position independence were generally smaller than those of the data, while the decay time was larger in the simulation than in the data. These differences show that the simulation is not complete. The difference in rise time implies that the phonons are not spread out enough when they reach the detector walls, which could be improved by examining Luke phonon and charge transport. The long decay time in the simulation implies that the rate at which phonons are absorbed is underestimated. Finally, the small time to position independence in the simulation could be due to a low phonon scattering rate. A simple solution may be to alter the parameters controlling the simulation, while remaining physically sensible, to help match simulation and data.
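Two of the three pulse parameters can be illustrated on a toy double-exponential phonon pulse. The 10%-90% convention for rise time and the peak-to-peak/e convention for decay time used below are common choices assumed for illustration, not necessarily the paper's exact definitions, and the time constants are invented:

```python
import math

def pulse(t, tau_rise=5.0, tau_decay=50.0):
    """Toy phonon pulse: double exponential (times in microseconds)."""
    return math.exp(-t / tau_decay) - math.exp(-t / tau_rise)

def rise_time(trace, dt):
    """10%-90% rise time of a sampled trace."""
    peak = max(trace)
    t10 = next(i for i, v in enumerate(trace) if v >= 0.1 * peak) * dt
    t90 = next(i for i, v in enumerate(trace) if v >= 0.9 * peak) * dt
    return t90 - t10

def decay_time(trace, dt):
    """Decay constant estimated as the time for the trace to fall
    from its peak to peak/e."""
    ipk = max(range(len(trace)), key=lambda i: trace[i])
    ie = next(i for i in range(ipk, len(trace))
              if trace[i] <= trace[ipk] / math.e)
    return (ie - ipk) * dt

dt = 0.1  # sample spacing in microseconds
trace = [pulse(i * dt) for i in range(2000)]
rt, dk = rise_time(trace, dt), decay_time(trace, dt)
```

Running the same extraction on both measured and simulated traces, as the study does, turns the qualitative "simulation rises too fast, decays too slowly" comparison into numbers.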
Date: June 22, 2011
Creator: DiFranzo, Anthony & /SLAC, /Rensselaer Poly.
Partner: UNT Libraries Government Documents Department

Calibration Analyses and Efficiency Studies for the Anti Coincidence Detector on the Fermi Gamma Ray Space Telescope

Description: The Anti Coincidence Detector (ACD) on the Fermi Gamma Ray Space Telescope provides charged-particle rejection for the Large Area Telescope (LAT). We use two of the ACD's calibrations to conduct three studies of its performance. We examine the trending of the calibrations to search for damage and to find a timescale over which the calibrations can be considered reliable. We also calculate the number of photoelectrons counted by a PMT on the ACD for a normally incident proton. Third, we calculate the veto efficiencies of the ACD for two different veto settings. The trends of the calibrations exhibited no signs of damage and indicated that the calibrations remain reliable for one to two years. The calculated number of photoelectrons ranged from 5 to 25, although uncertainties in the energy spectrum of the charged particles give these values very large errors of around 60 percent. Finally, the veto efficiencies were found to be very high at both veto values, both for charged particles and for the lower-energy backsplash spectrum. The ACD itself is a detector system built around the silicon strip tracker of the LAT; its purpose is to provide charged-particle rejection for the LAT. To do this, the ACD must be calibrated correctly in flight and must be able to veto charged-particle events efficiently while minimizing false vetoes due to 'backsplash' from photons in the calorimeter. There are eleven calibrations used by the ACD. In this paper, we discuss the use of two of these calibrations to perform three studies on the performance of the ACD. The first study examines trending of the calibrations to check for possible hardware degradation. The second study ...
Date: June 22, 2011
Creator: Kachulis, Chris & /SLAC, /Yale U.
Partner: UNT Libraries Government Documents Department

Clicks versus Citations: Click Count as a Metric in High Energy Physics Publishing

Description: High-energy physicists worldwide rely on online resources such as SPIRES and arXiv to gather research and share their own publications. SPIRES is a tool designed to search the literature of high-energy physics, while arXiv provides the actual full-text documents of this literature. In high-energy physics, papers are often ranked by the number of citations they acquire, meaning the number of times a later paper references the original. This paper investigates the correlation between the number of times a paper is clicked in order to be downloaded and the number of citations it subsequently receives; in effect, it explores whether physicists truly read what they cite.
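A click-citation relationship like the one studied here is typically quantified with a correlation coefficient such as Pearson's r. A minimal sketch with invented per-paper counts; the numbers are hypothetical, not the paper's data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-paper counts: downloads (clicks) vs. later citations
clicks    = [120, 45, 300, 80, 15]
citations = [30, 10, 90, 25, 2]
r = pearson(clicks, citations)
```

An r near 1 would suggest downloads anticipate citations; a weak correlation would support the suspicion that physicists cite papers they never downloaded in full.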
Date: June 22, 2011
Creator: Bitton, Ayelet & /UC, San Diego /SLAC
Partner: UNT Libraries Government Documents Department