Search Results

open access

WHITE PAPER ON PROTON-NUCLEUS COLLISIONS.

Description: The role of proton-nucleus (p-A) collisions in the study of strong interactions has a long history. It has been an important testing ground for QCD. At RHIC p-A studies have been recognized since the beginning as important elements of the program. These include so-called baseline measurements in cold nuclear matter, essential (along with p-p studies) to a systematic study of QCD at high temperatures and densities in the search for the quark gluon plasma. Also accessible is a study of QCD in the small x (parton saturation) regime, complementary to physics accessible in high-energy e-p and e-A collisions. The role of p-A physics at RHIC was reviewed and brought into sharp focus at a workshop conducted in October 2000 at BNL; the agenda is shown in Appendix 1. This document summarizes the case for p-A at RHIC during the period covered by the next Nuclear Physics Long Range Plan. In subsequent sections we cover the Physics Issues, Experiment Run Plans and Schedule, Detector Upgrade Issues, and Machine Issues & Upgrades.
Date: March 1, 2001
Creator: ARONSON,S.H. & PENG,J.C.
Partner: UNT Libraries Government Documents Department
open access

A Search for the Higgs Boson Using Very Forward Tracking Detectors with CDF

Description: The authors propose to add high precision track detectors 55 m downstream on both (E and W) sides of CDF, to measure high Feynman-x protons and antiprotons in association with central states. A primary motivation is to search for the Higgs boson, and if it is seen to measure its mass precisely. The track detectors will be silicon strip telescopes backed up by high resolution time-of-flight counters. They will have four spectrometer arms, for both sides of the p and {bar p} beams. The addition of these small detectors effectively converts the Tevatron into a gluon-gluon collider with {radical}s from 0 to {approx} 200 GeV. This experiment will also measure millions of clean high-|t| elastic p{bar p} scattering events per year and produce millions of pure gluon jets. Besides a wealth of other unique QCD studies, they will search for signs of exotic physics such as SUSY and Large Extra Dimensions. They ask the Director to ask the PAC to take note of this Letter of Intent at its April meeting, to consider a proposal at the June meeting and to make a decision at the November 2001 meeting. They request that the Directorate ask the Beams Division to evaluate the consequences and cost of the proposed Tevatron modifications, and CDF to evaluate any effect on its baseline program and to review the technical aspects of the detectors, DAQ and trigger integration.
Date: March 1, 2001
Creator: Albrow, M. G.; Atac, M.; Booth, P.; Crosby, P.; Dunietz, I.; Finley, D. A. et al.
Partner: UNT Libraries Government Documents Department
open access

SEPARATION OF CO2 FROM FLUE GASES BY CARBON-MULTIWALL CARBON NANOTUBE MEMBRANES

Description: Multiwalled carbon nanotubes (MWNT) were found to be an effective separation medium for removing CO{sub 2} from N{sub 2}. The separation mechanism favors the selective condensation of CO{sub 2} from the flowing gas stream. Significant uptakes of CO{sub 2} were measured at 30 C and 150 C over the pressure range 0.5 to 5 bar. No measurable uptake of nitrogen was found for this range of conditions. The mass uptake of CO{sub 2} by MWNT was found to increase with increasing temperature. A packed bed of MWNT completely removed CO{sub 2} from a flowing stream of CO{sub 2}/N{sub 2}, and exhibited rapid uptake kinetics for CO{sub 2}.
Date: March 1, 2001
Creator: Andrews, Rodney
Partner: UNT Libraries Government Documents Department
open access

Seismic Characterization of Coal-Mining Seismicity in Utah for CTBT Monitoring

Description: Underground coal mining (down to {approx}0.75 km depth) in the contiguous Wasatch Plateau (WP) and Book Cliffs (BC) mining districts of east-central Utah induces abundant seismicity that is monitored by the University of Utah regional seismic network. This report presents the results of a systematic characterization of mining seismicity (magnitude {le} 4.2) in the WP-BC region from January 1978 to June 2000, together with an evaluation of three seismic events (magnitude {le} 4.3) associated with underground trona mining in southwestern Wyoming during January-August 2000. (Unless specified otherwise, magnitude implies Richter local magnitude, M{sub L}.) The University of Utah Seismograph Stations (UUSS) undertook this cooperative project to assist the University of California Lawrence Livermore National Laboratory (LLNL) in research and development relating to monitoring the Comprehensive Test Ban Treaty (CTBT). The project, which formally began February 28, 1998, and ended September 1, 2000, had three basic objectives: (1) Strategically install a three-component broadband digital seismic station in the WP-BC region to ensure the continuous recording of high-quality waveform data to meet the long-term needs of LLNL, UUSS, and other interested parties, including the international CTBT community. (2) Determine source mechanisms--to the extent that available source data and resources allowed--for comparative seismic characterization of stress release in mines versus earthquakes in the WP-BC study region. (3) Gather and report to LLNL local information on mine operations and associated seismicity, including ''ground truth'' for significant events. Following guidance from LLNL's Technical Representative, the focus of Objective 2 was changed slightly to place emphasis on three mining-related events that occurred in and near the study area after the original work plan had been made, thus posing new targets of opportunity.
These included: a magnitude 3.8 shock that occurred close to the Willow Creek coal mine in the Book Cliffs area on February 5, 1998 (UTC …
Date: March 1, 2001
Creator: Arabasz, W J & Pechmann, J C
Partner: UNT Libraries Government Documents Department
open access

MULTISCALE THERMAL-INFRARED MEASUREMENTS OF THE MAUNA LOA CALDERA, HAWAII

Description: Until recently, most thermal infrared measurements of natural scenes have been made at disparate scales, typically 10{sup {minus}3}-10{sup {minus}2} m (spectra) and 10{sup 2}-10{sup 3} m (satellite images), with occasional airborne images (10{sup 1} m) filling the gap. Temperature and emissivity fields are spatially heterogeneous over a similar range of scales, depending on scene composition. A common problem for the land surface, therefore, has been relating field spectral and temperature measurements to satellite data, yet in many cases this is necessary if satellite data are to be interpreted to yield meaningful information about the land surface. Recently, three new satellites with thermal imaging capability at the 10{sup 1}-10{sup 2} m scale have been launched: MTI, TERRA, and Landsat 7. MTI acquires multispectral images in the mid-infrared (3-5{micro}m) and longwave infrared (8-10{micro}m) with 20m resolution. ASTER and MODIS aboard TERRA acquire multispectral longwave images at 90m and 500-1000m, respectively, and MODIS also acquires multispectral mid-infrared images. Landsat 7 acquires broadband longwave images at 60m. As part of an experiment to validate the temperature and thermal emissivity values calculated from MTI and ASTER images, we have targeted the summit region of Mauna Loa for field characterization and near-simultaneous satellite imaging, both on daytime and nighttime overpasses, and compare the results to previously acquired 10{sup {minus}1} m airborne images, ground-level multispectral FLIR images, and the field spectra. Mauna Loa was chosen in large part because the 4x6km summit caldera, flooded with fresh basalt in 1984, appears to be spectrally homogeneous at scales between 10{sup {minus}1} and 10{sup 2} m, facilitating the comparison of sensed temperature. 
The validation results suggest that, with careful atmospheric compensation, it is possible to match ground measurements with measurements from space, and to use the Mauna Loa validation site for cross-comparison of thermal infrared sensors and temperature/emissivity extraction algorithms.
Date: March 1, 2001
Creator: BALICK, L.; GILLESPIE, A. et al.
Partner: UNT Libraries Government Documents Department
open access

Active Control of Magnetically Levitated Bearings

Description: This report summarizes experimental and test results from a two year LDRD project entitled Real Time Error Correction Using Electromagnetic Bearing Spindles. This project was designed to explore various control schemes for levitating magnetic bearings with the goal of obtaining high precision location of the spindle and exceptionally high rotational speeds. As part of this work, several adaptive control schemes were devised, analyzed, and implemented on an experimental magnetic bearing system. Measured results, which indicated precision positional control of the spindle was possible, agreed reasonably well with simulations. Testing also indicated that the magnetic bearing systems were capable of very high rotational speeds but were still not immune to traditional structural dynamic limitations caused by spindle flexibility effects.
Date: March 1, 2001
Creator: BARNEY, PATRICK S.; LAUFFER, JAMES P.; REDMOND, JAMES M. & SULLIVAN, WILLIAM N.
Partner: UNT Libraries Government Documents Department
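The closed-loop stabilization described in this report can be illustrated with a minimal sketch. This is not the authors' adaptive scheme; it is a plain PD loop on a linearized, open-loop-unstable bearing axis, and every name and number below is invented for illustration:

```python
def simulate_bearing_axis(x0=1e-3, v0=0.0, a=100.0, b=1.0,
                          kp=500.0, kd=40.0, dt=1e-4, steps=20000):
    """Semi-implicit Euler simulation of one linearized bearing axis,
    x'' = a*x + b*u, which is unstable for a > 0 (an attractive magnetic
    bearing pulls harder as the gap closes), under PD feedback
    u = -(kp*x + kd*v).  Returns the final position and velocity."""
    x, v = x0, v0
    for _ in range(steps):
        u = -(kp * x + kd * v)   # PD control effort
        acc = a * x + b * u      # plant acceleration
        v += acc * dt            # semi-implicit Euler update
        x += v * dt
    return x, v
```

With these gains the closed-loop dynamics are x'' = -400x - 40x' (critically damped), so a 1 mm initial offset decays to essentially zero over the simulated interval, while the same axis with kp = kd = 0 diverges.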
open access

Progress in heavy ion drivers for inertial fusion energy: From scaled experiments to the integrated research experiment

Description: The promise of inertial fusion energy driven by heavy ion beams requires the development of accelerators that produce ion currents ({approx}100's Amperes/beam) and ion energies ({approx}1-10 GeV) that have not been achieved simultaneously in any existing accelerator. The high currents imply high generalized perveances, large tune depressions, and high space charge potentials of the beam center relative to the beam pipe. Many of the scientific issues associated with ion beams of high perveance and large tune depression have been addressed over the last two decades on scaled experiments at Lawrence Berkeley and Lawrence Livermore National Laboratories, the University of Maryland, and elsewhere. The additional requirement of high space charge potential (or equivalently high line charge density) gives rise to effects (particularly the role of electrons in beam transport) which must be understood before proceeding to a large scale accelerator. The first phase of a new series of experiments in the Heavy Ion Fusion Virtual National Laboratory (HIF VNL), the High Current Experiments (HCX), is now being constructed at LBNL. The mission of the HCX will be to transport beams with driver line charge density so as to investigate the physics of this regime, including constraints on the maximum radial filling factor of the beam through the pipe. This factor is important for determining both cost and reliability of a driver scale accelerator. The HCX will provide data for design of the next steps in the sequence of experiments leading to an inertial fusion energy power plant. The focus of the program after the HCX will be on integration of all of the manipulations required for a driver. In the near term following HCX, an Integrated Beam Experiment (IBX) of the same general scale as the HCX is envisioned. The step which bridges the gap between the IBX and an engineering test …
Date: March 1, 2001
Creator: Barnard, J. J.; Ahle, L. E.; Baca, D.; Bangerter, R. O.; Bieniosek, F. M.; Celata, C. M. et al.
Partner: UNT Libraries Government Documents Department
open access

Remote Laser Diffraction Particle Size Distribution Analyzer

Description: In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC—formerly recognized as the Idaho Chemical Processing Plant) which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable—making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.
Date: March 1, 2001
Creator: Batcheller, Thomas Aquinas; Huestis, Gary Michael & Bolton, Steven Michael
Partner: UNT Libraries Government Documents Department
open access

Overview of the InterGroup protocols

Description: Existing reliable ordered group communication protocols have been developed for local-area networks and do not, in general, scale well to large numbers of nodes and wide-area networks. The InterGroup suite of protocols is a scalable group communication system that introduces a novel approach to handling group membership, and supports a receiver-oriented selection of service. The protocols are intended for a wide-area network, with a large number of nodes, that has highly variable delays and a high message loss rate, such as the Internet. The levels of the message delivery service range from unreliable unordered to reliable group timestamp ordered.
Date: March 1, 2001
Creator: Berket, Karlo; Agarwal, Deborah A.; Melliar-Smith, P. Michael & Moser, Louise E.
Partner: UNT Libraries Government Documents Department
open access

Southern Idaho Wildlife Mitigation Implementation 2000 Annual Report.

Description: This report covers calendar year 2000 activities for the Southern Idaho Wildlife Mitigation Implementation project. This project, implemented by Idaho Department of Fish and Game and Shoshone Bannock Tribes wildlife mitigation staff, is designed to protect, enhance and maintain wildlife habitats to mitigate construction losses for Palisades, Anderson Ranch, Black Canyon and Minidoka hydroelectric projects. Additional project information is available in the quarterly reports.
Date: March 1, 2001
Creator: Bottum, Edward & Mikkelsen, Anders
Partner: UNT Libraries Government Documents Department
open access

Shaft Sinking at the Nevada Test Site, U1h Shaft Project

Description: The U1h Shaft Project is a design/build subcontract to construct one 6.1 meter (m) (20 feet (ft)) finished diameter shaft to a depth of 321.6 m (1,055 ft.) at the Nevada Test Site. Atkinson Construction was subcontracted by Bechtel Nevada to construct the U1h Shaft for the U.S. Department of Energy. The project consists of furnishing and installing the sinking plant, construction of the 321.6 m (1,055 ft.) of concrete lined shaft, development of a shaft station at a depth of 297.5 m (976 ft.), and construction of a loading pocket at the station. The outfitting of the shaft and installation of a new hoist may be incorporated into the project at a later date. This paper will describe the design phase, the excavation and lining operation, shaft station construction and the contractual challenges encountered on this project.
Date: March 1, 2001
Creator: Briggs, B. & Musick, R.
Partner: UNT Libraries Government Documents Department
open access

Excitation of Nucleon Resonances

Description: I discuss developments in the area of nucleon resonance excitation, both necessary and feasible, that would put our understanding of nucleon structure in the regime of strong QCD on a qualitatively new level. They involve the collection of high quality data in various channels, a more rigorous approach in the search for ''missing'' resonances, an effort to compute some critical quantities in nucleon resonance excitations from first principles, i.e. QCD, and a focused proposal to obtain an understanding of a fundamental quantity in nucleon structure.
Date: March 1, 2001
Creator: Burkert, Volker D.
Partner: UNT Libraries Government Documents Department
open access

INSENSITIVE HIGH-NITROGEN COMPOUNDS

Description: The conventional approach to developing energetic molecules is to chemically place one or more nitro groups onto a carbon skeleton, which is why the term ''nitration'' is synonymous to explosives preparation. The nitro group carries the oxygen that reacts with the skeletal carbon and hydrogen fuels, which in turn produces the heat and gaseous reaction products necessary for driving an explosive shock. These nitro-containing energetic molecules typically have heats of formation near zero and therefore most of the released energy is derived from the combustion process. Our investigation of the tetrazine, furazan and tetrazole ring systems has offered a different approach to explosives development, where a significant amount of the chemical potential energy is derived from their large positive heats of formation. Because these compounds often contain a large percentage of nitrogen atoms, they are usually regarded as high-nitrogen fuels or explosives. A general artifact of these high-nitrogen compounds is that they are less sensitive to initiation (e.g. by impact) when compared to traditional nitro-containing explosives of similar performances. Using the precursor, 3,6-bis-(3,5-dimethylpyrazol-1-yl)-s-tetrazine, several useful energetic compounds based on the s-tetrazine system have been synthesized and studied. Some of the first compounds are 3,6-diamino-s-tetrazine-1,4-dioxide (LAX-112) and 3,6-dihydrazino-s-tetrazine (DHT). LAX-112 was once extensively studied as an insensitive explosive by Los Alamos; DHT is an example of a high-nitrogen explosive that relies entirely on its heat of formation for sustaining a detonation. Recent synthesis efforts have yielded an azo-s-tetrazine, 3,3'-azobis(6-amino-s-tetrazine) or DAAT, which has a very high positive heat of formation. The compounds, 4,4'-diamino-3,3'-azoxyfurazan (DAAF) and 4,4'-diamino-3,3'-azofurazan (DAAzF), may have important future roles in insensitive explosive applications. 
Neither DAAF nor DAAzF can be initiated by laboratory impact drop tests, yet both have in some aspects better explosive performances than 1,3,5-triamino-2,4,6-trinitrobenzene TATB--the standard of insensitive high explosives. The thermal stability of DAAzF is …
Date: March 1, 2001
Creator: CHAVEZ, D. et al.
Partner: UNT Libraries Government Documents Department
open access

THE ORIGIN OF ALL COSMIC RAYS: A SPACE-FILLING MECHANISM

Description: There is a need for one mechanism to accelerate cosmic rays universally over the full energy spectrum, isotropically, and space filling. The current view is a theory based upon a series of mechanisms, patched to fit various spectral regions, with a mechanism for the origin of the UHCRs still in doubt. We suggest that the reconnection of force-free magnetic fields produced by the twisting of all imbedded magnetic flux by the vorticity motion of all accretion or condensations, both within the Galaxy as well as the metagalaxy, is the universal mechanism. This leads to the acceleration of all cosmic rays with both total energy and individual energies up to the highest observed of 3 x 10{sup 20} eV, and predicts an upper limit of 10{sup 23} eV. There are three primary, and we believe compelling, reasons for adopting this different view of the origin of CRs. (1) The energy source is space filling and isotropic, thereby avoiding any anisotropies due to single sources, e.g., supernovae remnants and AGN. (2) The galactic and particularly the extragalactic energy source is sufficient to supply the full energy of a universal galactic and extragalactic spectrum of 10{sup 60} to 10{sup 61} ergs, sufficient to avoid the GZK cut-off. (3) Efficient E{sub parallel} acceleration from reconnection of force-free fields is well observed in the laboratory, whereas collisionless shock acceleration still eludes laboratory confirmation.
Date: March 1, 2001
Creator: COLGATE, S. & LI, H.
Partner: UNT Libraries Government Documents Department
open access

USING MULTITAIL NETWORKS IN HIGH PERFORMANCE CLUSTERS

Description: Using multiple independent networks (also known as rails) is an emerging technique to overcome bandwidth limitations and enhance fault-tolerance of current high-performance clusters. We present and analyze various avenues for exploiting multiple rails. Different rail access policies are presented and compared, including static and dynamic allocation schemes. An analytical lower bound on the number of networks required for static rail allocation is shown. We also present an extensive experimental comparison of the behavior of various allocation schemes in terms of bandwidth and latency. Striping messages over multiple rails can substantially reduce network latency, depending on average message size, network load and allocation scheme. The methods compared include a static rail allocation, a round-robin rail allocation, a dynamic allocation based on local knowledge, and a rail allocation that reserves both end-points of a message before sending it. The latter is shown to perform better than other methods at higher loads: up to 49% better than local-knowledge allocation and 37% better than the round-robin allocation. This allocation scheme also shows lower latency and saturates at higher loads (for sufficiently large messages). Most importantly, this proposed allocation scheme scales well with the number of rails and message sizes.
Date: March 1, 2001
Creator: COLL, S.; FRACHTEMBERG, E.; PETRINI, F.; HOISIE, A. & GURVITS, L.
Partner: UNT Libraries Government Documents Department
open access

SOFTWARE TOOLS THAT ADDRESS HAZARDOUS MATERIAL ISSUES DURING NUCLEAR FACILITY D and D

Description: The 49-year-old Chemistry and Metallurgy Research (CMR) Facility is where analytical chemistry and metallurgical studies on samples of plutonium and nuclear materials are conducted in support of the Department of Energy's nuclear weapons program. The CMR Facility is expected to be decontaminated and decommissioned (D and D) over the next ten to twenty years. Over the decades, several hazardous material issues have developed that need to be addressed. Unstable chemicals must be properly reassigned or disposed of from the workspace during D and D operations. Materials that have critical effects that are primarily chronic in nature, carcinogens, reproductive toxins, and materials that exhibit high chronic toxicity have unique decontamination requirements, including the decontrolling of areas where these chemicals were used. Certain types of equipment and materials that contain mercury, asbestos, lead, and polychlorinated biphenyls have special provisions that must be addressed. Utilization of commercially available software programs for addressing hazardous material issues during D and D operations, such as legacy chemicals and documentation, is presented. These user-friendly programs eliminate part of the tediousness associated with the complex requirements of legacy hazardous materials. A key element of this approach is having a program that inventories and tracks all hazardous materials. Without an inventory of chemicals stored in a particular location, many important questions pertinent to D and D operations can be difficult to answer. On the other hand, a well-managed inventory system can address unstable and highly toxic chemicals and hazardous material records concerns before they become an issue. Tapping into the institutional database provides a way to take advantage of the combined expertise of the institution in managing a cost effective D and D program as well as adding a quality assurance element to the program. 
Using laboratory requirements as a logic flow diagram, quality and cost effective methods are …
Date: March 1, 2001
Creator: COURNOYER, M. & GRUNDEMANN, R.
Partner: UNT Libraries Government Documents Department
open access

Femtosecond x-rays from Thomson scattering using laser wakefield accelerators

Description: The possibility of producing femtosecond x-rays through Thomson scattering high power laser beams off laser wakefield generated relativistic electron beams is discussed. The electron beams are produced with either a self-modulated laser wakefield accelerator (SM-LWFA) or through a standard laser wakefield accelerator (LWFA) with optical injection. For a SM-LWFA (LWFA) produced electron beam, a broad (narrow) energy distribution is assumed, resulting in X-ray spectra that are broadband (monochromatic). Designs are presented for 3-100 fs x-ray pulses and the expected flux and brightness of these sources are compared.
Date: March 1, 2001
Creator: Catravas, P.; Esarey, E. & Leemans, W. P.
Partner: UNT Libraries Government Documents Department
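For context, the frequency upshift that lets such sources reach x-ray energies is the standard relativistic Thomson backscatter relation (not quoted in the abstract; stated here under the usual assumptions):

```latex
% Head-on backscatter, laser strength a_0 << 1, electron recoil neglected:
\hbar\omega_x \simeq 4\gamma^2\,\hbar\omega_L
```

As a rough example, electrons with gamma of about 100 (roughly 51 MeV) scattering 1.5 eV (800 nm) laser photons yield photons near 60 keV; the broadband versus monochromatic spectra discussed above then follow directly from the electron beam's energy spread entering through the gamma-squared factor.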
open access

Washington Phase II Fish Diversion Screen Evaluations in the Yakima River Basin, 2000.

Description: Pacific Northwest National Laboratory (PNNL) evaluated 21 Phase II screen sites in the Yakima River Basin as part of a multi-year study for the Bonneville Power Administration (BPA) on the effectiveness of fish screening devices. The sites were examined in 1997, 1998, 1999, and 2000 to determine if they were being effectively operated and maintained to provide fish a safe, efficient return to the Yakima River. Data were collected to determine if velocities in front of the screens and in the bypass met current National Marine Fisheries Service (NMFS) criteria to promote safe and timely fish bypass and whether bypass outfall conditions allowed fish to safely return to the river. Based on the results of our studies in 2000, we conclude that: in general, water velocity conditions at the screen sites met fish passage criteria set forth by the NMFS; most facilities were efficiently protecting juvenile fish from entrainment, impingement, or migration delay; automated cleaning brushes generally functioned properly; chains and other moving parts were well greased and operational; and removal of sediment build-up and accumulated leafy and woody debris are areas that continue to improve.
Date: March 1, 2001
Creator: Chamness, M. A.
Partner: UNT Libraries Government Documents Department
open access

Effects of buoyancy on the flowfields of lean premixed turbulent v-flames

Description: Open laboratory turbulent flames used for investigating fundamental flame-turbulence interactions are greatly affected by buoyancy. Though much of our current knowledge is based on observations made in these open flames, the effects of buoyancy are usually not included in data interpretation, numerical analysis or theories. This inconsistency remains an obstacle to merging experimental observations and theoretical predictions. To better understand the effects of buoyancy, our research focuses on steady lean premixed flames propagating in fully developed turbulence. We hypothesize that the most significant role of buoyancy forces on these flames is to influence their flowfields through a coupling with mean and fluctuating pressure fields. Changes in flow pattern alter the mean aerodynamic stretch and in turn affect turbulence fluctuation intensities both upstream and downstream of the flame zone. Consequently, flame stabilization, reaction rates, and turbulent flame processes are all affected. This coupling relates to the elliptical problem that emphasizes the importance of the upstream, wall and downstream boundary conditions in determining all aspects of flame propagation. Therefore, buoyancy has the same significance as other parameters such as flow configuration, flame geometry, means of flame stabilization, flame shape, enclosure size, mixture conditions, and flow conditions.
Date: March 1, 2001
Creator: Cheng, R. K.; Bedat, B.; Yegian, D. T. & Greenberg, P.
Partner: UNT Libraries Government Documents Department
open access

Wigwam River Juvenile Bull Trout and Fish Habitat Monitoring Program : 2000 Data Report.

Description: The Wigwam River bull trout (Salvelinus confluentus) and fish habitat monitoring program is a trans-boundary initiative implemented by the British Columbia Ministry of Environment, Lands and Parks (MOE), in cooperation with Bonneville Power Administration (BPA). The Wigwam River is an important fisheries stream located in southeastern British Columbia that supports healthy populations of both bull trout and Westslope cutthroat trout (Figure 1.1). This river has been characterized as the single most important bull trout spawning stream in the Kootenay Region (Baxter and Westover 2000, Cope 1998). In addition, the Wigwam River supports some of the largest Westslope cutthroat trout (Oncorhynchus clarki lewisi) in the Kootenay Region. These fish are highly sought after by anglers (Westover 1999a, 1999b). Bull trout populations have declined in many areas of their range within Montana and throughout the northwest including British Columbia. Bull trout were blue listed as vulnerable in British Columbia by the B.C. Conservation Data Center (Cannings 1993) and although there are many healthy populations of bull trout in the East Kootenays they remain a species of special concern. Bull trout in the United States portion of the Columbia River were listed as threatened in 1998 under the Endangered Species Act by the U.S. Fish and Wildlife Service. The upper Kootenay River is within the Kootenai sub-basin of the Mountain Columbia Province, one of the eleven Eco-provinces that make up the Columbia River Basin. MOE applied for and received funding from BPA to assess and monitor the status of wild, native stocks of bull trout in tributaries to Lake Koocanusa (Libby Reservoir) and the upper Kootenay River. This task is one of many that was undertaken to ''Monitor and Protect Bull Trout for Koocanusa Reservoir'' (BPA Project Number 2000-04-00).
Date: March 1, 2001
Creator: Cope, R. S. & Morris, K. J.
Partner: UNT Libraries Government Documents Department
open access

Predicting the emission rate of volatile organic compounds from vinyl flooring

Description: A model for predicting the rate at which a volatile organic compound (VOC) is emitted from a diffusion-controlled material is validated for three contaminants (n-pentadecane, n-tetradecane, and phenol) found in vinyl flooring (VF). Model parameters are the initial VOC concentration in the material-phase (C{sub 0}), the material/air partition coefficient (K), and the material-phase diffusion coefficient (D). The model was verified by comparing predicted gas-phase concentrations to data obtained during small-scale chamber tests, and by comparing predicted material-phase concentrations to those measured at the conclusion of the chamber tests. Chamber tests were conducted with the VF placed top side up and bottom side up. With the exception of phenol, and within the limits of experimental precision, the mass of VOCs recovered in the gas phase balances the mass emitted from the material phase. The model parameters (C{sub 0}, K, and D) were measured using procedures that were completely independent of the chamber test. Gas- and material-phase predictions compare well to the bottom-side-up chamber data. The lower emission rates for the top-side-up orientation may be explained by the presence of a low-permeability surface layer. The sink effect of the stainless steel chamber surface was shown to be negligible.
Date: March 1, 2001
Creator: Cox, Steven S.; Little, John C. & Hodgson, Alfred T.
Partner: UNT Libraries Government Documents Department
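The early-time behavior of a diffusion-controlled source like this can be sketched numerically. This is a deliberate simplification of the model validated above: it assumes the chamber holds the exposed surface near zero concentration and ignores K and the finite slab thickness, so it only captures the short-time limit. Function names and parameter values are illustrative:

```python
import math

def emission_rate(C0, D, t):
    """Early-time flux from a slab whose exposed face is held at zero
    concentration (semi-infinite approximation):
    E(t) = C0 * sqrt(D / (pi * t))."""
    return C0 * math.sqrt(D / (math.pi * t))

def mass_emitted(C0, D, t):
    """Time integral of emission_rate: M(t) = 2 * C0 * sqrt(D * t / pi)."""
    return 2.0 * C0 * math.sqrt(D * t / math.pi)
```

The square-root-of-time growth of the emitted mass is the signature of diffusion control; fitting it to early chamber data is one way to back out D when C0 and K are measured independently, as the authors do.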