Search Results

Ensemble Single Column Modeling in the Tropics - Derivation of observed forcing data sets, estimation of observation uncertainty and application to parametrization improvements

Description: The project was carried out in two distinct phases. In the first phase we established the general validity of using an ensemble approach to Single Column Modeling (SCM) using the Manus and Nauru sites. In the second phase we derived and applied an ensemble forcing derivation technique to observations: we first derived an ensemble forcing estimate for the TWP-ICE experiment and then extended the method to provide three wet seasons of “continuous ensemble forcing” for the Darwin site. The main purpose of using ensemble techniques in SCM simulation is to assess how much of the overall SCM simulation error is due to model errors and how much is due to errors in the forcing.
Date: July 6, 2012
Creator: Jakob, Christian
Partner: UNT Libraries Government Documents Department

Collaborative Proposal: Transforming How Climate System Models are Used: A Global, Multi-Resolution Approach

Description: Despite the great interest in regional modeling for both weather and climate applications, regional modeling is not yet at the stage that it can be used routinely and effectively for climate modeling of the ocean. The overarching goal of this project is to transform how climate models are used by developing and implementing a robust, efficient, and accurate global approach to regional ocean modeling. To achieve this goal, we will use theoretical and computational means to resolve several basic modeling and algorithmic issues. The first task is to develop techniques for transitioning between parameterized and high-fidelity regional ocean models as the discretization grid transitions from coarse to fine regions. The second task is to develop estimates for the error in scientifically relevant quantities of interest that provide a systematic way to automatically determine where refinement is needed in order to obtain accurate simulations of dynamic and tracer transport in regional ocean models. The third task is to develop efficient, accurate, and robust time-stepping schemes for variable spatial resolution discretizations used in regional ocean models of dynamics and tracer transport. The fourth task is to develop frequency-dependent eddy viscosity finite element and discontinuous Galerkin methods and study their performance and effectiveness for simulation of dynamics and tracer transport in regional ocean models. These four tasks share common difficulties and will be approached using a common computational and mathematical toolbox. This is a multidisciplinary project involving faculty and postdocs from Colorado State University, Florida State University, and Penn State University along with scientists from Los Alamos National Laboratory.
The completion of the tasks listed within the discussion of the four sub-projects will go a long way towards meeting our goal of developing superior regional ocean models that will transform how climate system models are used.
Date: April 15, 2013
Creator: Estep, Donald
Partner: UNT Libraries Government Documents Department

Final Technical Report [Carbon Data Assimilation with a Coupled Ensemble Kalman Filter]

Description: We proposed (and accomplished) the development of an Ensemble Kalman Filter (EnKF) approach for the estimation of surface carbon fluxes as if they were parameters, augmenting the model state with them. Our system is quite different from previous approaches, such as carbon flux inversions, 4D-Var, and EnKF with approximate background error covariance (Peters et al., 2008). We showed (using observing system simulation experiments, OSSEs) that these differences lead to a more accurate estimation of the evolving surface carbon fluxes at model grid-scale resolution. The main properties of the LETKF-C are: a) the carbon cycle LETKF is coupled with the simultaneous assimilation of the standard atmospheric variables, so that the ensemble wind transport of the CO2 provides an estimation of the carbon transport uncertainty; b) the use of an assimilation window (6 hr) much shorter than the months-long windows used in other methods, which avoids the inevitable “blurring” of the signal that takes place in long windows due to turbulent mixing, since the CO2 does not have time to mix before the next window. In this development we introduced new, advanced techniques that have since been adopted by the EnKF community (Kang, 2009; Kang et al., 2011; Kang et al., 2012). These advances include “variable localization”, which reduces sampling errors in the estimation of the forecast error covariance; more advanced adaptive multiplicative and additive inflations; and vertical localization based on the time scale of the processes. The main result has been obtained using the LETKF-C with all these advances, and assimilating simulated atmospheric CO2 observations from different observing systems (surface flask observations of CO2 but no surface carbon flux observations, total column CO2 from GOSAT/OCO-2, and upper-troposphere AIRS retrievals).
After a spin-up of about one month, the LETKF-C succeeded in reconstructing the true evolving surface fluxes of carbon at a model grid ...
Date: August 30, 2013
Creator: Kalnay, Eugenia
Partner: UNT Libraries Government Documents Department
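The state-augmentation idea in the abstract above (treating surface fluxes as extra parameters appended to the model state) can be illustrated with a deliberately tiny toy, far simpler than the LETKF-C: a scalar tracer driven by an unknown constant flux, estimated with a stochastic (perturbed-observation) EnKF. All numbers and variable names here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_ens, n_steps, dt = 50, 200, 0.1
b_true = 2.0                      # "true" constant flux we want to recover
obs_err = 0.5                     # observation error std dev

# Augmented ensemble state: [tracer concentration x, flux parameter b]
ens = np.zeros((n_ens, 2))
ens[:, 1] = rng.normal(0.0, 1.0, n_ens)    # broad prior on the flux

x_true = 0.0
for k in range(n_steps):
    # Forecast: propagate each member; the parameter persists unchanged
    ens[:, 0] += ens[:, 1] * dt
    x_true += b_true * dt
    y = x_true + rng.normal(0.0, obs_err)  # simulated observation of x only

    # Analysis: Kalman gain from the ensemble sample covariance
    X = ens - ens.mean(axis=0)
    P = X.T @ X / (n_ens - 1)              # 2x2 sample covariance
    H = np.array([[1.0, 0.0]])             # we observe x, never b
    S = H @ P @ H.T + obs_err**2
    K = (P @ H.T) / S                      # (2,1) gain
    # Perturbed-observation update (stochastic EnKF)
    y_pert = y + rng.normal(0.0, obs_err, n_ens)
    innov = y_pert - ens[:, 0]
    ens += (K @ innov[None, :]).T

b_est = ens[:, 1].mean()
```

Because the tracer is the time integral of the flux, the ensemble cross-covariance between the observed tracer and the unobserved flux lets the filter recover the flux from tracer observations alone, which is the same mechanism the report exploits at grid scale.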

Three-Dimensional Imaging and Quantification of Biomass and Biofilms in Porous Media

Description: A new method to resolve biofilms in three dimensions in porous media using high-resolution synchrotron-based x-ray computed microtomography (CMT) has been developed. Imaging biofilms in porous media without disturbing the natural spatial arrangement of the porous media and associated biofilm has been a challenging task, primarily because porous media generally preclude conventional imaging via optical microscopy; x-ray tomography offers a potential alternative. One challenge for this method is that most conventional x-ray contrast agents are water-soluble and easily diffuse into biofilms. To overcome this problem, silver-coated microspheres were added to the fluid phase to create an x-ray contrast agent that does not diffuse into the biofilm mass. Using this approach, biofilm imaging in porous media was accomplished with sufficient contrast to differentiate between the biomass- and fluid-filled pore spaces. The method was validated using a two-dimensional micro-model flow cell in which both light microscopy and CMT imaging were used to image the biofilm. The results of this work have been published in Water Resources Research (Iltis et al., 2010). Additional work needs to be done to optimize this imaging approach; specifically, we find that the quality of the images is highly dependent on the coverage of the biofilm with Ag particles, which means that we may have issues in dead-end pore space and for very low density (fluffy) biofilms. What we can image for certain with this technique is the biofilm surface that is well-connected to flow paths and thus well-supplied with nutrients.
Date: October 10, 2012
Creator: Wildenschild, Dorthe
Partner: UNT Libraries Government Documents Department

ARM Climate Research Facility Quarterly Value-Added Product Report Third Quarter: April 01–June 30, 2011

Description: The purpose of this report is to provide a concise status update for value-added products (VAPs) implemented by the Atmospheric Radiation Measurement Climate Research Facility. The report is divided into the following sections: (1) new VAPs for which development has begun, (2) progress on existing VAPs, (3) future VAPs that have been recently approved, (4) other work that leads to a VAP, and (5) top requested VAPs from the archive.
Date: August 18, 2011
Creator: Sivaraman, C
Partner: UNT Libraries Government Documents Department

Atmosphere-Land-Surface Interaction over the Southern Great Plains: Diagnosis of Mechanisms from SGP ARM Data

Description: Work reported included analysis of pentad (5 day) averaged data, proposal of a hypothesis concerning the key role of the Atlantic Multi-decadal Oscillation in 20th century drought and wet periods over the Great Plains, analysis of recurrent super-synoptic evolution of the Great Plains low-level jet, and study of pentad evolution of the 1988 drought and 1993 flood over the Great Plains from a NARR perspective on the atmospheric and terrestrial water balance.
Date: February 1, 2013
Creator: Nigam, Sumant
Partner: UNT Libraries Government Documents Department

Final Report

Description: The goal of this project was to quantify organic aerosol precursor concentrations in an urban environment and to measure suitable organic photoproduct species that can act as tracers of photochemical processing, in order to identify the occurrence and rate of secondary organic aerosol formation. Field measurements were made as part of the ASR field program Carbonaceous Aerosols and Radiative Effects Study (CARES) in June 2010. What is new in our approach is the measurement of the total concentration of long-chain alkanes (>C10) and heavier alkyl-substituted aromatics associated with diesel exhaust gas-phase organic compound emissions. A method to measure these so-called intermediate volatility organic compounds (IVOCs) was developed by modifying a proton transfer reaction mass spectrometer to perform both volatile organic compound (VOC) and IVOC analysis by thermal desorption from a Tenax adsorbent trap (TD-PTR-MS). Lab and field results show that the TD-PTR-MS technique can measure long-chain alkanes associated with diesel engine emissions and thus provides a novel means to measure these compounds to better understand the impact of vehicle emissions on secondary organic aerosol formation.
Date: August 20, 2012
Partner: UNT Libraries Government Documents Department

Clinical trials of boron neutron capture therapy [in humans] [at Beth Israel Deaconess Medical Center][at Brookhaven National Laboratory]

Description: Assessment of research records of Boron Neutron Capture Therapy was conducted at Brookhaven National Laboratory and Beth Israel Deaconess Medical Center using the Code of Federal Regulations, FDA Regulations and Good Clinical Practice Guidelines. Clinical data were collected from subjects' research charts, and differences in conduct of studies at both centers were examined. Records maintained at Brookhaven National Laboratory were not in compliance with regulatory standards. Beth Israel's records followed federal regulations. Deficiencies discovered at both sites are discussed in the reports.
Date: May 29, 2001
Creator: Wallace, Christine
Partner: UNT Libraries Government Documents Department

India's pulp and paper industry: Productivity and energy efficiency

Description: Historical estimates of productivity growth in India's pulp and paper sector vary from indicating an improvement to indicating a decline in the sector's productivity. The variance may be traced to the time period of study, the source of data for analysis, and the type of indices and econometric specifications used for reporting productivity growth. The authors derive both statistical and econometric estimates of productivity growth for this sector. Their results show that productivity declined over the observed period, from 1973-74 to 1993-94, by 1.1% p.a. Using a translog specification, the econometric analysis reveals that technical progress in India's pulp and paper sector has been biased towards the use of energy and material, while it has been capital and labor saving. The decline in productivity was caused largely by the protection afforded by high tariffs on imported paper products and other policies, which allowed inefficient, small plants to enter the market and flourish. Will these trends continue into the future, particularly where energy use is concerned? The authors examine the changes in structure and energy efficiency currently underway in the sector. Their analysis shows that with liberalization of the sector and tighter environmental controls, the industry is moving towards higher efficiency and productivity. However, the analysis also shows that because these improvements are hampered by significant financial and other barriers, the industry may still have a long way to go.
Date: July 1, 1999
Creator: Schumacher, Katja
Partner: UNT Libraries Government Documents Department
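The "biased technical progress" finding in the abstract above has a standard algebraic reading. As a sketch only (this is the textbook translog form with a time trend, not necessarily the exact specification the authors estimated), with inputs $x_i \in \{K, L, E, M\}$ (capital, labor, energy, materials) and a time trend $t$ capturing technical change:

```latex
\ln Y = \alpha_0 + \sum_i \alpha_i \ln x_i
      + \frac{1}{2}\sum_i \sum_j \beta_{ij}\,\ln x_i \ln x_j
      + \gamma_t\, t + \frac{1}{2}\gamma_{tt}\, t^2
      + \sum_i \gamma_{it}\, t \ln x_i
```

Since the share of input $i$, $s_i = \partial \ln Y / \partial \ln x_i$, drifts over time at rate $\gamma_{it}$, technical progress is called input-$i$-using when $\gamma_{it} > 0$ and input-$i$-saving when $\gamma_{it} < 0$; the reported result corresponds to positive $\gamma_{Et}$ and $\gamma_{Mt}$ alongside negative $\gamma_{Kt}$ and $\gamma_{Lt}$.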

The NABIR Strategic Plan 2001

Description: Over more than 50 years, the U.S. created a vast network of more than 113 facilities for research, development, and testing of nuclear materials. As a result of these activities, subsurface contamination has been identified at over 7,000 discrete sites across the U.S. Department of Energy (DOE) complex. With the end of the Cold War threat, the DOE has shifted its emphasis to remediation, decommissioning, and decontamination of the immense volumes of contaminated groundwater, sediments, and structures at its sites. DOE is currently responsible for remediating 1.7 trillion gallons of contaminated groundwater, an amount equal to approximately four times the daily U.S. water consumption, and 40 million cubic meters of contaminated soil, enough to fill approximately 17 professional sports stadiums. It is estimated that more than 60% of DOE facilities have groundwater contaminated with metals or radionuclides. The only contaminants that appear more often than metals and radionuclides in groundwater are chlorinated hydrocarbons. More than 50% of all soil and sediments at DOE facilities are contaminated with metals and radionuclides, the contaminants found with the highest frequency in soil at all DOE waste sites. Indeed, while virtually all of the contaminants found at industrial sites nationwide can also be found at DOE sites, many of the metals and especially the radionuclides found on DOE sites are unique to those sites. The current technology for treatment of groundwater contaminated with metals and/or radionuclides is “pump and treat,” followed by disposal or reinjection of treated water. This process can be costly and inefficient due to the difficulty of completely removing the contaminated groundwater and to sorption of contaminants on mineral surfaces.
DOE's Office of Environmental Management (EM), which is responsible for the cleanup, has stated that advances in science and technology are critical for DOE to reduce costs and successfully address these long-term ...
Date: October 22, 2001
Partner: UNT Libraries Government Documents Department

Comparative Genomics and Evolution of Eukaryotic Phospholipid biosynthesis

Description: Phospholipid biosynthetic enzymes produce diverse molecular structures and are often present in multiple forms encoded by different genes. This work utilizes comparative genomics and phylogenetics to explore the distribution, structure, and evolution of phospholipid biosynthetic genes and pathways in 26 eukaryotic genomes. Although the basic structure of the pathways was formed early in eukaryotic evolution, the emerging picture indicates that individual enzyme families followed unique evolutionary courses. For example, choline and ethanolamine kinases and cytidylyltransferases emerged in ancestral eukaryotes, whereas multiple forms of the corresponding phosphatidyltransferases evolved mainly in a lineage-specific manner. Furthermore, several unicellular eukaryotes maintain bacterial-type enzymes and reactions for the synthesis of phosphatidylglycerol and cardiolipin. Also, base-exchange phosphatidylserine synthases are widespread and ancestral enzymes. The multiplicity of phospholipid biosynthetic enzymes has been largely generated by gene expansion in a lineage-specific manner. Thus, these observations suggest that phospholipid biosynthesis has been an actively evolving system. Finally, comparative genomic analysis indicates the existence of novel phosphatidyltransferases and provides a candidate for the uncharacterized eukaryotic phosphatidylglycerol phosphate phosphatase.
Date: December 1, 2006
Creator: Lykidis, Athanasios
Partner: UNT Libraries Government Documents Department

Introduction to special section on Hydrologic Synthesis

Description: The Hydrological Synthesis special section presents synthesis topics that have the potential to revolutionize hydrological sciences in a manner needed to meet the critical water challenges that we now face. The special section also highlights topics that are important and exciting enough to compel researchers to engage in collaborative synthesis activities. This introductory paper provides a brief overview of nine papers that are included in this special section, which discuss the synthesis of tools, data, concepts, theories, or approaches across disciplines and scales. The wide range of topics explored includes groundwater quality, river restoration, water management, nitrogen cycling, and Earth surface dynamics. Collectively, the special section papers illustrate that the challenge to deal effectively with complex water problems is not purely a scientific, technological, or socioeconomic one; it is instead a complex, 21st century problem that requires coordinated synthesis.
Date: January 23, 2006
Creator: Hubbard, Susan
Partner: UNT Libraries Government Documents Department

Global land use data for integrated assessment modeling

Description: Changes in land use and land cover have been one of the major drivers of global change over the last three centuries. Detailed spatially-explicit data sets characterizing these historical land cover changes are now emerging. By synthesizing remotely-sensed land cover classification data sets with historical land use census data, our research group has developed comprehensive databases of historical land use and land cover change. Moreover, we are building estimates of the land suitability for agriculture to predict the constraints on future land use. In this project, we have interacted with the Global Trade and Analysis Project (GTAP) at Purdue University, to adapt our land use data for use with the GTAP database, a baseline database widely used by the integrated assessment modeling community. Moreover, we have developed an interactive website for providing these newly emerging land use data products for the integrated assessment (IA) community and to the climate modeling community.
Date: December 12, 2005
Creator: Ramankutty, Navin
Partner: UNT Libraries Government Documents Department

Selected Translated Abstracts of Chinese-Language Climate Change Publications

Description: This report contains English-translated abstracts of important Chinese-language literature concerning global climate change for the years 1995-1998. This body of literature includes the topics of adaptation, ancient climate change, climate variation, the East Asia monsoon, historical climate change, impacts, modeling, and radiation and trace-gas emissions. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Chinese. Author and title indexes are included to assist the reader in locating abstracts of particular interest.
Date: May 1, 1999
Creator: Cushman, R.M. & Burtis, M.D.
Partner: UNT Libraries Government Documents Department

Bioremediation of Metals and Radionuclides: What It Is and How It Works (2nd Edition)

Description: This primer is intended for people interested in environmental problems of the U.S. Department of Energy (DOE) and in their potential solutions. It specifically looks at some of the more hazardous metal and radionuclide contaminants found on DOE lands and at the possibilities for using bioremediation technology to clean up these contaminants. The second edition of the primer incorporates recent findings by researchers in DOE's Natural and Accelerated Bioremediation Research (NABIR) Program. Bioremediation is a technology that can be used to reduce, eliminate, or contain hazardous waste. Over the past two decades, it has become widely accepted that microorganisms, and to a lesser extent plants, can transform and degrade many types of contaminants. These transformation and degradation processes vary, depending on the physical-chemical environment, microbial communities, and nature of the contaminant. This technology includes intrinsic bioremediation, which relies on naturally occurring processes, and accelerated bioremediation, which enhances microbial degradation or transformation through the addition of nutrients (biostimulation) or inoculation with microorganisms (bioaugmentation). Over the past few years, interest in bioremediation has increased. It has become clear that many organic contaminants such as hydrocarbon fuels can be degraded to relatively harmless products such as CO2 (the end result of the degradation process). Wastewater managers and scientists have also found that microorganisms can interact with metals and convert them from one chemical form to another. Laboratory tests and ex situ bioremediation applications have shown that microorganisms can change the valence, or oxidation state, of some heavy metals (e.g., chromium and mercury) and radionuclides (e.g., uranium) by using them as electron acceptors.
In some cases, the solubility of the altered species decreases and the contaminant is immobilized in situ, i.e., precipitated into an insoluble salt in the sediment. In other cases, the opposite occurs--the solubility of the altered species increases, ...
Date: September 30, 2003
Creator: Palmisano, Anna & Hazen, Terry
Partner: UNT Libraries Government Documents Department

ZioLib: A parallel I/O library

Description: In a distributed-memory parallel environment, many applications rely on a serial I/O strategy, where the global array is gathered on a single MPI process and then written out to a file. I/O performance with this approach is largely limited by single-process I/O bandwidth. Even when parallel I/O is used, satisfactory parallel scaling is not always observed. This is because, in many applications, fields are not in the most favorable parallel decomposition for I/O. The best I/O rates are obtained when a field is decomposed with respect to the array's last dimension (referred to here as Z). Another situation often encountered is that a field in CPU-resident memory is in one index order but must be stored in a disk file in another order. Changing index orders can complicate a parallel I/O implementation and slow down I/O. ZioLib facilitates efficient parallel I/O for arrays in such situations. In the case of a write, ZioLib remaps a distributed field into a Z-decomposition on a subset of processes (the I/O staging processes) and from there writes to a disk file in parallel. In this Z-decomposition, the data layout of the remapped array in the staging processes' memory is the same as on disk, so only block data transfer occurs during parallel I/O, achieving maximum efficiency. In the case of a read, the steps are reversed to build the required distributed arrays on the computational processes.
Date: August 1, 2003
Creator: Yang, Woo-Sun & Ding, Chris
Partner: UNT Libraries Government Documents Department
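The write-side remap described above can be mimicked in a serial sketch (plain NumPy arrays standing in for MPI processes; the array sizes and process counts are invented). The point it illustrates is that once the field is split along the last (Z) dimension, each staging process's piece is one contiguous block of the file in column-major storage order:

```python
import numpy as np

# Toy stand-in for an MPI run: each "process" owns an x-slab of a 3D field.
nx, ny, nz = 8, 6, 4
n_proc, n_stage = 4, 2            # compute processes, I/O staging processes

field = np.arange(nx * ny * nz, dtype=np.float64).reshape(nx, ny, nz)

# Compute-side decomposition: x-slabs (strided, unfavorable for z-last files)
x_slabs = np.array_split(field, n_proc, axis=0)

# "Remap" step: gather the x-slabs and re-split along the last (Z) dimension,
# mimicking ZioLib's redistribution onto the staging processes.
gathered = np.concatenate(x_slabs, axis=0)
z_slabs = np.array_split(gathered, n_stage, axis=2)

# In Fortran (column-major) storage order, each z-slab occupies one
# contiguous block of the file, so each staging process can issue a single
# block write with no strided access.
file_bytes = field.ravel(order="F")   # the file's on-disk layout
written = np.empty_like(file_bytes)
offset = 0
for slab in z_slabs:
    block = slab.ravel(order="F")     # contiguous block for this stager
    written[offset:offset + block.size] = block
    offset += block.size
```

Reassembling the per-stager blocks in order reproduces the file layout exactly, which is why only block transfers are needed during the parallel write.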

Monitoring microbe-induced physical property changes using high-frequency acoustic waveform data: Toward the development of a microbial megascope

Description: A laboratory investigation was undertaken to determine the effect of microbe-generated gas bubbles in controlled, saturated sediment columns, utilizing a novel technique involving acoustic wave propagation. Specifically, the effect of denitrifying bacteria on saturated flow conditions was evaluated in light of the stimulated production of N2 gas and the resulting plugging of the pore throats. The propagation of high-frequency acoustic waves through the sediment columns was used to locate those regions in the column where gas accumulation occurred. Over a period of six weeks, regions of gas accumulation resulted in the attenuation of acoustic wave energies, with decreases in amplitude typically greater than one order of magnitude.
Date: May 20, 2002
Creator: Williams, Kenneth Hurst
Partner: UNT Libraries Government Documents Department

MPH: A library for distributed multi-component environment

Description: Many current large and complex HPC applications are based on semi-independent program components developed by different groups or for different purposes. On distributed-memory parallel supercomputers, performing component-name registration and initializing communications between independent components are among the first critical steps in establishing a distributed multi-component environment. Here we describe MPH, a multi-component handshaking library that resolves these tasks in a convenient and consistent way. MPH uses MPI for high performance and supports much PVM functionality. It supports two major parallel integration mechanisms: multi-component multi-executable (MCME) and multi-component single-executable (MCSE). It is a simple, easy-to-use module for developing practical codes, or as a basis for larger software tools/frameworks.
Date: June 1, 2001
Creator: Ding, Chris H.Q. & He, Yun
Partner: UNT Libraries Government Documents Department
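As a loose, single-process analogy of the component-name registration step described above (the class and method names here are invented for illustration, not MPH's actual interface), one can picture a registry that maps each component name to the ranks that registered under it; a handshaking library builds per-component sub-communicators from exactly this kind of mapping:

```python
from collections import defaultdict

class ComponentRegistry:
    """Toy stand-in for component-name registration on a multi-component run."""

    def __init__(self):
        self._members = defaultdict(list)

    def register(self, rank, component):
        # Called once per rank during startup (the handshaking step).
        self._members[component].append(rank)

    def group(self, component):
        # Ranks belonging to one component (basis for a sub-communicator).
        return sorted(self._members[component])

    def components(self):
        return sorted(self._members)

reg = ComponentRegistry()
# Ranks 0-3 run an "ocean" component, ranks 3-5 an "atm" component;
# overlapping components on the same rank are allowed, as in MPH.
for rank, comp in [(0, "ocean"), (1, "ocean"), (2, "ocean"), (3, "ocean"),
                   (4, "atm"), (5, "atm"), (3, "atm")]:
    reg.register(rank, comp)
```

In a real MPI setting the `group()` output would feed something like `MPI_Comm_split` to create the per-component communicators used for inter-component messaging.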

Multi Program-Components Handshaking (MPH) Utility Version 3 User's Manual

Description: MPH version 2 combines all features of MPH version 1, unifies the interfaces, and provides more flexible component integration/execution modes. In a distributed multi-component environment, each executable resides on a set of SMP nodes. Components within an executable may overlap on different nodes or processors. MPH version 2 contains the following functionality: component-name registration; resource allocation; multi-component single-executable, multi-component multi-executable, and related modes; inter-component communication; inquiry on the multi-component environment; and standard in/out redirection.
Date: June 20, 2002
Creator: He, Yun (Helen) & Ding, Chris H.Q.
Partner: UNT Libraries Government Documents Department

DOE-NABIR PI Workshop: Abstracts January 31-February 2, 2000

Description: The mission of the NABIR program is to provide the scientific understanding needed to use natural processes, and to develop new methods to accelerate those processes, for the bioremediation of contaminated soils, sediments, and groundwater at U.S. Department of Energy (DOE) facilities. The program is implemented through seven interrelated scientific research elements (Assessment; Bacterial Transport; Biogeochemical Dynamics; Biomolecular Science and Engineering; Biotransformation and Biodegradation; Community Dynamics/Microbial Ecology; and System Engineering, Integration, Prediction and Optimization), and through an element called Bioremediation and its Societal Implications and Concerns (BASIC), which addresses societal issues and concerns of stakeholders through communication and collaboration among all relevant groups, including community leaders and representatives, engineers, scientists, lawyers, etc. The initial emphasis of NABIR program research is on the bioremediation of metals and radionuclides in the subsurface below the root zone, including both thick vadose and saturated zones. The material presented at this year's workshop focuses on research funded in FY 1998-2000 by DOE's Office of Science through its Office of Biological and Environmental Research. Sixty-eight projects have been funded in the scientific program elements, and two have been funded in the BASIC program. Abstracts of these programs are summarized in this booklet, along with abstracts of other DOE programs related to research in the NABIR program.
Date: January 1, 2000
Creator: Pratt, Mary (ed.)
Partner: UNT Libraries Government Documents Department

A ghost cell expansion method for reducing communications in solving PDE problems

Description: In solving partial differential equations, such as the barotropic equations in ocean models, on distributed-memory computers, finite difference methods are commonly used. Most often, processor subdomain boundaries must be updated at each time step. This boundary update process involves many messages of small sizes, and therefore large communication overhead. Here we propose a new approach that expands the ghost cell layers and thus updates boundaries much less frequently, reducing total message volume and grouping small messages into bigger ones. Together with a technique for eliminating diagonal communications, the method speeds up communication substantially, by up to 170%. We explain the method and implementation in detail, and provide systematic timing results and performance analysis on the Cray T3E and IBM SP.
Date: May 1, 2001
Creator: Ding, Chris H.Q. & He, Yun
Partner: UNT Libraries Government Documents Department
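The trade-off described above, wider ghost layers in exchange for fewer and larger messages, can be sketched serially. This toy 1D diffusion example with two subdomains only illustrates the idea (it is not the paper's Cray T3E/IBM SP implementation, and all sizes are invented): with a ghost width of g cells, boundaries need refreshing only every g steps, because the stencil's dependency cone grows by one cell per step.

```python
import numpy as np

def step(u):
    # One explicit diffusion step; the output loses one cell at each end.
    return u[1:-1] + 0.25 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

n, n_steps, g = 16, 8, 4          # g = ghost layer width
u0 = np.sin(2 * np.pi * np.arange(n) / n)

# Reference: the whole periodic domain advanced on one "process".
ref = u0.copy()
for _ in range(n_steps):
    ref = step(np.concatenate(([ref[-1]], ref, [ref[0]])))

# Two subdomains, each padded with g ghost cells per side. Ghost data is
# refreshed only every g steps (one exchange of g cells instead of g
# exchanges of 1 cell); between exchanges the padded arrays simply shrink.
left, right = u0[:n // 2].copy(), u0[n // 2:].copy()
exchanges = 0
for k in range(n_steps):
    s = k % g
    if s == 0:                    # the only communication step
        gl = np.concatenate((right[-g:], left, right[:g]))
        gr = np.concatenate((left[-g:], right, left[:g]))
        exchanges += 1
    gl, gr = step(gl), step(gr)
    # Owned cells sit at a known offset inside the shrinking padded array.
    left = gl[g - s - 1: g - s - 1 + n // 2]
    right = gr[g - s - 1: g - s - 1 + n // 2]

result = np.concatenate((left, right))
```

With n_steps = 8 and g = 4 this needs only two exchanges where a one-cell halo would need eight, while producing the same field as the single-domain reference.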