
Final Technical Report

Description: In this final technical report, a summary of work is provided. Concepts were developed for a new statistical cloud parameterization suitable for inclusion in global climate models. These concepts were evaluated by comparison to ARM data and to data from cloud resolving models driven by ARM data. The purpose of this grant was to develop a new cloud parameterization for the global climate model of the Geophysical Fluid Dynamics Laboratory (GFDL) of the National Oceanic and Atmospheric Administration (NOAA). Note that uncertainties in cloud parameterizations are a key reason why prediction of climate change from climate models remains unacceptably uncertain. To develop the parameterizations, the observations and models provided by the Department of Energy's Atmospheric Radiation Measurement (ARM) program were analyzed and used.
Date: June 23, 2003
Creator: Klein, Stephen A.
Partner: UNT Libraries Government Documents Department

Using stochastically-generated subcolumns to represent cloud structure in a large-scale model

Description: A new method for representing subgrid-scale cloud structure, in which each model column is decomposed into a set of subcolumns, has been introduced into the Geophysical Fluid Dynamics Laboratory's global climate model AM2. Each subcolumn in the decomposition is homogeneous, but the ensemble reproduces the initial profiles of cloud properties including cloud fraction, internal variability (if any) in cloud condensate, and arbitrary overlap assumptions that describe vertical correlations. These subcolumns are used in radiation and diagnostic calculations, and have allowed the introduction of more realistic overlap assumptions. This paper describes the impact of these new methods for representing cloud structure in instantaneous calculations and long-term integrations. Shortwave radiation computed using subcolumns and the random overlap assumption differs in the global annual average by more than 4 W/m² from the operational radiation scheme in instantaneous calculations; much of this difference is counteracted by a change in the overlap assumption to one in which overlap varies continuously with the separation distance between layers. Internal variability in cloud condensate, diagnosed from the mean condensate amount and cloud fraction, has about the same effect on radiative fluxes as does the ad hoc tuning that accounts for this effect in the operational radiation scheme. Long simulations with the new model configuration show little difference from the operational model configuration, and statistical tests indicate that the model does not respond systematically to the sampling noise introduced by the approximate radiative transfer techniques used with the subcolumns.
Date: December 8, 2005
Creator: Pincus, R; Hemler, R & Klein, S A
Partner: UNT Libraries Government Documents Department
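
The subcolumn decomposition described above can be sketched for the simplest case, the random overlap assumption, in which each layer is sampled independently; maximum-random or separation-distance overlap would instead correlate the uniform draws between adjacent layers. This is a minimal illustration, not the AM2 generator itself, and all values are hypothetical:

```python
import numpy as np

def generate_subcolumns(cloud_fraction, n_subcolumns=100, seed=0):
    """Draw binary cloudy/clear subcolumns whose ensemble mean reproduces
    the input cloud-fraction profile. Random overlap: each layer is
    sampled independently (illustrative sketch, not the AM2 generator)."""
    rng = np.random.default_rng(seed)
    # One independent uniform draw per (subcolumn, layer) -> random overlap
    u = rng.random((n_subcolumns, len(cloud_fraction)))
    return u < np.asarray(cloud_fraction)  # True where the subcolumn is cloudy

# The ensemble mean over many subcolumns approximates the input profile.
profile = [0.2, 0.5, 0.1]
cols = generate_subcolumns(profile, n_subcolumns=20000)
print(np.round(cols.mean(axis=0), 2))  # close to [0.2, 0.5, 0.1]
```

Correlating the uniforms between adjacent layers (rather than redrawing them) is what turns this into a maximum or decorrelation-length overlap generator.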

Final Technical Report

Description: In this final technical report, a summary of work is provided. The work aimed at an improved representation of frontal clouds in global climate models. It involved analysis of cloud variability in ARM observations and careful contrast of single column model solutions with ARM data. In addition, high resolution simulations of frontal clouds were employed to diagnose processes that are important for the development of frontal clouds.
Date: October 27, 2005
Creator: Klein, Stephen A.
Partner: UNT Libraries Government Documents Department

How might a statistical cloud scheme be coupled to a mass-flux convection scheme?

Description: The coupling of statistical cloud schemes with mass-flux convection schemes is addressed. Source terms representing the impact of convection are derived within the framework of prognostic equations for the width and asymmetry of the probability distribution function of total water mixing ratio. The accuracy of these source terms is quantified by examining output from a cloud resolving model simulation of deep convection. Practical suggestions for inclusion of these source terms in large-scale models are offered.
Date: September 27, 2004
Creator: Klein, Stephen A.; Pincus, Robert; Hannay, Cecile & Xu, Kuan-man
Partner: UNT Libraries Government Documents Department
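
The coupling described above can be given a one-line sketch. The form below, in which detrained cloudy air broadens the total-water PDF in proportion to the squared moisture contrast, is a commonly used illustrative choice and not necessarily the exact source terms derived in the paper; every name and number is hypothetical:

```python
def variance_source(D, rho, q_detrain, q_mean):
    """Hypothetical convective source term for total-water variance:
    air with mixing ratio q_detrain (kg/kg) detrained at rate
    D (kg m^-3 s^-1) into an environment of mean q_mean (kg/kg) and
    density rho (kg m^-3) broadens the PDF in proportion to the
    squared contrast. Illustrative form only."""
    return (D / rho) * (q_detrain - q_mean) ** 2

# Moist detrained air in a drier environment widens the distribution
print(variance_source(D=1e-3, rho=1.0, q_detrain=0.018, q_mean=0.012))
```

An analogous odd-moment term involving the cubed contrast would act as a source of asymmetry (skewness) in the same framework.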

Using cloud resolving model simulations of deep convection to inform cloud parameterizations in large-scale models

Description: Cloud parameterizations in large-scale models struggle to address the significant non-linear effects of radiation and precipitation that arise from horizontal inhomogeneity in cloud properties at scales smaller than the grid box size of the large-scale models. Statistical cloud schemes provide an attractive framework to self-consistently predict the horizontal inhomogeneity in radiation and microphysics because the probability distribution function (PDF) of total water contained in the scheme can be used to calculate these non-linear effects. Statistical cloud schemes were originally developed for boundary layer studies, so extending them to a global model with many different environments is not straightforward. For example, deep convection creates abundant cloudiness, and yet little is known about how deep convection alters the PDF of total water or how to parameterize these impacts. These issues are explored with data from a 29-day simulation by a cloud resolving model (CRM) of the July 1997 ARM Intensive Observing Period at the Southern Great Plains site. The simulation is used to answer two questions: (a) how well can the beta distribution represent the PDFs of total water relative to saturation resolved by the CRM? (b) how can the effects of convection on the PDF be parameterized? In addition to answering these questions, additional sections more fully describe the proposed statistical cloud scheme and the CRM simulation and analysis methods.
Date: June 23, 2003
Creator: Klein, Stephen A.; Pincus, Robert & Xu, Kuan-man
Partner: UNT Libraries Government Documents Department
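
Question (a) above hinges on integrating a beta PDF of total water above saturation to obtain cloud fraction. A minimal sketch, assuming total water has been normalized to [0, 1] and using midpoint integration; the paper's actual fitting procedure is not reproduced here:

```python
import math

def beta_pdf(x, p, q):
    """Beta(p, q) density on [0, 1]."""
    norm = math.gamma(p + q) / (math.gamma(p) * math.gamma(q))
    return norm * x ** (p - 1) * (1 - x) ** (q - 1)

def cloud_fraction(p, q, s_sat, n=10000):
    """Cloud fraction = probability that normalized total water exceeds
    the saturation point s_sat, via midpoint-rule integration."""
    if s_sat <= 0:
        return 1.0
    if s_sat >= 1:
        return 0.0
    dx = (1 - s_sat) / n
    return sum(beta_pdf(s_sat + (i + 0.5) * dx, p, q) for i in range(n)) * dx

# Symmetric Beta(2, 2): half the mass lies above the midpoint.
print(round(cloud_fraction(2, 2, 0.5), 3))  # → 0.5
```

The same integral weighted by (s - s_sat) would give the in-cloud condensate, which is what makes the scheme self-consistent between cloud amount and water content.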

TRNSYS for windows packages

Description: TRNSYS 14.1 was released in 1994. This package represents a significant step forward in usability due to several graphical utility programs for DOS. These programs include TRNSHELL, which encapsulates TRNSYS functions; PRESIM, which allows the graphical creation of a simulation system; and TRNSED, which allows the easy sharing of simulations. The increase in usability leads to a decrease in the time necessary to prepare a simulation. Most TRNSYS users operate PC computers with the Windows operating system. Therefore, the next logical step in increased usability was to port the current TRNSYS package to the Windows operating system. Several organizations worked on this conversion, which has resulted in two distinct Windows packages. One package closely resembles the DOS version and includes TRNSHELL for Windows and PRESIM for Windows. The other package incorporates a general simulation tool front-end called IISIBat. 8 figs.
Date: September 1, 1996
Creator: Blair, N.J.; Beckman, W.A.; Klein, S.A. & Mitchell, J.W.
Partner: UNT Libraries Government Documents Department

Impact of a solar domestic hot water demand-side management program on an electric utility and its customers

Description: A methodology to assess the economic and environmental impacts of a large-scale implementation of solar domestic hot water (SDHW) systems is developed. Energy, emission, and demand reductions and their respective savings are quantified. It is shown that, on average, an SDHW system provides an energy reduction of about 3200 kWh, avoided emissions of about 2 tons, and a capacity contribution of 0.7 kW to a typical Wisconsin utility that installs 5,000 SDHW systems. The annual savings to the utility from these reductions are $385,000, providing a return on investment of over 20%. It is shown that, on average, a consumer will save $211 annually in hot water heating bills. 8 refs., 7 figs.
Date: September 1, 1996
Creator: Trzeniewski, J.; Mitchell, J.W.; Klein, S.A. & Beckman, W.A.
Partner: UNT Libraries Government Documents Department
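
The per-system figures quoted in the abstract can be aggregated with simple arithmetic; the per-system utility savings line at the end is a derived quantity, not a number stated in the abstract:

```python
# Figures per installed system, as quoted in the abstract
n_systems = 5000
energy_kwh = 3200        # annual energy reduction per system (kWh)
capacity_kw = 0.7        # capacity (demand) contribution per system (kW)
utility_savings = 385_000  # total annual utility savings ($, from the abstract)

total_energy = n_systems * energy_kwh      # aggregate kWh avoided per year
total_capacity = n_systems * capacity_kw   # aggregate capacity contribution (kW)
savings_per_system = utility_savings / n_systems  # derived, not in the abstract

print(total_energy, total_capacity, savings_per_system)  # 16000000 3500.0 77.0
```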

Diagnosis of the summertime warm and dry bias over the U. S. Southern Great Plains in the GFDL climate model using a weather forecasting approach

Description: Weather forecasts started from realistic initial conditions are used to diagnose the large warm and dry bias over the United States Southern Great Plains simulated by the GFDL climate model. The forecasts exhibit biases in surface air temperature and precipitation within 3 days which appear to be similar to the climate bias. With the model simulating realistic evaporation but underestimated precipitation, a deficit in soil moisture results which amplifies the initial temperature bias through feedbacks with the land surface. The underestimate of precipitation is associated with an inability of the model to simulate the eastward propagation of convection from the front range of the Rocky Mountains and is insensitive to an increase of horizontal resolution from 2° to 0.5° latitude.
Date: July 11, 2006
Creator: Klein, S A; Jiang, X; Boyle, J; Malyshev, S & Xie, S
Partner: UNT Libraries Government Documents Department

Tensegrity and its role in guiding engineering sciences in the development of bio-inspired materials.

Description: Tensegrity is the word coined by Buckminster Fuller as a contraction of tensional integrity. A tensegrity system is established when a set of discontinuous compressive components interacts with a set of continuous tensile components to define a stable volume in space. Tensegrity structures are mechanically stable not because of the strength of individual members but because of the way the entire structure distributes and balances mechanical loads. Tensile forces naturally transmit themselves over the shortest distance between two points, so the members of a tensegrity system are precisely positioned to best withstand stress. Thus, tensegrity systems offer a maximum amount of strength for a given amount of material. Man-made structures have traditionally been designed to avoid developing large tensile stresses. In contrast, nature always uses a balance of tension and compression. Tensegrity principles apply at essentially every size-scale in the human body. Macroscopically, the bones that constitute our skeleton are pulled up against the force of gravity and stabilized in a vertical form by the pull of tensile muscles, tendons and ligaments. Microscopically, a tensegrity structure has been proposed for the skeleton of cells. This report contains the results of a feasibility study and literature survey to explore the potential of applying tensegrity principles in designing materials with desired functionalities. The goal is to assess if further study of the principles of tensegrity may be exploited as an avenue for producing new materials that have intrinsic capabilities for adapting to changing loads (self-healing), as with the ongoing reconstruction of living bone under loading. This study contains a collection of literature that has been categorized into the areas of structures, mathematics, mechanics, and biology. The topics addressed in each area are discussed.
Ultimately, we conclude that because tensegrity is fundamentally a description of structure, it may prove useful for describing existing ...
Date: January 1, 2004
Creator: Pierce, David M.; Chen, Er-Ping & Klein, Patrick A.
Partner: UNT Libraries Government Documents Department

Long-term Observations of the Convective Boundary Layer Using Insect Radar Returns at the SGP ARM Climate Research Facility

Description: A long-term study of the turbulent structure of the convective boundary layer (CBL) at the U.S. Department of Energy Atmospheric Radiation Measurement Program (ARM) Southern Great Plains (SGP) Climate Research Facility is presented. Doppler velocity measurements from insects occupying the lowest 2 km of the boundary layer during summer months are used to map the vertical velocity component in the CBL. The observations cover four summer periods (2004-08) and are classified into cloudy and clear boundary layer conditions. Profiles of vertical velocity variance, skewness, and mass flux are estimated to study the daytime evolution of the convective boundary layer during these conditions. A conditional sampling method is applied to the original Doppler velocity dataset to extract coherent vertical velocity structures and to examine plume dimension and contribution to the turbulent transport. Overall, the derived turbulent statistics are consistent with previous aircraft and lidar observations. The observations provide unique insight into the daytime evolution of the convective boundary layer and the role of increased cloudiness in the turbulent budget of the subcloud layer. Coherent structures (plumes-thermals) are found to be responsible for more than 80% of the total turbulent transport resolved by the cloud radar system. The extended dataset is suitable for evaluating boundary layer parameterizations and testing large-eddy simulations (LESs) for a variety of surface and cloud conditions.
Date: August 20, 2009
Creator: Chandra, A S; Kollias, P; Giangrande, S E & Klein, S A
Partner: UNT Libraries Government Documents Department
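
The profile statistics described above can be sketched for a single height, assuming a simple fixed-threshold updraft mask in place of the paper's conditional sampling method; the threshold and density values are illustrative, not the study's:

```python
import numpy as np

def cbl_statistics(w, rho=1.0, w_thresh=0.5):
    """Vertical-velocity statistics at one height from a Doppler series
    w (m/s): variance, skewness, coherent-updraft fraction, and updraft
    mass flux. The threshold mask is a simplified stand-in for the
    paper's conditional sampling of coherent structures."""
    wp = w - w.mean()                       # perturbation about the mean
    var = np.mean(wp ** 2)                  # vertical velocity variance
    skew = np.mean(wp ** 3) / var ** 1.5    # skewness
    updraft = wp > w_thresh                 # coherent-updraft mask
    mass_flux = rho * np.mean(np.where(updraft, wp, 0.0))
    return var, skew, updraft.mean(), mass_flux

# Positively skewed w, as expected for narrow, strong daytime updrafts
rng = np.random.default_rng(0)
w = rng.exponential(1.0, 200000)
var, skew, frac, mf = cbl_statistics(w)
print(round(skew, 1))  # near 2 for an exponential distribution
```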

Role of eastward propagating convection systems in the diurnal cycle and seasonal mean summertime rainfall over the U. S. Great Plains

Description: By diagnosing the 3-hourly North American Regional Reanalysis rainfall dataset for the 1979-2003 period, it is illustrated that the eastward propagation of convection systems from the Rockies to the Great Plains plays an essential role for the warm season climate over the central U.S. This eastward propagating mode could be the deciding factor for the observed nocturnal rainfall peak over the Great Plains. The results also suggest that nearly half of the total summer mean rainfall over this region is associated with these propagating convection systems. For instance, the extreme wet condition of the 1993 summer may be attributed to the frequent occurrence of propagating convection events and enhanced diurnal rainfall amplitude over the Great Plains. Thus, proper representation of this important propagating component in GCMs is essential for simulating the diurnal and seasonal mean characteristics of summertime rainfall over the central US.
Date: June 7, 2006
Creator: Jiang, X; Lau, N C & Klein, S A
Partner: UNT Libraries Government Documents Department

Critical Evaluation of the ISCCP Simulator Using Ground-Based Remote Sensing Data

Description: Given the known shortcomings in representing clouds in Global Climate Models (GCMs), comparisons with observations are critical. The International Satellite Cloud Climatology Project (ISCCP) diagnostic products provide global descriptions of cloud top pressure and column optical depth that extend over multiple decades. The necessary limitations of the ISCCP retrieval algorithm require that, before comparisons can be made between model output and ISCCP results, the model output must be modified to simulate what ISCCP would diagnose under the simulated circumstances. We evaluate one component of the so-called ISCCP simulator in this study by comparing ISCCP and a similar algorithm with various long-term statistics derived from the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Climate Research Facility ground-based remote sensors. We find that, were a model to simulate the cloud radiative profile with the same accuracy as can be derived from the ARM data, the likelihood of that occurrence being placed in the same of the 9 standard cloud top pressure and optical depth bins as ISCCP ranges from 30% to 70%, depending on optical depth. While the ISCCP simulator improved the agreement of cloud-top pressure between ground-based remote sensors and satellite observations, we find minor discrepancies due to the parameterization of cloud top pressure in the ISCCP simulator. The primary source of error seems to be related to discrepancies in visible optical depth that are not accounted for in the ISCCP simulator. We show that the optical depth discrepancies are largest when the assumptions necessary for plane-parallel radiative transfer optical depth retrievals are violated.
Date: November 2, 2009
Creator: Mace, G G; Houser, S; Benson, S; Klein, S A & Min, Q
Partner: UNT Libraries Government Documents Department
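
The 9 bins referred to above come from the standard ISCCP joint histogram, which partitions cloud top pressure at 440 and 680 hPa and optical depth at 3.6 and 23. A sketch of that classification step:

```python
def isccp_bin(cloud_top_pressure_hpa, optical_depth):
    """Place a cloudy scene into one of the 9 standard ISCCP categories,
    using the conventional 440/680 hPa pressure boundaries and the
    3.6/23 optical-depth boundaries."""
    level = ("high" if cloud_top_pressure_hpa < 440
             else "middle" if cloud_top_pressure_hpa < 680
             else "low")
    thickness = ("thin" if optical_depth < 3.6
                 else "medium" if optical_depth < 23
                 else "thick")
    return level, thickness

print(isccp_bin(300, 30))   # ('high', 'thick') — e.g. deep convective cloud
print(isccp_bin(800, 10))   # ('low', 'medium') — e.g. stratocumulus
```

The 30% to 70% agreement statistic in the abstract is the probability that the ARM-derived and ISCCP retrievals land in the same one of these 9 categories.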

Comparison of parameterized cloud variability to ARM data.

Description: Cloud parameterizations in large-scale models often try to predict the amount of sub-grid scale variability in cloud properties to address the significant non-linear effects of radiation and precipitation. Statistical cloud schemes provide an attractive framework to self-consistently predict the variability in radiation and microphysics but require accurate predictions of the width and asymmetry of the distribution of cloud properties. Data from the Atmospheric Radiation Measurement (ARM) program are used to assess the variability in boundary layer cloud properties for a well-mixed stratocumulus observed at the Oklahoma ARM site during the March 2000 Intensive Observing Period. Cloud boundaries, liquid water content, and liquid water path are retrieved from the millimeter wavelength cloud radar and the microwave radiometer. Balloon soundings, aircraft data, and satellite observations provide complementary views on the horizontal cloud inhomogeneity. It is shown that the width of the liquid water path probability distribution function is consistent with a model in which horizontal fluctuations in liquid water content are vertically coherent throughout the depth of the cloud. Variability in cloud base is overestimated by this model, however, perhaps because an additional assumption that the variance of total water is constant with altitude throughout the depth of the boundary layer is incorrect.
Date: June 23, 2003
Creator: Klein, Stephen A. & Norris, Joel R.
Partner: UNT Libraries Government Documents Department
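
The vertically coherent fluctuation model tested above can be sketched as a single multiplicative perturbation applied to the whole liquid-water-content profile, which makes the relative spread of liquid water path equal the relative spread at each level. The profile values and the 30% relative spread below are hypothetical:

```python
import numpy as np

def lwp_samples(lwc_profile, dz, rel_sigma=0.3, n=100000, seed=1):
    """Sample liquid water paths assuming horizontal LWC fluctuations are
    vertically coherent: one multiplicative factor perturbs every level
    at once. A sketch of the consistency check described, not the
    paper's retrieval code; all inputs are illustrative."""
    rng = np.random.default_rng(seed)
    factor = 1.0 + rel_sigma * rng.standard_normal(n)   # coherent fluctuation
    base_lwp = np.sum(np.asarray(lwc_profile)) * dz     # column integral
    return np.clip(factor, 0.0, None) * base_lwp        # no negative water

lwp = lwp_samples([0.1, 0.3, 0.2], dz=50.0)
# Coherence implies the relative LWP spread matches the per-level spread
print(round(lwp.std() / lwp.mean(), 2))
```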

GFDL ARM Project Technical Report: Using ARM Observations to Evaluate Cloud and Convection Parameterizations & Cloud-Convection-Radiation Interactions in the GFDL Atmospheric General Circulation Model

Description: This report briefly summarizes the progress made by ARM postdoctoral fellow Yanluan Lin at GFDL during the period from October 2008 to the present. Several ARM datasets have been used for GFDL model evaluation, understanding, and improvement. This includes a new ice fall speed parameterization with riming impact and its test in GFDL AM3, evaluation of model cloud and radiation diurnal and seasonal variation using ARM CMBE data, model ice water content evaluation using ARM cirrus data, and coordination of the TWP-ICE global model intercomparison. The work illustrates the potential and importance of ARM data for GCM evaluation, understanding, and ultimately, improvement of GCM cloud and radiation parameterizations. Future work includes evaluation and improvement of the new dynamics-PDF cloud scheme and aerosol activation in the GFDL model.
Date: June 17, 2010
Creator: Ramaswamy, V.; Donner, L. J.; Golaz, J-C. & Klein, S. A.
Partner: UNT Libraries Government Documents Department

Physics-based Modeling of Brittle Fracture: Cohesive Formulations and the Application of Meshfree Methods

Description: Simulation of generalized fracture and fragmentation remains an ongoing challenge in computational fracture mechanics. There are difficulties associated not only with the formulation of physically-based models of material failure, but also with the numerical methods required to treat geometries that change in time. The issue of fracture criteria is addressed in this work through a cohesive view of material, meaning that a finite material strength and work to fracture are included in the material description. In this study, we present both surface and bulk cohesive formulations for modeling brittle fracture, detailing the derivation of the formulations, fitting relations, and providing a critical assessment of their capabilities in numerical simulations of fracture. Due to their inherent adaptivity and robustness under severe deformation, meshfree methods are especially well-suited to modeling fracture behavior. We describe the application of meshfree methods to both bulk and surface approaches to cohesive modeling. We present numerical examples highlighting the capabilities and shortcomings of the methods in order to identify which approaches are best-suited to modeling different types of fracture phenomena.
Date: December 1, 2000
Creator: Klein, P. A.; Foulk, J. W.; Chen, E. P.; Wimmer, S. A. & Gao, H.
Partner: UNT Libraries Government Documents Department
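
A minimal sketch of a surface cohesive description of the kind discussed above: a traction-separation law with a finite cohesive strength and work to fracture. This uses a generic Xu-Needleman-type exponential form, not necessarily the formulation in the report, and the strength and length-scale values are illustrative:

```python
import math

def cohesive_traction(delta, sigma_max, delta_0):
    """Exponential (Xu-Needleman-type) normal traction-separation law:
    traction rises to the cohesive strength sigma_max at separation
    delta_0, then decays. The area under the curve is the work of
    fracture, G_c = e * sigma_max * delta_0."""
    x = delta / delta_0
    return sigma_max * x * math.exp(1.0 - x)

sigma_max, delta_0 = 100.0, 0.01   # illustrative strength and length scale
print(cohesive_traction(delta_0, sigma_max, delta_0))  # peak traction: 100.0
```

Embedding such a law on element surfaces gives the "surface" approach; smearing the same strength and work to fracture over a band of material gives the "bulk" approach the abstract contrasts with it.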

A three-dimensional validation of crack curvature in muscovite mica

Description: Experimental and computational efforts focused on characterizing crack tip curvature in muscovite mica. Wedge-driven cracks were propagated under monochromatic light. Micrographs verified the subtle curvature of the crack front near the free surface. A cohesive approach was employed to model mixed-mode fracture in a three-dimensional framework. Finite element calculations captured the crack curvature observed in experiment.
Date: January 7, 2001
Creator: Hill, J. C.; Foulk, J. W., III; Klein, P. A. & Chen, E. P.
Partner: UNT Libraries Government Documents Department

An investigation of photovoltaic powered pumps in direct solar domestic hot water systems

Description: The performance of photovoltaic powered pumps in direct solar domestic hot water (PV-SDHW) systems has been studied. The direct PV-SDHW system employs a photovoltaic array, a separately excited DC motor, a centrifugal pump, a thermal collector, and a storage tank. A search methodology for an optimum PV-SDHW system configuration has been proposed. A comparison is made between the long-term performance of a PV-SDHW system and a conventional SDHW system operating under three control schemes. The schemes include an ON-OFF flow controlled SDHW system operating at the manufacturer-recommended constant flow rate, and a linear proportional flow controlled SDHW system with the flow proportional to the solar radiation, operating under an optimum proportionality. 13 refs., 6 figs.
Date: September 1, 1996
Creator: Al-Ibrahim, A.M.; Klein, S.A.; Mitchell, J.W. & Beckman, W.A.
Partner: UNT Libraries Government Documents Department
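
The control schemes compared above can be sketched as two flow laws; the start threshold, proportionality constant, and flow-rate values below are hypothetical, not taken from the study:

```python
def on_off_flow(irradiance, threshold=200.0, rated_flow=0.05):
    """ON-OFF control: run the pump at a constant rated flow (kg/s)
    whenever irradiance (W/m^2) exceeds a start threshold.
    All parameter values are illustrative."""
    return rated_flow if irradiance > threshold else 0.0

def proportional_flow(irradiance, k=5e-5, max_flow=0.05):
    """Linear proportional control: flow rate scales with irradiance,
    capped at the pump maximum. The proportionality k is illustrative;
    the study tunes it to an optimum value."""
    return min(k * irradiance, max_flow)

print(on_off_flow(600.0), proportional_flow(600.0))  # 0.05 0.03
```

In the PV-driven case no controller is needed at all: the pump speed, and hence the flow, follows the array output and thus the irradiance naturally, which is the behavior the proportional scheme emulates.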

A micromechanical basis for partitioning the evolution of grainbridging in brittle materials

Description: A micromechanical model is developed for grain bridging in monolithic ceramics. Specifically, bridge formation of a single, non-equiaxed grain spanning adjacent grains is addressed. A cohesive zone framework enables crack initiation and propagation along grain boundaries. The evolution of the bridge is investigated through a variance in both grain angle and aspect ratio. We propose that the bridging process can be partitioned into five distinct regimes of resistance: propagate, kink, arrest, stall, and bridge. Although crack propagation and kinking are well understood, crack arrest and subsequent "stall" have been largely overlooked. Resistance during the stall regime exposes large volumes of microstructure to stresses well in excess of the grain boundary strength. Bridging can occur through continued propagation or reinitiation ahead of the stalled crack tip. The driving force required to reinitiate is substantially greater than the driving force required to kink. In addition, the critical driving force to reinitiate is sensitive to grain aspect ratio but relatively insensitive to grain angle. The marked increase in crack resistance occurs prior to bridge formation and provides an interpretation for the rapidly rising resistance curves which govern the strength of many brittle materials at realistically small flaw sizes.
Date: October 9, 2006
Creator: Foulk III, J.W.; Cannon, R.M.; Johnson, G.C.; Klein, P.A. & Ritchie, R.O.
Partner: UNT Libraries Government Documents Department

On the toughening of brittle materials by grain bridging:promoting intergranular fracture through grain angle, strength, andtoughness

Description: The structural reliability of many brittle materials such as structural ceramics relies on the occurrence of intergranular, as opposed to transgranular, fracture in order to induce toughening by grain bridging. For a constant grain boundary strength and grain boundary toughness, the current work examines the role of grain strength, grain toughness, and grain angle in promoting intergranular fracture in order to maintain such toughening. Previous studies have illustrated that an intergranular path and the consequent grain bridging process can be partitioned into five distinct regimes, namely: propagate, kink, arrest, stall and bridge. To determine the validity of the assumed intergranular path, the classical penetration/deflection problem of a crack impinging on an interface is reexamined within a cohesive zone framework for intergranular and transgranular fracture. Results considering both modes of propagation, i.e., a transgranular and an intergranular path, reveal that crack-tip shielding is a natural outcome of the cohesive zone approach to fracture. Cohesive zone growth in one mode shields the opposing mode from the stresses required for cohesive zone initiation. Although stable propagation occurs when the required driving force is equivalent to the toughness for either transgranular or intergranular fracture, the mode of propagation depends on the normalized grain strength, normalized grain toughness, and grain angle. For each grain angle, the intersection of single-path and multiple-path solutions demarcates "strong" grains that increase the macroscopic toughness and "weak" grains that decrease it. The unstable transition to intergranular fracture reveals that an increasing grain toughness requires a growing region of the transgranular cohesive zone be at and near the peak cohesive strength. The inability of the body to provide the requisite stress field yields an overdriven and unstable configuration.
The current results provide restrictions for the achievement of substantial toughening through intergranular fracture.
Date: November 15, 2007
Creator: Foulk III, J.W.; Johnson, G.C.; Klein, P.A. & Ritchie, R.O.
Partner: UNT Libraries Government Documents Department

Oak Ridge Isotope Products and Services - Current and Expected Supply and Demand

Description: Oak Ridge National Laboratory (ORNL) has been a major center of isotope production research, development, and distribution for over 50 years. Currently, the major isotope production activities include (1) the production of transuranium element radioisotopes, including ²⁵²Cf; (2) the production of medical and industrial radioisotopes; (3) maintenance and expansion of the capabilities for production of enriched stable isotopes; and (4) preparation of a wide range of custom-order chemical and physical forms of isotope products, particularly in accelerator physics research. The recent supply of and demand for isotope products and services in these areas, research and development (R&D), and the capabilities for future supply are described in more detail below. The keys to continuing the supply of these important products and services are the maintenance, improvement, and potential expansion of specialized facilities, including (1) the High Flux Isotope Reactor (HFIR), (2) the Radiochemical Engineering Development Center (REDC) and Radiochemical Development Laboratory (RDL) hot cell facilities, (3) the electromagnetic calutron mass separators and the plasma separation process equipment for isotope enrichment, and (4) the Isotope Research Materials Laboratory (IRML) equipment for preparation of specialized chemical and physical forms of isotope products. The status and plans for these ORNL isotope production facilities are also described below.
Date: August 29, 1999
Creator: Aaron, W.S.; Alexander, C.W.; Cline, R.L.; Collins, E.D.; Klein, J.A.; Knauer, J.B., Jr. et al.
Partner: UNT Libraries Government Documents Department

Modeling aerosol-cloud interactions with a self-consistent cloud scheme in a general circulation model

Description: This paper describes a self-consistent prognostic cloud scheme that is able to predict cloud liquid water, amount, and droplet number (N_d) from the same updraft velocity field, and is suitable for modeling aerosol-cloud interactions in general circulation models (GCMs). In the scheme, the evolution of droplets fully interacts with the model meteorology. An explicit treatment of cloud condensation nuclei (CCN) activation allows the scheme to take into account the contributions to N_d of multiple types of aerosol (i.e., sulfate, organic, and sea-salt aerosols) and kinetic limitations of the activation process. An implementation of the prognostic scheme in the Geophysical Fluid Dynamics Laboratory (GFDL) AM2 GCM yields a vertical distribution of N_d with maxima in the lower troposphere, differing from that obtained by diagnosing N_d empirically from sulfate mass concentrations. As a result, the agreement of model-predicted present-day cloud parameters with satellite measurements is improved compared to using diagnosed N_d. The simulations with pre-industrial and present-day aerosols show that the combined first and second indirect effects of anthropogenic sulfate and organic aerosols give rise to a global annual mean flux change of -1.8 W m⁻², consisting of -2.0 W m⁻² in shortwave and 0.2 W m⁻² in longwave, as the model response alters the cloud field and, subsequently, the longwave radiation. Liquid water path (LWP) and total cloud amount increase by 19% and 0.6%, respectively. Largely owing to high sulfate concentrations from fossil fuel burning, the Northern Hemisphere mid-latitude land and oceans experience strong cooling. So does tropical land, which is dominated by biomass-burning organic aerosol. The Northern/Southern Hemisphere and land/ocean ratios are 3.1 and 1.4, respectively.
The calculated annual zonal mean flux changes are determined to be statistically significant, exceeding the model's natural variations in the NH low and mid-latitudes and in ...
Date: May 2, 2005
Creator: Ming, Y; Ramaswamy, V; Donner, L J; Phillips, V T; Klein, S A; Ginoux, P A et al.
Partner: UNT Libraries Government Documents Department
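
The flux decomposition quoted above can be verified directly:

```python
# The abstract's decomposition of the combined indirect effect
shortwave = -2.0          # W m^-2, first + second indirect effects
longwave = 0.2            # W m^-2, response of the altered cloud field
net = shortwave + longwave
print(round(net, 1))      # -1.8 W m^-2, the quoted global annual mean
```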

Modeling the coupled mechanics, transport, and growth processes in collagen tissues.

Description: The purpose of this project is to develop tools to model and simulate the processes of self-assembly and growth in biological systems from the molecular to the continuum length scales. The model biological system chosen for the study is the tendon fiber, which is composed mainly of Type I collagen fibrils. The macroscopic processes of self-assembly and growth at the fiber scale arise from microscopic processes at the fibrillar and molecular length scales. At these nanoscopic length scales, we employed molecular modeling and simulation methods to characterize the mechanical behavior and stability of the collagen triple helix and the collagen fibril. To obtain the physical parameters governing mass transport in the tendon fiber, we performed direct numerical simulations of fluid flow and solute transport through an idealized fibrillar microstructure. At the continuum scale, we developed a mixture theory approach for modeling the coupled processes of mechanical deformation, transport, and species inter-conversion involved in growth. In the mixture theory approach, the microstructure of the tissue is represented by the species concentration and transport and material parameters, obtained from fibril- and molecular-scale calculations, while the mechanical deformation, transport, and growth processes are governed by balance laws and constitutive relations developed within a thermodynamically consistent framework.
Date: November 1, 2006
Creator: Holdych, David J.; Nguyen, Thao D.; Klein, Patrick A.; in't Veld, Pieter J. & Stevens, Mark Jackson
Partner: UNT Libraries Government Documents Department

A New Ensemble of Perturbed-Input-Parameter Simulations by the Community Atmosphere Model

Description: Uncertainty quantification (UQ) is a fundamental challenge in the numerical simulation of Earth's weather and climate, and other complex systems. It entails much more than attaching defensible error bars to predictions: in particular it includes assessing low-probability but high-consequence events. To achieve these goals with models containing a large number of uncertain input parameters, structural uncertainties, etc., raw computational power is needed. An automated, self-adapting search of the possible model configurations is also useful. Our UQ initiative at the Lawrence Livermore National Laboratory has produced the most extensive set to date of simulations from the US Community Atmosphere Model. We are examining output from about 3,000 twelve-year climate simulations generated with a specialized UQ software framework, and assessing the model's accuracy as a function of 21 to 28 uncertain input parameter values. Most of the input parameters we vary are related to the boundary layer, clouds, and other sub-grid scale processes. Our simulations prescribe surface boundary conditions (sea surface temperatures and sea ice amounts) to match recent observations. Fully searching this 21+ dimensional space is impossible, but sensitivity and ranking algorithms can identify input parameters having relatively little effect on a variety of output fields, either individually or in nonlinear combination. Bayesian statistical constraints, employing a variety of climate observations as metrics, also seem promising. Observational constraints will be important in the next step of our project, which will compute sea surface temperatures and sea ice interactively, and will study climate change due to increasing atmospheric carbon dioxide.
Date: October 27, 2011
Creator: Covey, C; Brandon, S; Bremer, P T; Domyancis, D; Garaizar, X; Johannesson, G et al.
Partner: UNT Libraries Government Documents Department
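
The abstract does not state how the 21+ dimensional parameter space was sampled; Latin hypercube sampling is a common choice for such perturbed-parameter ensembles and is sketched here as an assumption, with an illustrative three-parameter space:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin hypercube sample of a parameter space: each dimension is
    stratified into n_samples equal bins, with one sample per bin, and
    the bin orderings are shuffled independently per dimension.
    (Assumption: the study's actual sampler is not specified.)"""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)   # shape (n_params, 2)
    d = len(bounds)
    # One permutation of bin indices per dimension, jittered within bins
    strata = rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
    u = (strata + rng.random((n_samples, d))) / n_samples   # in [0, 1)
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# Three illustrative parameters, each scaled to its own range
samples = latin_hypercube(8, [(0, 1), (10, 20), (-1, 1)])
print(samples.shape)  # (8, 3)
```

Stratifying each dimension guarantees coverage of every parameter's full range even with far fewer runs than a dense grid would need, which is the point of such designs in a 21+ dimensional space.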