78 Matching Results


Financial Innovation Among the Community Wind Sector in the United States

Description: In the relatively brief history of utility-scale wind generation, the 'community wind' sector - defined here as consisting of relatively small utility-scale wind power projects that are at least partly owned by one or more members of the local community - has played a vitally important role as a 'test bed' or 'proving ground' for wind turbine manufacturers. In the 1980s and 1990s, for example, Vestas and other now-established European wind turbine manufacturers relied heavily on community wind projects in Scandinavia and Germany to install - and essentially field-test - new turbine designs. The fact that orders from community wind projects seldom exceeded a few turbines at a time enabled the manufacturers to correct any design flaws or manufacturing defects fairly rapidly, and without the risk of extensive (and expensive) serial defects that can accompany larger orders. Community wind has been slower to take root in the United States - the first such projects were installed in the state of Minnesota around the year 2000. Just as in Europe, however, the community wind sector in the U.S. has served as a proving ground - but in this case for up-and-coming wind turbine manufacturers trying to break into the broader U.S. wind power market. For example, community wind projects have deployed the first U.S. installations of wind turbines from Suzlon (in 2003), DeWind (2008), Americas Wind Energy (2008) and later Emergya Wind Technologies (2010), Goldwind (2009), AAER/Pioneer (2009), Nordic Windpower (2010), Unison (2010), and Alstom (2011). Just as it has provided a proving ground for new turbines, so too has the community wind sector in the United States served as a laboratory for experimentation with innovative new financing structures. For example, a variation of one of the most common financing arrangements in the U.S. wind market ...
Date: January 19, 2011
Creator: Bolinger, Mark
Partner: UNT Libraries Government Documents Department

Persistence and transport potential of chemicals in a multimedia environment

Description: Persistence in the environment and potential for long-range transport are related since time in the environment is required for transport. A persistent chemical will travel longer distances than a reactive chemical that shares similar chemical properties. Scheringer (1997) has demonstrated the correlation between persistence and transport distance for different organic chemicals. However, this correlation is not sufficiently robust to predict one property from the other. Specific chemicals that are persistent may or may not exhibit long-range transport potential. Persistence and long-range transport also present different societal concerns. Persistence concerns relate to the undesired possibility that chemicals produced and used now may somehow negatively affect future generations. Long-range transport concerns relate to the undesired presence of chemicals in areas where these compounds have not been used. Environmental policy decisions can be based on either or both considerations depending on the aim of the regulatory program. In this chapter, definitions and methods for quantifying persistence and transport potential of organic chemicals are proposed which will assist in the development of sound regulatory frameworks.
Date: February 1, 2000
Creator: van de Meent, D.; McKone, T.E.; Parkerton, T.; Matthies, M.; Scheringer, M.; Wania, F. et al.
Partner: UNT Libraries Government Documents Department
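
The persistence-transport link described in the abstract above can be illustrated with a back-of-the-envelope calculation. This sketch is a hypothetical simplification (characteristic travel distance approximated as medium velocity times residence time), not the model proposed by the authors, and all numbers are invented for illustration:

```python
# Hedged illustration: a chemical's characteristic travel distance scales with
# the product of the carrying medium's velocity and the chemical's residence
# time (its persistence). All values below are hypothetical.

def travel_distance_km(wind_speed_m_s, residence_time_days):
    """Characteristic travel distance (km) ~ velocity x residence time."""
    return wind_speed_m_s * residence_time_days * 86_400 / 1_000  # s/day, m->km

# A persistent chemical (30 days) vs. a reactive one (2 days), same 4 m/s wind:
print(travel_distance_km(4.0, 30))  # persistent: travels thousands of km
print(travel_distance_km(4.0, 2))   # reactive: travels far less
```

This is why, as the abstract notes, the two properties correlate but cannot be predicted from one another: the velocity and pathway of the transport medium enter independently of the chemical's reactivity.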

A View on Future Building System Modeling and Simulation

Description: This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).
Date: April 1, 2011
Creator: Wetter, Michael
Partner: UNT Libraries Government Documents Department

Scanning Transmission X-ray Microscopy: Applications in Atmospheric Aerosol Research

Description: Scanning transmission x-ray microscopy (STXM) combines x-ray microscopy and near edge x-ray absorption fine structure spectroscopy (NEXAFS). This combination provides spatially resolved bonding and oxidation state information. While there are reviews relevant to STXM/NEXAFS applications in other environmental fields (and magnetic materials), this chapter focuses on atmospheric aerosols. It provides an introduction to this technique in a manner approachable to non-experts. It begins with relevant background information on synchrotron radiation sources and a description of NEXAFS spectroscopy. The bulk of the chapter provides a survey of STXM/NEXAFS aerosol studies and is organized according to the type of aerosol investigated. The purpose is to illustrate the current range and recent growth of scientific investigations employing STXM/NEXAFS to probe atmospheric aerosol morphology, surface coatings, mixing states, and atmospheric processing.
Date: January 20, 2011
Creator: Moffet, Ryan C.; Tivanski, Alexei V. & Gilles, Mary K.
Partner: UNT Libraries Government Documents Department

Bridging the Gap in the Chemical Thermodynamic Database for Nuclear Waste Repository: Studies of the Effect of Temperature on Actinide Complexation

Description: Recent results of thermodynamic studies on the complexation of actinides (UO₂²⁺, NpO₂⁺ and Pu⁴⁺) with F⁻, SO₄²⁻ and H₂PO₄⁻/HPO₄²⁻ at elevated temperatures are reviewed. The data indicate that, for all systems except the 1:1 complexation of Np(V) with HPO₄²⁻, the complexation of actinides is enhanced by the increase in temperature. The enhancement is primarily due to the increase in the entropy term (TΔS), which exceeds the increase in the enthalpy (ΔH) as the temperature is increased. These data bridge the gaps in the chemical thermodynamic database for the nuclear waste repository, where the temperature could remain significantly higher than 25 °C for a long time after the closure of the repository.
Date: December 21, 2009
Creator: Rao, Linfeng; Tian, Guoxin; Xia, Yuanxian; Friese, Judah I.; Zanonato, PierLuigi & Di Bernardo, Plinio
Partner: UNT Libraries Government Documents Department
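
The entropy-versus-enthalpy argument in the abstract above follows from the Gibbs relation ΔG = ΔH − TΔS. The sketch below uses purely hypothetical numbers (not data from the study) to show how, for an endothermic complexation, a TΔS term that grows faster than ΔH makes ΔG more negative at higher temperature, i.e., enhances complexation:

```python
# Hedged numeric sketch of dG = dH - T*dS. The values are invented solely to
# illustrate the qualitative trend described in the abstract.

def gibbs_free_energy(delta_h, delta_s, temp_k):
    """Gibbs free energy change (J/mol): dG = dH - T*dS."""
    return delta_h - temp_k * delta_s

# Hypothetical endothermic complexation: dH = +20 kJ/mol, dS = +120 J/(mol K)
delta_h = 20_000.0  # J/mol
delta_s = 120.0     # J/(mol K)

for temp_c in (25, 70, 100):
    t = temp_c + 273.15
    dg = gibbs_free_energy(delta_h, delta_s, t)
    print(f"{temp_c:>3} C: dG = {dg/1000:+.1f} kJ/mol")  # more negative as T rises
```

A more negative ΔG corresponds to a larger stability constant, which is the sense in which higher repository temperatures strengthen these complexes.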


Description: In this chapter we review the spectroscopic data for actinide molecules and the reaction dynamics for atomic and molecular actinides that have been examined in the gas phase or in inert cryogenic matrices. The motivation for this type of investigation is that physical properties and reactions can be studied in the absence of external perturbations (gas phase) or under minimally perturbing conditions (cryogenic matrices). This information can be compared directly with the results from high-level theoretical models. The interplay between experiment and theory is critically important for advancing our understanding of actinide chemistry. For example, elucidation of the role of the 5f electrons in bonding and reactivity can only be achieved through the application of experimentally verified theoretical models. Theoretical calculations for the actinides are challenging due to the large number of electrons that must be treated explicitly and the presence of strong relativistic effects. This topic has been reviewed in depth in Chapter 17 of this series. One of the goals of the experimental work described in this chapter has been to provide benchmark data that can be used to evaluate both empirical and ab initio theoretical models. While gas-phase data are the most suitable for comparison with theoretical calculations, there are technical difficulties entailed in generating workable densities of gas-phase actinide molecules that have limited the range of species that have been characterized. Many of the compounds of interest are refractory, and problems associated with the use of high temperature vapors have complicated measurements of spectra, ionization energies, and reactions. One approach that has proved to be especially valuable in overcoming this difficulty has been the use of pulsed laser ablation to generate plumes of vapor from refractory actinide-containing materials.
The vapor is entrained in an inert gas, which can be used to cool the actinide species to room ...
Date: February 1, 2009
Creator: Heaven, Michael C.; Gibson, John K. & Marcalo, Joaquim
Partner: UNT Libraries Government Documents Department

Quantitative DNA Fiber Mapping

Description: Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound by at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface, resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, and their relative distances can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
Date: January 28, 2008
Creator: Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F. & Weier, Heinz-Ulli G.
Partner: UNT Libraries Government Documents Department
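
The distance conversion at the heart of QDFM, as described in the abstract above, is a simple scaling by the homogeneous stretching factor of about 2.3 kb/µm. The sketch below is an illustration of that conversion under this assumption, not the authors' analysis code:

```python
# Minimal sketch of the QDFM micrometer-to-kilobase conversion, assuming the
# ~2.3 kb/um homogeneous stretching factor quoted in the description.

STRETCH_KB_PER_UM = 2.3  # stretching factor from the abstract

def microns_to_kb(distance_um):
    """Convert a measured inter-probe distance (um) to kilobase pairs (kb)."""
    return distance_um * STRETCH_KB_PER_UM

# e.g. two hybridization signals measured 10.0 um apart on a stretched fiber:
print(round(microns_to_kb(10.0), 2))  # → 23.0
```

In practice one would average over many fibers, since individual molecules deviate slightly from perfectly homogeneous stretching.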

The effect of temperature on the speciation of U(VI) in sulfate solutions

Description: Sulfate, one of the inorganic constituents that could be present in the nuclear waste repository, forms complexes with U(VI) and affects its migration in the environment. Results show that the complexation of U(VI) with sulfate is enhanced by the increase in temperature. The effect of temperature on the complexation and speciation of U(VI) in sulfate solutions is discussed.
Date: September 15, 2008
Creator: Rao, Linfeng & Tian, Guoxin
Partner: UNT Libraries Government Documents Department

Genomic Prospecting for Microbial Biodiesel Production

Description: Biodiesel is defined as fatty acid mono-alkyl esters and is produced from triacylglycerols. In the current article we provide an overview of the structure, diversity and regulation of the metabolic pathways leading to intracellular fatty acid and triacylglycerol accumulation in three types of organisms (bacteria, algae and fungi) of potential biotechnological interest and discuss possible intervention points to increase the cellular lipid content. The key steps that regulate carbon allocation and distribution in lipids include the formation of malonyl-CoA, the synthesis of fatty acids and their attachment onto the glycerol backbone, and the formation of triacylglycerols. The lipid biosynthetic genes and pathways are largely known for select model organisms. Comparative genomics allows the examination of these pathways in organisms of biotechnological interest and reveals the evolution of divergent and yet uncharacterized regulatory mechanisms. Utilization of microbial systems for triacylglycerol and fatty acid production is in its infancy; however, genomic information and technologies combined with synthetic biology concepts provide the opportunity to further exploit microbes for the competitive production of biodiesel.
Date: March 20, 2008
Creator: Lykidis, Athanasios & Ivanova, Natalia
Partner: UNT Libraries Government Documents Department

Hadronic Correlations and Fluctuations

Description: We will provide a review of some of the physics which can be addressed by studying fluctuations and correlations in heavy ion collisions. We will discuss Lattice QCD results on fluctuations and correlations and will put them into context with observables which have been measured in heavy-ion collisions. Special attention will be given to the QCD critical point and the first order co-existence region, and we will discuss how the measurement of fluctuations and correlations can help in an experimental search for non-trivial structures in the QCD phase diagram.
Date: October 9, 2008
Creator: Koch, Volker
Partner: UNT Libraries Government Documents Department

Collective phenomena in non-central nuclear collisions

Description: Recent developments in the field of anisotropic flow in nuclear collisions are reviewed. The results from the top AGS energy to the top RHIC energy are discussed with emphasis on techniques, interpretation, and uncertainties in the measurements.
Date: October 20, 2008
Creator: Voloshin, Sergei A.; Poskanzer, Arthur M. & Snellings, Raimond
Partner: UNT Libraries Government Documents Department

Chapter 8: Selective Stoichiometric and Catalytic Reactivity in the Confines of a Chiral Supramolecular Assembly

Description: Nature uses enzymes to activate otherwise unreactive compounds in remarkable ways. For example, DNases are capable of hydrolyzing phosphate diester bonds in DNA within seconds[1-3] - a reaction with an estimated half-life of 200 million years without an enzyme.[4] The fundamental features of enzyme catalysis have been much discussed over the last sixty years in an effort to explain the dramatic rate increases and high selectivities of enzymes. As early as 1946, Linus Pauling suggested that enzymes must preferentially recognize and stabilize the transition state over the ground state of a substrate.[5] Despite the intense study of enzymatic selectivity and ability to catalyze chemical reactions, the entire nature of enzyme-based catalysis is still poorly understood. For example, Houk and co-workers recently reported a survey of binding affinities in a wide variety of enzyme-ligand, enzyme-transition-state, and synthetic host-guest complexes and found that the average binding affinities were insufficient to generate many of the rate accelerations observed in biological systems.[6] Therefore, transition-state stabilization cannot be the sole contributor to the high reactivity and selectivity of enzymes, but rather, other forces must contribute to the activation of substrate molecules. Inspired by the efficiency and selectivity of Nature, synthetic chemists have admired the ability of enzymes to activate otherwise unreactive molecules in the confines of an active site. Although much less complex than the evolved active sites of enzymes, synthetic host molecules have been developed that can carry out complex reactions within their cavities.
While progress has been made toward highly efficient and selective reactivity inside synthetic hosts, the lofty goal of duplicating enzyme specificity remains unrealized.[7-9] Pioneered by Lehn, Cram, Pedersen, and Breslow, supramolecular chemistry has evolved well beyond the crown ethers and cryptands originally studied.[10-12] Despite the increased complexity of synthetic host molecules, most assembly conditions utilize self-assembly to form complex highly-symmetric structures from ...
Date: September 27, 2007
Creator: Pluth, Michael D.; Bergman, Robert G. & Raymond, Kenneth N.
Partner: UNT Libraries Government Documents Department

Chapter 9: Electronics

Description: Sophisticated front-end electronics are a key part of practically all modern radiation detector systems. This chapter introduces the basic principles and their implementation. Topics include signal acquisition, electronic noise, pulse shaping (analog and digital), and data readout techniques.
Date: December 19, 2006
Creator: Grupen, Claus & Shwartz, Boris A.
Partner: UNT Libraries Government Documents Department

Structural Genomics of Minimal Organisms: Pipeline and Results

Description: The initial objective of the Berkeley Structural Genomics Center was to obtain near-complete three-dimensional (3D) structural information for all soluble proteins of two minimal organisms, the closely related pathogens Mycoplasma genitalium and M. pneumoniae. The former has fewer than 500 genes and the latter fewer than 700 genes. A semiautomated structural genomics pipeline was set up, spanning target selection, cloning, expression, purification, and ultimately structure determination. At the time of this writing, structural information for more than 93 percent of all soluble proteins of M. genitalium is available. This chapter summarizes the approaches taken by the authors' center.
Date: September 14, 2007
Creator: Kim, Sung-Hou; Shin, Dong-Hae; Kim, Rosalind; Adams, Paul & Chandonia, John-Marc
Partner: UNT Libraries Government Documents Department

The CKM quark-mixing matrix

Description: No abstract prepared.
Date: April 1, 2006
Creator: Ceccucci, Augusto; Ligeti, Zoltan & Sakai, Yoshihide
Partner: UNT Libraries Government Documents Department

Macroscopic Modeling of Polymer-Electrolyte Membranes

Description: In this chapter, the various approaches for the macroscopic modeling of transport phenomena in polymer-electrolyte membranes are discussed. This includes general background and modeling methodologies, as well as exploration of the governing equations and some membrane-related topics of interest.
Date: April 1, 2007
Creator: Weber, A.Z. & Newman, J.
Partner: UNT Libraries Government Documents Department

Automated Structure Solution with the PHENIX Suite

Description: Significant time and effort are often required to solve and complete a macromolecular crystal structure. The development of automated computational methods for the analysis, solution and completion of crystallographic structures has the potential to produce minimally biased models in a short time without the need for manual intervention. The PHENIX software suite is a highly automated system for macromolecular structure determination that can rapidly arrive at an initial partial model of a structure without significant human intervention, given moderate resolution and good quality data. This achievement has been made possible by the development of new algorithms for structure determination, maximum-likelihood molecular replacement (PHASER), heavy-atom search (HySS), template and pattern-based automated model-building (RESOLVE, TEXTAL), automated macromolecular refinement (phenix.refine), and iterative model-building, density modification and refinement that can operate at moderate resolution (RESOLVE, AutoBuild). These algorithms are based on a highly integrated and comprehensive set of crystallographic libraries that have been built and made available to the community. The algorithms are tightly linked and made easily accessible to users through the PHENIX Wizards and the PHENIX GUI.
Date: June 9, 2008
Creator: Zwart, Peter H.; Afonine, Pavel; Grosse-Kunstleve, Ralf W.; Hung, Li-Wei; Ioerger, Tom R. et al.
Partner: UNT Libraries Government Documents Department

Philosophy of Mind and the Problem of Free Will in the Light of Quantum Mechanics

Description: Arguments pertaining to the mind-brain connection and to the physical effectiveness of our conscious choices have been presented in two recent books, one by John Searle, the other by Jaegwon Kim. These arguments are examined, and it is argued that the difficulties encountered arise from a defective understanding and application of a pertinent part of contemporary science, namely quantum mechanics.
Date: April 1, 2008
Creator: Stapp, Henry P.
Partner: UNT Libraries Government Documents Department

Topological Cacti: Visualizing Contour-based Statistics

Description: Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular their nesting behavior and topology - often represented in form of a contour tree - have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like number of contours or the nesting of contours, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as width of the cactus trunk and length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.
Date: May 26, 2011
Creator: Weber, Gunther H.; Bremer, Peer-Timo & Pascucci, Valerio
Partner: UNT Libraries Government Documents Department
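
The per-contour statistics idea in the abstract above can be sketched in a few lines. The example below is a hypothetical simplification, not the authors' implementation: it labels the connected components of a superlevel set on a tiny 2D grid and reports a voxel-count "volume" and mean field value per component, which is the kind of quantitative information a topological cactus would display:

```python
# Sketch: per-component statistics from the segmentation induced by a level set.
import numpy as np
from collections import deque

def label_components(mask):
    """4-connected component labeling of a 2D boolean mask via BFS."""
    labels = np.zeros(mask.shape, dtype=int)
    count = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                count += 1
                q = deque([(i, j)])
                labels[i, j] = count
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = count
                            q.append((ny, nx))
    return labels, count

field = np.array([[0, 0, 5, 5],
                  [0, 0, 5, 0],
                  [7, 0, 0, 0],
                  [7, 7, 0, 0]], dtype=float)

labels, n = label_components(field > 2)   # superlevel-set components at isovalue 2
for k in range(1, n + 1):
    sel = labels == k
    print(k, sel.sum(), field[sel].mean())  # per-contour "volume" and mean value
```

A real contour tree refines this by tracking how components merge and split across all isovalues; the same per-segment statistics are then accumulated per tree branch.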

Network Communication as a Service-Oriented Capability

Description: In widely distributed systems generally, and in science-oriented Grids in particular, software, CPU time, storage, etc., are treated as "services" -- they can be allocated and used with service guarantees that allow them to be integrated into systems that perform complex tasks. Network communication is currently not a service -- it is provided, in general, as a "best effort" capability with no guarantees and only statistical predictability. In order for Grids (and most types of systems with widely distributed components) to be successful in performing the sustained, complex tasks of large-scale science -- e.g., the multi-disciplinary simulation of next-generation climate modeling, and the management and analysis of the petabytes of data that will come from the next generation of scientific instruments (which is very soon for the LHC at CERN) -- networks must provide communication capability that is service-oriented: that is, it must be configurable, schedulable, predictable, and reliable. In order to accomplish this, the research and education network community is undertaking a strategy that involves changes in network architecture to support multiple classes of service; development and deployment of service-oriented communication services; and monitoring and reporting in a form that is directly useful to the application-oriented system so that it may adapt to communications failures. In this paper we describe ESnet's approach to each of these -- an approach that is part of an international community effort to have intra-distributed system communication be based on a service-oriented capability.
Date: January 8, 2008
Creator: Johnston, William; Johnston, William; Metzger, Joe; Collins, Michael; Burrescia, Joseph; Dart, Eli et al.
Partner: UNT Libraries Government Documents Department


Description: But a glance at the Livingston chart, Fig. 1, of accelerator particle energy as a function of time shows that the energy has steadily, exponentially, increased. Equally significant is the fact that this increase is the envelope of diverse technologies. If one is to stay on, or even near, the Livingston curve in future years then new acceleration techniques need to be developed. What are the new acceleration methods? In these two lectures I would like to sketch some of these new ideas. I am well aware that they will probably not result in high energy accelerators within this or the next decade, but conversely, it is likely that these ideas will form the basis for the accelerators of the next century. Anyway, the ideas are stimulating and suffice to show that accelerator physicists are not just 'engineers', but genuine scientists deserving to be welcomed into the company of high energy physicists. I believe that outsiders will find this field surprisingly fertile and certainly fun. To put it more personally, I very much enjoy working in this field and lecturing on it. There are a number of review articles which should be consulted for references to the original literature. In addition there are three books on the subject. Given this material, I feel free to not completely reference the material in the remainder of this article; consultation of the review articles and books will be adequate as an introduction to the literature, for references abound (hundreds are given). At last, by way of introduction, I should like to quote from the end of Ref. 2, for I think the remarks made there are most germane. Remember that the talk was addressed to accelerator physicists: 'Finally, it is often said, I think by physicists who are not well-informed, that accelerator builders ...
Date: July 1, 1984
Creator: Sessler, A.M.
Partner: UNT Libraries Government Documents Department


Description: High energy physics, perhaps more than any other branch of science, is driven by technology. It is not the development of theory, or consideration of what measurements to make, which are the driving elements in our science. Rather it is the development of new technology which is the pacing item. Thus it is the development of new techniques, new computers, and new materials which allows one to develop new detectors and new particle-handling devices. It is the latter, the accelerators, which are at the heart of the science. Without particle accelerators there would be, essentially, no high energy physics. In fact, the advances in high energy physics can be directly tied to the advances in particle accelerators. Looking terribly briefly, and restricting oneself to recent history, the Bevatron made possible the discovery of the anti-proton and many of the resonances; on the AGS were found the μ-neutrino, the J-particle and time reversal non-invariance; on Spear was found the ψ-particle; and, within the last year, the Z⁰ and W± were seen on the CERN SPS p-p̄ collider. Of course one could, and should, go on in much more detail with this survey, but I think there is no need. It is clear that as better acceleration techniques were developed more and more powerful machines were built which, as a result, allowed high energy physics to advance. What are these techniques? They are very sophisticated and ever-developing. The science is very extensive and many individuals devote their whole lives to accelerator physics. As high energy experimental physicists your professional lives will be dominated by the performance of 'the machine'; i.e., the accelerator. Primarily you will be frustrated by the fact that it doesn't perform better. Why not? In these lectures, six in all, you should receive some appreciation ...
Date: July 1, 1984
Creator: Sessler, A.M.
Partner: UNT Libraries Government Documents Department

Metabolic engineering of E. coli for the production of a precursor to artemisinin, an anti-malarial drug

Description: This document is Chapter 25 in the Manual of Industrial Microbiology and Biotechnology, 3rd edition. Topics covered include: Incorporation of Amorpha-4,11-Diene Biosynthetic Pathway into E. coli; Amorpha-4,11-Diene Pathway Optimization; "-Omics" Analyses for Increased Amorpha-4,11-Diene Production; Biosynthetic Oxidation of Amorpha-4,11-Diene.
Date: July 18, 2011
Creator: Petzold, Christopher & Keasling, Jay
Partner: UNT Libraries Government Documents Department