
Interactive Design Center.

Description: Sandia's advanced computing resources provide researchers, engineers and analysts with the ability to develop and render highly detailed large-scale models and simulations. To take full advantage of these multi-million data point visualizations, display systems with comparable pixel counts are needed. The Interactive Design Center (IDC) is a second generation visualization theater designed to meet this need. The main display integrates twenty-seven projectors in a 9-wide by 3-high array with a total display resolution of more than 35 million pixels. Six individual SmartBoard displays offer interactive capabilities that include on-screen annotation and touch panel control of the facility's display systems. This report details the design, implementation and operation of this innovative facility.
Date: July 1, 2005
Creator: Pomplun, Alan R. (Sandia National Laboratories, Livermore, CA)
Partner: UNT Libraries Government Documents Department

The development of a spreadsheet-aided-engineering design tool for parachutes

Description: A spreadsheet-aided engineering design tool has been developed to assist in the parachute design process. The new tool was developed during FY96 and utilized in the design of the flight termination parachute system for a 1900 lb. payload. Many modifications were made during the initial utilization of this tool. Work on the tool continues as the authors attempt to create an application tool for the parachute engineer.
Date: April 1, 1997
Creator: Waye, D.E. & Whinery, L.D.
Partner: UNT Libraries Government Documents Department

Synthesis of logic circuits with evolutionary algorithms

Description: In the last decade there has been interest and research in the area of designing circuits with genetic algorithms, evolutionary algorithms, and genetic programming. However, the ability to design circuits of the size and complexity required by modern engineering design problems, simply by specifying required outputs for given inputs, has so far eluded researchers. This paper describes current research in the area of designing logic circuits using an evolutionary algorithm. The goal of the research is to improve the effectiveness of this method and make it a practical aid for design engineers. A novel method of implementing the algorithm is introduced, and results are presented for various multiprocessing systems. In addition to evolving standard arithmetic circuits, work in the area of evolving circuits that perform digital signal processing tasks is described.
Date: January 26, 2000
Partner: UNT Libraries Government Documents Department
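The abstract above can be illustrated with a toy version of the idea: a (1+1) evolutionary algorithm that evolves a small feed-forward gate network until it matches a target truth table (XOR here). The gate set, circuit size, and restart schedule are illustrative choices, not details taken from the paper.

```python
import random

OPS = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "NAND": lambda a, b: 1 - (a & b),
    "NOR":  lambda a, b: 1 - (a | b),
}
N_GATES = 3
TARGET = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}  # XOR truth table

def random_gate(idx):
    # a gate may read the two primary inputs (signals 0, 1) or any earlier gate
    span = 2 + idx
    return (random.choice(list(OPS)), random.randrange(span), random.randrange(span))

def evaluate(circuit, a, b):
    signals = [a, b]
    for op, i, j in circuit:
        signals.append(OPS[op](signals[i], signals[j]))
    return signals[-1]  # the last gate is the circuit output

def fitness(circuit):
    # number of input combinations on which the circuit matches the target
    return sum(evaluate(circuit, a, b) == out for (a, b), out in TARGET.items())

def evolve():
    # (1+1) EA with restarts: mutate one gate, keep the child when it is
    # no worse, and restart from scratch if a run stalls
    while True:
        circuit = [random_gate(i) for i in range(N_GATES)]
        best = fitness(circuit)
        for _ in range(2000):
            if best == len(TARGET):
                return circuit
            k = random.randrange(N_GATES)
            child = list(circuit)
            child[k] = random_gate(k)
            f = fitness(child)
            if f >= best:
                circuit, best = child, f

xor_circuit = evolve()
print([evaluate(xor_circuit, a, b) for a, b in sorted(TARGET)])  # [0, 1, 1, 0]
```

Specifying only the required outputs for given inputs, as the abstract describes, corresponds here to defining nothing but `TARGET`; the gate-level structure is discovered by the search.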

Model building techniques for analysis.

Description: The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort, and the turnaround time for analysis results must be decreased for analysis to have an impact on overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques, which come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.
Date: September 1, 2009
Creator: Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean et al.
Partner: UNT Libraries Government Documents Department

A methodology for selecting an optimal experimental design for the computer analysis of a complex system

Description: Investigation and evaluation of a complex system is often accomplished through the use of performance measures based on system response models. The response models are constructed using computer-generated responses supported where possible by physical test results. The general problem considered is one where resources and system complexity together restrict the number of simulations that can be performed. The levels of input variables used for defining environmental scenarios, setting initial and boundary conditions, and setting system parameters must be selected in an efficient way. This report describes an algorithmic approach for performing this selection.
Date: February 3, 2000
Partner: UNT Libraries Government Documents Department
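The abstract does not reproduce the report's selection algorithm, but a standard way to spread a restricted number of runs over many input variables is Latin hypercube sampling, shown below as a generic sketch rather than the report's method: each variable's range is split into equal-probability bins and every bin is sampled exactly once.

```python
import numpy as np

def latin_hypercube(n_runs, n_vars, rng=None):
    # one stratified sample per variable: each of the n_runs
    # equal-probability bins is hit exactly once in every dimension
    rng = np.random.default_rng(rng)
    sample = np.empty((n_runs, n_vars))
    for j in range(n_vars):
        perm = rng.permutation(n_runs)           # which bin each run uses
        sample[:, j] = (perm + rng.random(n_runs)) / n_runs
    return sample  # points in the unit hypercube [0, 1)^n_vars

design = latin_hypercube(8, 3, rng=0)
print(design.shape)  # (8, 3)
```

Each row is one simulation run; scaling a column by a variable's physical range gives the level settings for that variable.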

A Resampling Based Approach to Optimal Experimental Design for Computer Analysis of a Complex System

Description: The investigation of a complex system is often performed using computer-generated response data supplemented by system and component test results where possible. Analysts rely on an efficient use of limited experimental resources to test the physical system, to evaluate the models, and to assure (to the extent possible) that the models accurately simulate the system under investigation. The general problem considered here is one where only a restricted number of system simulations (or physical tests) can be performed to provide additional data necessary to accomplish the project objectives. The levels of variables used for defining input scenarios, for setting system parameters and for initializing other experimental options must be selected in an efficient way. The use of computer algorithms to support experimental design in complex problems has been a topic of recent research in the areas of statistics and engineering. This paper describes a resampling-based approach to formulating this design. An example is provided illustrating in two dimensions how the algorithm works and indicating its potential on larger problems. The results show that the proposed approach exhibits, on these simple examples, the characteristics desired of an algorithmic approach. Further experimentation is needed to evaluate its performance on larger problems.
Date: August 4, 1999
Creator: Rutherford, Brian
Partner: UNT Libraries Government Documents Department
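The paper's actual resampling algorithm is not given in the abstract; as a stand-in for the general idea of algorithmically choosing a small set of runs, here is a simple randomized search for a space-filling (maximin-distance) design over a candidate grid. The criterion and the search scheme are illustrative assumptions, not the paper's method.

```python
import random

def maximin_design(candidates, n_runs, n_trials=2000, rng=None):
    # repeatedly draw candidate designs at random and keep the one whose
    # closest pair of points is farthest apart (a space-filling criterion)
    rng = random.Random(rng)

    def min_pairwise(design):
        return min(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
                   for i, p in enumerate(design) for q in design[i + 1:])

    best, best_score = None, -1.0
    for _ in range(n_trials):
        design = rng.sample(candidates, n_runs)   # distinct candidate points
        score = min_pairwise(design)
        if score > best_score:
            best, best_score = design, score
    return best

# candidate levels on a 5x5 grid over two input variables
grid = [(x / 4, y / 4) for x in range(5) for y in range(5)]
design = maximin_design(grid, 4, rng=0)
print(len(design))  # 4 selected runs
```

A two-dimensional example like this mirrors the paper's illustration: the selected points are the input-variable settings for the restricted budget of simulations.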

Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

Description: Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry.
Date: August 23, 2005
Creator: Schwarz, Randolph; Carter, Leland L. & Schwarz, Alysia
Partner: UNT Libraries Government Documents Department

DC 12m telescope. Preliminary calculations. Investigation of elevation axis position.

Description: This paper examines some simple calculations on a 2D model of a telescope in order to understand how different design parameters affect the design. The telescope design is assumed to need to minimize deflections of the dish while also minimizing the size of the motors and the torques needed to rotate in elevation. A common belief is that a lighter dish and a minimum counterweight are desirable. However, these calculations show this is not necessarily true. The torque needed for rotation depends on the moment of inertia and on whether the telescope is balanced about the elevation axis. A light dish with no counterweight requires that the elevation axis be several meters (8-9 m) in front of the dish in order to be balanced. This is not practical from a structural point of view. If the elevation axis is only 2 m in front of the dish and there is no counterweight, then the telescope will be unbalanced and the torques required will be very high - much higher than the torques needed only to overcome inertia. A heavy dish, though, can act as its own counterweight, and the elevation axis only has to be 2-3 m in front of the dish in order to achieve a balanced telescope. Also, the struts that support the camera from the dish place a load on the dish which will put a bending moment on the dish. This bending moment will deform the dish and require it to be stiffer. A counterweight structure performs two functions. First, it allows the telescope to be balanced about the elevation axis. Second, it applies a force on the dish that opposes the forces from the camera struts, thereby reducing the bending moment and deformations of the dish.
Date: December 18, 2009
Creator: Guarino, V. J. & Physics, High Energy
Partner: UNT Libraries Government Documents Department
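The balance argument above reduces to a one-dimensional moment calculation: for zero static torque, the elevation axis must pass through the combined center of mass. A sketch with hypothetical masses, chosen only to echo the abstract's 8-9 m and 2-3 m figures and not taken from the report:

```python
def balance_axis(masses, positions):
    # zero net gravity torque about the elevation axis requires the axis
    # to pass through the combined center of mass; positions are measured
    # in metres along the optical axis, in front of the dish surface
    return sum(m * p for m, p in zip(masses, positions)) / sum(masses)

camera = (2000.0, 10.0)    # kg at the end of the struts (hypothetical)
light_dish = (300.0, 0.0)  # hypothetical light dish at the origin
heavy_dish = (8000.0, 0.0) # hypothetical heavy dish at the origin

# light dish, no counterweight: the balance axis sits far in front of the dish
print(round(balance_axis([light_dish[0], camera[0]],
                         [light_dish[1], camera[1]]), 2))  # 8.7
# heavy dish acting as its own counterweight: the axis moves close to the dish
print(round(balance_axis([heavy_dish[0], camera[0]],
                         [heavy_dish[1], camera[1]]), 2))  # 2.0
```

Adding a counterweight behind the dish (a negative position) pulls the balance axis back the same way increasing the dish mass does, which is the trade-off the report explores.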

JC Penney Retail Renovation, June 2011

Description: JC Penney is a partner with the DOE's Commercial Building Partnerships (CBP) program, working with PNNL to explore energy design measures (EDMs) that may be applied to their building portfolio. A site in Colonial Heights, VA was chosen for a retrofit project; computer modeling predicts 45% improved energy performance compared to baseline operations. This case study reviews EDMs that were selected and their performance as of June 2011.
Date: June 30, 2011
Creator: Baechler, Michael C.; Rosenberg, Michael I.; Zhang, Jian; Ruiz, Kathleen A. & Wilburn, Matthew S.
Partner: UNT Libraries Government Documents Department

Rapid Risk Assessment: FY05 Annual Summary Report

Description: The Pacific Northwest National Laboratory (PNNL) is developing decision support tools that will assist in the transition of incident information into Protective Action Recommendations (PARs) that are understandable and can be executed in a real-world, operational environment. During emergencies, responders must rapidly assess risks and decide on the best course of action—all within minutes to hours. PNNL is blending existing modeling and decision support technology to develop new methods for transitioning science-based threat assessment to PARs. The rapid risk assessment tool will be both understandable and applicable to the emergency management community and would be a valuable tool during any water security-related incident. In 2005, PNNL demonstrated the integration of the multi-thematic modeling with emergency management decision support tools to create a Rapid Risk Assessment (RRA) tool that will transition risk to PARs that assist in responding to or mitigating the direct and indirect impacts of the incident(s). The RRA tool does this by aligning multi-thematic modeling capabilities with real-world response zones established by emergency and site operations managers. The RRA tool uses the risk assessment tool to drive prognostic models that use the type of incident, time of impact, severity of impact, and duration of impact to select the most appropriate PAR. Because PARs (and the thresholds by which they are selected) are jointly established by the technologists and the emergency management and operations decision makers, the science-based risk assessment can transition into a recommendation that can be understood and executed by people in the field.
Date: March 6, 2006
Creator: Whelan, Gene; Millard, W. David; Gelston, Gariann M.; Pelton, Mitch A.; Yang, Zhaoqing; Strenge, Dennis L. et al.
Partner: UNT Libraries Government Documents Department

Automated mask creation from a 3D model using Faethm.

Description: We have developed and implemented a method which, given a three-dimensional object, can infer from its topology the two-dimensional masks needed to produce that object with surface micro-machining. The masks produced by this design tool can be generic, process-independent masks, or, if given process constraints, specific for a target process. This design tool calculates the two-dimensional mask set required to produce a given three-dimensional model by investigating the vertical topology of the model.
Date: November 1, 2007
Creator: Schiek, Richard Louis & Schmidt, Rodney Cannon
Partner: UNT Libraries Government Documents Department
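The tool described above operates on CAD solids; a minimal voxel analogue of "inferring masks from vertical topology" can be sketched as follows: scan the model bottom-up and emit a new mask wherever the horizontal cross-section changes. This is an illustration of the idea, not Faethm's algorithm.

```python
import numpy as np

def masks_from_voxels(model):
    # model: boolean array indexed [z, y, x]; emit one 2D mask per run of
    # identical horizontal cross-sections, scanning bottom-up
    masks = []
    for layer in model:
        if not masks or not np.array_equal(layer, masks[-1]):
            masks.append(layer.copy())
    return masks

# hypothetical part: a full 4x4 base two layers tall with a 2x2 post on top
model = np.zeros((4, 4, 4), dtype=bool)
model[:2, :, :] = True        # base layers
model[2:, 1:3, 1:3] = True    # post layers
masks = masks_from_voxels(model)
print(len(masks))  # 2 distinct masks: the base and the post
```

A process-aware version would additionally check each mask against layer-thickness and overhang constraints before accepting it, which is roughly what "specific for a target process" implies.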

A Framework to Design and Optimize Chemical Flooding Processes

Description: The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to integrate the UTCHEM reservoir simulator with the modules, apply the strategic variables, and develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.
Date: August 31, 2006
Creator: Delshad, Mojdeh; Pope, Gary A. & Sepehrnoori, Kamy
Partner: UNT Libraries Government Documents Department

SIAM Conference on Geometric Design and Computing. Final Technical Report

Description: The SIAM Conference on Geometric Design and Computing attracted 164 domestic and international researchers from academia, industry, and government. It provided a stimulating forum in which to learn about the latest developments, to discuss exciting new research directions, and to forge stronger ties between theory and applications.
Date: March 11, 2002
Partner: UNT Libraries Government Documents Department

Free form fabrication of metallic components using laser engineered net shaping (LENS™)

Description: Solid free form fabrication is one of the fastest growing automated manufacturing technologies that has significantly impacted the length of time between initial concept and actual part fabrication. Starting with CAD renditions of new components, several techniques such as stereolithography and selective laser sintering are being used to fabricate highly accurate complex three-dimensional concept models using polymeric materials. Coupled with investment casting techniques, sacrificial polymeric objects are used to minimize costs and time to fabricate tooling used to make complex metal castings. This paper will describe recent developments in a new technology, known as LENS™ (Laser Engineered Net Shaping), to fabricate metal components directly from CAD solid models and thus further reduce the lead times for metal part fabrication. In a manner analogous to stereolithography or selective sintering, the LENS™ process builds metal parts line by line and layer by layer. Metal particles are injected into a laser beam, where they are melted and deposited onto a substrate as a miniature weld pool. The trace of the laser beam on the substrate is driven by the definition of CAD models until the desired net-shaped densified metal component is produced.
Date: September 1, 1996
Creator: Griffith, M.L.; Keicher, D.M. & Atwood, C.L.
Partner: UNT Libraries Government Documents Department

The development of a flight termination parachute system for a 1900 lb payload

Description: A 30-ft-diameter ringslot/solid parachute was designed, developed, and tested at Sandia National Laboratories as the major component of a flight termination system required for a 1900-lb gliding delivery platform. Four full-scale sled tests were performed to validate the design models of the parachute, determine reefing line length, demonstrate structural adequacy of the parachute materials, and demonstrate that performance met the design requirements.
Date: April 1, 1997
Creator: Waye, D.E.
Partner: UNT Libraries Government Documents Department

Approaches to instrument design at pulsed neutron sources

Description: A number of tools are used in the design of scattering instruments for pulsed neutron sources. Initial design is based largely on simple analytical calculations. More complicated analytical calculations and Monte Carlo simulations come into play as the design is optimized to maximize the data rate and to improve the data quality. Examples are used to illustrate the relative roles of these different computational tools. Areas are also identified where appropriate computational tools are currently lacking.
Date: April 14, 1997
Creator: Crawford, R.K.
Partner: UNT Libraries Government Documents Department
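To give a flavor of the Monte Carlo side of such instrument optimization, here is a minimal transmission estimate through a uniform absorber, checked against the analytic exp(-Σt) result. The cross-section and thickness values are arbitrary illustration numbers, and real instrument simulations track far more physics than this.

```python
import math
import random

def mc_transmission(sigma, thickness, n=100_000, rng=None):
    # sample exponential free paths with mean 1/sigma; a neutron is
    # transmitted when its first collision would occur beyond the slab
    r = random.Random(rng)
    hits = sum(-math.log(1.0 - r.random()) / sigma > thickness
               for _ in range(n))
    return hits / n

analytic = math.exp(-0.5 * 2.0)             # Sigma * t = 1.0
estimate = mc_transmission(0.5, 2.0, rng=1)
print(abs(estimate - analytic) < 0.01)      # MC estimate agrees to ~1%
```

Simple closed-form estimates like the analytic factor above correspond to the "initial design" stage the abstract mentions; the Monte Carlo version becomes necessary once geometry and scattering make closed forms intractable.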

Shortest Path Planning for a Tethered Robot or an Anchored Cable

Description: We consider the problem of planning shortest paths for a tethered robot with a finite-length tether in a 2D environment with polygonal obstacles. We present an algorithm that runs in time O((k₁ + 1)²n⁴) and finds the shortest path obeying the constraints, or correctly determines that none exists; here n is the number of obstacle vertices, and k₁ is the number of loops in the initial configuration of the tether. The robot may cross its tether, but nothing can cross the obstacles, which cause the tether to bend. The algorithm applies as well to planning a shortest path for the free end of an anchored cable.
Date: February 22, 1999
Creator: Xavier, P.G.
Partner: UNT Libraries Government Documents Department

Density effect of inspection data points in as-built modeling of parts

Description: At Los Alamos National Laboratory, the use of inspection data generated at various stages of the life cycle of a product is being investigated in a feedback process to the design engineers and physicists. This data will be used to determine through analysis how to optimize assembly, mitigate nominal deviations, and confront aging issues. This as-built engineering philosophy characterizes a system through the topographical data generated through inspection. Through intricate modeling techniques, the topographical definition gives rise to a solid model in a Computer Aided Engineering (CAE) software package such as Parametric Technologies Pro/ENGINEER™. Once a solid model has been built, the definition can be used for a variety of analytical purposes including mass property calculations, finite element model generation, and virtual environment generation. A strictly analytical approach was used to exercise the as-built engineering method in characterizing components. A hypothetical component was used and mass properties were calculated analytically to provide nominal definition. This was then compared to mass properties calculated as a result of modeling theoretical inspection data in two formats: manually collected data, such as that obtained from a Coordinate Measuring Machine (specifically the Brown & Sharpe) inspection process, and automatically collected data, such as that obtained from a Sheffield inspection process. Mass properties calculated from a solid model generated using the Pro/ENGINEER™ modeling operations were also compared with the nominal definition.
Date: December 31, 1998
Creator: Hefele, J. & Dolin, R.M.
Partner: UNT Libraries Government Documents Department

Virtual reality visualization of accelerator magnets

Description: The authors describe the use of the CAVE virtual reality visualization environment as an aid to the design of accelerator magnets. They have modeled an elliptical multipole wiggler magnet being designed for use at the Advanced Photon Source at Argonne National Laboratory. The CAVE environment allows the authors to explore and interact with the 3-D visualization of the magnet. Capabilities include changing the number of magnet periods displayed, changing the icons used for displaying the magnetic field, and changing the current in the electromagnet and observing the effect on the magnetic field and particle beam trajectory through the field.
Date: May 1, 1995
Creator: Huang, M.; Papka, M.; DeFanti, T.; Levine, D.; Turner, L. & Kettunen, L.
Partner: UNT Libraries Government Documents Department

Tolerance analysis and variational solid geometry

Description: The fields of tolerancing and assembly analysis have depended for decades on ad hoc, shop floor methods. This causes serious problems when toleranced designs are subjected to automated, analytical methods. This project attempted to further the formalization and mathematization of tolerancing by extending the concept of the Maximum Material Part. A software system was envisioned that would guide designers in the use of appropriate tolerance specifications and then create software models of Maximum Material Parts from the toleranced nominal parts.
Date: January 1, 1998
Creator: Watterberg, P.
Partner: UNT Libraries Government Documents Department

Designing double-gap linear accelerators for a wide mass range

Description: For applications like ion implantation, rf linacs using double-gap structures with external resonators can be used because they are practical at low frequencies. However, since the two gaps associated with a given resonator cannot be individually phased, it is not obvious how to build a linac that can efficiently accelerate particles having different mass/charge ratios. This paper describes the beam dynamics of double-gap rf linacs and shows how to maximize the range of mass/charge ratios. The theory also tells one how to rescale a linac tune (i.e., reset the voltages and phases) so that a new particle, having a different mass or charge, will behave similarly to the original particle.
Date: December 31, 1998
Creator: Lysenko, W.P.; Wadlinger, E.A.; Rusnak, B.; Krawczyk, F.; Saadatmand, K. & Wan, Z.
Partner: UNT Libraries Government Documents Department

Rapid space hardware development through computer-automated testing

Description: FORTE, the Fast On-Orbit Recording of Transient Events small satellite designed and built by Los Alamos and Sandia National Laboratories, is scheduled for launch in August 1997. In the spirit of "better, cheaper, faster" satellites, the RF experiment hardware (receiver and trigger sub-systems) necessitated rapid prototype testing and characterization in the development of space-flight components. This was accomplished with the assembly of engineering model hardware prior to construction of flight hardware and the design of component-specific, PC-based software control libraries. Using the LabVIEW® graphical programming language, together with off-the-shelf PC digital I/O and GPIB interface cards, hardware control and complete automation of test equipment was possible from one PC. Because the receiver and trigger sub-systems employed complex functions for signal discrimination and transient detection, thorough validation of all functions and illumination of any faults were priorities. These methods were successful in accelerating the development and characterization of space-flight components prior to integration and allowed more complete data to be gathered than could have been accomplished without automation. Additionally, automated control of input signal sources was carried over from bench-level to system-level using a networked Linux workstation with a GPIB interface.
Date: October 1, 1997
Creator: Masters, D.S. & Ruud, K.K.
Partner: UNT Libraries Government Documents Department