Search Results

KENO IV: an improved Monte Carlo criticality program

Description: KENO IV is a multigroup Monte Carlo criticality program written for IBM 360 computers. It executes rapidly and is flexibly dimensioned, so the allowed size of a problem (i.e., the number of energy groups, number of geometry cards, etc.) is limited only by the total data storage required. The input data, with the exception of cross sections, fission spectra, and albedos, may be entered in free form. The geometry input is simple to prepare, and complicated three-dimensional systems can often be described with a minimum of effort. The results calculated by KENO IV include k-effective, lifetime and generation time, energy-dependent leakages and absorptions, energy- and region-dependent fluxes, and region-dependent fission densities. Criticality searches can be made on unit dimensions or on the number of units in an array. A summary of the theory utilized by KENO IV, a section describing the logical program flow, a compilation of the error messages printed by the code, and a comprehensive data guide for preparing input to the code are presented. 14 references. (auth)
Date: November 1, 1975
Creator: Petrie, L.M. & Cross, N.F.
Partner: UNT Libraries Government Documents Department
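
As an illustrative aside for readers new to Monte Carlo criticality codes, the sketch below shows the generation-based k-effective estimate such programs report: each generation tracks a batch of neutron histories and scores fission production per source neutron. This is a toy model, not KENO IV; the one-group constants (NU, SIGMA_F, SIGMA_A) and the non-leakage probability P_NONLEAK are invented placeholders standing in for the multigroup data and geometry tracking a real code performs.

```python
# Minimal generation-based Monte Carlo k-eff sketch (toy model, not KENO IV).
import random

NU = 2.43         # neutrons per fission (illustrative)
SIGMA_F = 0.05    # macroscopic fission cross section, 1/cm (illustrative)
SIGMA_A = 0.10    # macroscopic absorption cross section, 1/cm (illustrative)
P_NONLEAK = 0.95  # crude stand-in for the geometry/leakage treatment

def run_generation(n_histories, rng):
    """Track one generation; return its k-eff estimate
    (fission neutrons produced per source neutron)."""
    fission_neutrons = 0.0
    for _ in range(n_histories):
        if rng.random() < P_NONLEAK:             # history stays in the system
            # absorbed: score expected fission production (implicit capture)
            fission_neutrons += NU * SIGMA_F / SIGMA_A
    return fission_neutrons / n_histories

rng = random.Random(1)
keff_by_gen = [run_generation(10_000, rng) for _ in range(100)]
active = keff_by_gen[10:]   # skip early generations while the source settles
print(sum(active) / len(active))
```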

Assessment of computational performance in nuclear criticality

Description: This report presents the results of a study undertaken to resolve the long-standing discrepancies between calculations and experiments involving arrays of fissile solution units. Room return was found to be sufficient to account for the discrepancy for some bare arrays, but reflected arrays are still in disagreement, and the magnitude of the room return raises other unresolved issues.
Date: January 1, 1985
Creator: Petrie, L.M. & Thomas, J.T.
Partner: UNT Libraries Government Documents Department

Standardized safety analysis of nuclear fuel shipping containers

Description: The objective of this effort is a modular system of computer programs called SCALE. (SCALE is an acronym for Standardized Computer Analyses for Licensing Evaluation.) The anticipated NRC applications and design criteria for the SCALE system are described in a companion paper by R.H. Odegaarden. The purpose of the present paper is to describe the components and capabilities of the initial version of the SCALE system, with emphasis placed on those aspects of the system that lead to analytical standardization. The SCALE system draws heavily on basic neutron-transport, data-processing, and heat-transfer methods developed at Oak Ridge over the past several years. The data processing is a direct outgrowth of that employed in AMPX, a modular code system for processing coupled neutron-photon cross sections from ENDF/B. Modified versions of the AMPX problem-dependent data-processing modules NITAWL and XSDRNPM are incorporated into SCALE. However, even though some of the functions performed in AMPX and SCALE are the same, the overall purpose and organizational structure of SCALE are substantially different from those of AMPX.
Date: January 1, 1978
Creator: Westfall, R.M. & Petrie, L.M.
Partner: UNT Libraries Government Documents Department

Shipping cask criticality analysis utilizing combinatorial geometry with KENO-IV

Description: KENO-IV/CG represents an important step forward in geometric modeling capability for criticality analysis. With the merging of the repeating-cell feature of KENO geometry and the detailed modeling ability and simplified input specifications of combinatorial geometry, many geometric approximations required for previous criticality calculations are no longer necessary. Many of the features in KENO-IV/CG also lend themselves to further development, advancing the state of the art for Monte Carlo criticality analysis. All of the problems analyzed with KENO-IV/CG at ORNL have agreed well with experimental data where results are available. KENO-IV/CG provides industry with a very powerful tool for accurately modeling very complex geometries.
Date: January 1, 1978
Creator: West, J.T. & Petrie, L.M.
Partner: UNT Libraries Government Documents Department
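
A toy illustration of the combinatorial-geometry idea the abstract describes: regions are Boolean combinations (union, intersection, difference) of primitive bodies, and the tracker classifies a point by evaluating that expression. The Cylinder body and the cask-like dimensions are hypothetical, not KENO-IV/CG input syntax.

```python
# Point classification against regions built from primitive bodies
# (a sketch of the combinatorial-geometry concept, not KENO-IV/CG itself).
from dataclasses import dataclass

@dataclass
class Cylinder:
    r: float      # radius, cm
    zmin: float   # lower axial bound, cm
    zmax: float   # upper axial bound, cm

    def contains(self, x, y, z):
        return x * x + y * y <= self.r * self.r and self.zmin <= z <= self.zmax

cavity = Cylinder(r=20.0, zmin=-50.0, zmax=50.0)   # hypothetical cask cavity
shield = Cylinder(r=35.0, zmin=-65.0, zmax=65.0)   # hypothetical shield body

def region_of(x, y, z):
    """Shield region = shield body intersected with NOT cavity."""
    if cavity.contains(x, y, z):
        return "cavity"
    if shield.contains(x, y, z):
        return "shield"
    return "outside"

print(region_of(0, 0, 0), region_of(30, 0, 0), region_of(0, 0, 100))
```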

Linear filtering applied to Monte Carlo criticality calculations

Description: A significant improvement in the acceleration of the convergence of the eigenvalue computed by Monte Carlo techniques has been developed by applying linear filtering theory to Monte Carlo calculations for multiplying systems. A Kalman filter was applied to a KENO Monte Carlo calculation of an experimental critical system consisting of eight interacting units of fissile material. A comparison of the filter estimate and the Monte Carlo realization was made. The Kalman filter converged in five iterations to 0.9977. After 95 iterations, the average k-eff from the Monte Carlo calculation was 0.9981. This demonstrates that the Kalman filter has the potential to reduce the calculational effort required for multiplying systems. Other examples and results are discussed. (auth)
Date: January 1, 1975
Creator: Morrison, G.W.; Pike, D.H. & Petrie, L.M.
Partner: UNT Libraries Government Documents Department
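
A minimal sketch of the filtering idea in the abstract: treat the true k-eff as a constant state observed through noisy per-generation Monte Carlo estimates and apply a scalar Kalman filter. The synthetic data and noise level are invented; this is not the authors' exact formulation.

```python
# Scalar Kalman filter over noisy generation k-eff estimates (illustrative).
import random

def kalman_keff(observations, obs_var):
    x, p = observations[0], obs_var   # initial state estimate and its variance
    for z in observations[1:]:
        # constant-state model: the predict step leaves (x, p) unchanged
        gain = p / (p + obs_var)      # Kalman gain
        x = x + gain * (z - x)        # update with the new generation estimate
        p = (1.0 - gain) * p
    return x, p

rng = random.Random(2)
true_k, sigma = 0.998, 0.005
obs = [rng.gauss(true_k, sigma) for _ in range(20)]
estimate, variance = kalman_keff(obs, sigma ** 2)
print(f"filtered k-eff = {estimate:.4f} +/- {variance ** 0.5:.4f}")
```

With a constant-state model and fixed observation variance, the filter reduces to a recursively computed mean with shrinking variance, which is the intuition behind the rapid convergence reported above.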

Assurances associated with Monte Carlo code results

Description: All Monte Carlo computer codes have an uncertainty associated with the final result. This uncertainty (or standard deviation) is due to the sampling method inherent in the Monte Carlo technique. The basic assumptions required for the final result and uncertainty to be valid are (1) the random numbers used are truly random, (2) there is no correlation between histories, (3) the number of histories used is sufficient to represent the problem, and (4) the entire problem is adequately sampled. The first two assumptions are an integral part of the code itself; the last two are strongly dependent on how a problem is set up and the number of histories processed, items over which the user has direct control. This paper examines six aspects of the KENO Monte Carlo code that affect the above-mentioned four assumptions.
Date: June 1, 1995
Creator: Hollenbach, D.F. & Petrie, L.M.
Partner: UNT Libraries Government Documents Department
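
As an illustrative companion to assumptions (2) and (3), the sketch below computes the standard deviation of the mean k-eff and the lag-1 autocorrelation between successive generation estimates; positive correlation between generations makes the quoted uncertainty too small. The generation data here are synthetic.

```python
# Checking Monte Carlo uncertainty assumptions on synthetic k-eff data.
import random
import statistics

rng = random.Random(3)
keff = [rng.gauss(0.995, 0.004) for _ in range(200)]   # generation estimates

mean = statistics.fmean(keff)
sdev_mean = statistics.stdev(keff) / len(keff) ** 0.5  # std dev of the mean

n = len(keff)
num = sum((keff[i] - mean) * (keff[i + 1] - mean) for i in range(n - 1))
den = sum((k - mean) ** 2 for k in keff)
lag1 = num / den                                       # ~0 if uncorrelated

print(f"k-eff = {mean:.4f} +/- {sdev_mean:.4f}, lag-1 autocorr = {lag1:+.3f}")
```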

Review of PGDP assessment of criticality safety problems in increasing product assay to 5 wt % ²³⁵U

Description: Paducah Gaseous Diffusion Plant (PGDP) performed an evaluation of the PGDP facilities to determine the feasibility of increasing product assay from 2.0 wt % to 5.0 wt % ²³⁵U and to determine the impact of this increase on plant criticality safety; their conclusions are reported in KY-710. This report critiques the methods used and conclusions reached in KY-710. 4 figures, 5 tables.
Date: February 1, 1985
Creator: Petrie, L.M.; Turner, J.C. & Stewart, G.B.
Partner: UNT Libraries Government Documents Department

Uncertainties associated with the use of the KENO Monte Carlo criticality codes

Description: The KENO multigroup Monte Carlo criticality codes have earned a reputation as efficient, user-friendly tools especially suited to the analysis of situations commonly encountered in the storage and transportation of fissile materials. Throughout their twenty years of service, a continuing effort has been made to maintain and improve these codes to meet the needs of the nuclear criticality safety community. Foremost among these needs is the knowledge of how to utilize the results safely and effectively. It is therefore important that code users be aware of uncertainties that may affect their results. These uncertainties originate from approximations in the problem data, from the methods used to process cross sections, and from assumptions, limitations, and approximations within the criticality computer code itself. 6 refs., 8 figs., 1 tab.
Date: January 1, 1989
Creator: Landers, N.F. & Petrie, L.M. (Oak Ridge National Lab., TN (USA))
Partner: UNT Libraries Government Documents Department

Improved criticality search techniques for low and high enriched systems

Description: A new automated search technique has been developed to improve the computational efficiency of performing criticality searches on low and high enriched systems with codes such as ANISN and KENO-IV. The technique employs a least-squares fit of a cubic polynomial to parameter values that have been previously generated either by the Extended Mean Value Theorem (EMVT) or by previous curve fits. Solving the cubic for its roots at the desired value of k-effective completes one pass of the fixed-value search, while solving its derivative provides information about maximum values. This new search technique has been implemented in a FORTRAN routine called OPTMIZ, which will eventually be part of a module in the SCALE system.
Date: January 1, 1979
Creator: Lorek, M.J.; Dodds, H.L.; Petrie, L.M. & Westfall, R.M.
Partner: UNT Libraries Government Documents Department
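
A sketch of the fitting step the abstract describes: least-squares fit a cubic to previously generated (parameter, k-effective) pairs, then solve the cubic for the parameter value that gives the desired k-effective. This uses numpy rather than the FORTRAN OPTMIZ routine, and the sample points are invented for illustration.

```python
# Cubic least-squares criticality search step (illustrative, not OPTMIZ).
import numpy as np

# previously computed (dimension, k-eff) pairs, e.g. from ANISN or KENO-IV
x = np.array([10.0, 12.0, 14.0, 16.0, 18.0])   # invented sample points
k = np.array([0.82, 0.90, 0.97, 1.03, 1.08])

target = 1.0
coeffs = np.polyfit(x, k, 3)     # least-squares cubic k(x), highest power first
shifted = coeffs.copy()
shifted[-1] -= target            # roots of k(x) - target = 0
roots = np.roots(shifted)
# keep real roots that fall inside the fitted parameter range
real = [r.real for r in roots if abs(r.imag) < 1e-9 and x[0] <= r.real <= x[-1]]
print("critical dimension estimate:", real)
```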

Use of metal/uranium mixtures to explore data uncertainties

Description: A table of k∞ values for three homogenized metal/²³⁵U systems calculated using both MCNP and the SCALE code system was presented in Ref. 3. The homogenized metal/²³⁵U ratios were selected such that the MCNP analyses for each mixture provided k∞ ≈ 1.0. The metals considered were Al, Zr, and Fe. These simplified systems were created in an effort to ease an investigation of discrepant results obtained using MCNP and SCALE to analyze large, dry systems of metal-clad, highly enriched fuel assemblies. Reference 3 has received considerable attention at ORNL and elsewhere because the reported k∞ values varied by as much as 38% between the MCNP results and those of SCALE. The ORNL approach was to analyze the systems using a broad range of codes and data and to seek an understanding of the discrepancies by studying differences in the basic data and processing methods. The continuous-energy codes and data applied in the ORNL study were (1) MCNP, using ENDF/B-V, ENDF/B-VI, and LANL data evaluations; (2) VIM, using ENDF/B-V data; and (3) MONK, using an 8,200-point library based on UKNDL evaluations and a preliminary JEF library. The VIM code provides treatment of unresolved resonances; MCNP does not. The MONK analyses provided a result using both an independent code and independent data evaluations. Although accessing continuous-energy data typically requires the use of Monte Carlo codes, 1-D deterministic codes can be used to accurately calculate k∞ values using a variety of multigroup data libraries and processing methods. The multigroup codes used in the study were MC and the CSAS1X sequence of the SCALE system. Both systems provide problem-dependent resonance processing of cross-section data, and available fine-group libraries were used for the analyses. Broad-group libraries were not studied in any depth because they were not readily available for ...
Date: December 1995
Creator: Parks, C. V.; Jordan, W. C.; Petrie, L. M. & Wright, R. Q.
Partner: UNT Libraries Government Documents Department
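
For orientation, the quantity compared above is the infinite-medium multiplication factor; in a one-group picture it is k∞ = νΣ_f / Σ_a for the homogenized mixture. The sketch below evaluates that ratio with thermal-range cross sections; the number densities and the one-group reduction are invented for illustration, and the real discrepancies arise from the detailed energy dependence the codes treat differently.

```python
# One-group k-infinity for a homogenized metal/U-235 mixture (illustrative).
BARN = 1.0e-24  # cm^2

# nuclide: (number density [atoms/cm^3], sigma_a [b], nu*sigma_f [b])
mixture = {
    "U235": (3.0e20, 680.0, 2.43 * 580.0),   # illustrative thermal values
    "Fe":   (8.5e22, 2.56, 0.0),
}

sigma_a = sum(N * sa * BARN for N, sa, _ in mixture.values())
nu_sigma_f = sum(N * nsf * BARN for N, _, nsf in mixture.values())
print(f"k-infinity ~ {nu_sigma_f / sigma_a:.3f}")   # ~1.0 for this mixture
```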

Comparison of the CENTRM resonance processor to the NITAWL resonance processor in SCALE

Description: This report compares the NITAWL and CENTRM resonance processors in the SCALE code system. The cases examined consist of the International OECD/NEA Criticality Working Group Benchmark 20 problem. These cases represent fuel pellets partially dissolved in a borated solution. The assumptions inherent in the Nordheim Integral Treatment, used in NITAWL, are not valid for these problems. CENTRM resolves this limitation by explicitly calculating a problem-dependent point flux from point cross sections, which is then used to create group cross sections.
Date: January 1, 1998
Creator: Hollenbach, D.F. & Petrie, L.M.
Partner: UNT Libraries Government Documents Department
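
A sketch of the collapse step the abstract describes: given a pointwise flux phi(E) and a pointwise cross section sigma(E), the group cross section is the flux-weighted average over each group. The toy resonance, the 1/E flux with a crude self-shielding dip, and the group boundaries are all synthetic; CENTRM obtains phi(E) by solving a pointwise transport equation, which this sketch does not attempt.

```python
# Flux-weighted collapse of point cross sections to group values (sketch).
import numpy as np

energy = np.logspace(0, 3, 4000)                    # eV grid (illustrative)
sigma = 10.0 + 5.0e3 / (1.0 + ((energy - 36.7) / 0.5) ** 2)  # toy resonance
phi = 1.0 / energy                                  # smooth 1/E flux ...
phi *= 30.0 / (sigma + 30.0)                        # ... with a crude flux dip

dE = np.gradient(energy)
bounds = [1.0, 10.0, 100.0, 1000.0]                 # three groups (illustrative)
for lo, hi in zip(bounds[:-1], bounds[1:]):
    m = (energy >= lo) & (energy < hi)
    sig_g = np.sum(sigma[m] * phi[m] * dE[m]) / np.sum(phi[m] * dE[m])
    print(f"group {lo:7.1f}-{hi:7.1f} eV: sigma_g = {sig_g:7.2f} b")
```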

SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

Description: SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries.
Date: April 1995
Creator: Petrie, L. M.; Jordan, W. C. & Edwards, A. L.
Partner: UNT Libraries Government Documents Department

SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

Description: SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the documentation of the data libraries and subroutine libraries.
Date: April 1995
Creator: Landers, N. F.; Petrie, L. M. & Knight, J. R.
Partner: UNT Libraries Government Documents Department

New enhancements to SCALE for criticality safety analysis

Description: As the speed, available memory, and reliability of computer hardware increase and costs decrease, the complexity and usability of computer software increase, taking advantage of the new hardware capabilities. Computer programs today must be more flexible and user-friendly than those of the past. Within available resources, the SCALE staff at Oak Ridge National Laboratory (ORNL) is committed to upgrading its computer codes to keep pace with the current level of technology. This paper examines recent additions and enhancements to the criticality safety analysis sections of the SCALE code package. These can be divided into nine categories: (1) new analytical computer codes, (2) new cross-section libraries, (3) new criticality search sequences, (4) enhanced graphical capabilities, (5) additional KENO enhancements, (6) enhanced resonance processing capabilities, (7) enhanced material information processing capabilities, (8) portability of the SCALE code package, and (9) other minor enhancements, modifications, and corrections to SCALE. Each of these additions and enhancements to the criticality safety analysis capabilities of the SCALE code system is discussed below.
Date: September 1, 1995
Creator: Hollenbach, D.F.; Bowman, S.M.; Petrie, L.M. & Parks, C.V.
Partner: UNT Libraries Government Documents Department

Computational methods for criticality safety analysis within the SCALE system

Description: The criticality safety analysis capabilities within the SCALE system are centered around the Monte Carlo codes KENO IV and KENO V.a, which are both included in SCALE as functional modules. The XSDRNPM-S module is also an important tool within SCALE for obtaining multiplication factors for one-dimensional system models. This paper reviews the features and modeling capabilities of these codes along with their implementation within the Criticality Safety Analysis Sequences (CSAS) of SCALE. The CSAS modules provide automated cross-section processing and user-friendly input that allow criticality safety analyses to be done in an efficient and accurate manner. 14 refs., 2 figs., 3 tabs.
Date: January 1, 1986
Creator: Parks, C.V.; Petrie, L.M.; Landers, N.F. & Bucholz, J.A.
Partner: UNT Libraries Government Documents Department

A user's guide for the STARTER computer program

Description: The STARTER computer code is used to prepare a starting source distribution for the criticality computer code, KENO V.a. The input description and theoretical basis of the STARTER code are described in this user's guide.
Date: December 1, 1991
Creator: Childs, R.L.; Petrie, L.M. & Landers, N.F.
Partner: UNT Libraries Government Documents Department
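
A sketch of what a starting-source program does: sample initial fission sites over the fissile regions so the first KENO generation begins from a reasonable spatial distribution. The two-cylinder geometry and the uniform-in-region choice are hypothetical; the distributions STARTER actually offers are described in the user's guide itself.

```python
# Sampling a KENO-style starting source over fissile units (illustrative).
import math
import random

rng = random.Random(4)

def sample_in_cylinder(r, zmin, zmax, x0=0.0):
    """Uniform point in a z-axis cylinder; sqrt(u) gives uniform area density."""
    rad = r * math.sqrt(rng.random())
    theta = 2.0 * math.pi * rng.random()
    z = zmin + (zmax - zmin) * rng.random()
    return (x0 + rad * math.cos(theta), rad * math.sin(theta), z)

units = [            # (radius, zmin, zmax, x-offset) for two fissile units
    (10.0, 0.0, 30.0, -25.0),
    (10.0, 0.0, 30.0, +25.0),
]

source = []
for _ in range(5000):
    r, zmin, zmax, x0 = units[rng.randrange(len(units))]
    source.append(sample_in_cylinder(r, zmin, zmax, x0))
print(len(source), source[0])
```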

Revised uranium--plutonium cycle PWR and BWR models for the ORIGEN computer code

Description: Reactor physics calculations and literature searches have been conducted, leading to the creation of revised enriched-uranium and enriched-uranium/mixed-oxide-fueled PWR and BWR reactor models for the ORIGEN computer code. These ORIGEN reactor models are based on cross sections that have been taken directly from the reactor physics codes and eliminate the need to make adjustments in uncorrected cross sections in order to obtain correct depletion results. Revised values of the ORIGEN flux parameters THERM, RES, and FAST were calculated along with new parameters related to the activation of fuel-assembly structural materials not located in the active fuel zone. Recommended fuel and structural material masses and compositions are presented. A summary of the new ORIGEN reactor models is given.
Date: September 1, 1978
Creator: Croff, A. G.; Bjerke, M. A.; Morrison, G. W. & Petrie, L. M.
Partner: UNT Libraries Government Documents Department
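
The flux parameters named above enter ORIGEN's effective one-group cross sections; in the commonly cited form, the effective value combines the 2200 m/s cross section, the resonance integral, and a fast-spectrum cross section weighted by THERM, RES, and FAST. The sketch below shows that combination; the parameter values and the U-238 capture data are illustrative placeholders, not the revised model values, and the precise definitions should be taken from the ORIGEN documentation.

```python
# ORIGEN-style effective one-group cross section (commonly cited form; the
# spectral parameters and nuclide data below are illustrative placeholders).
def effective_xs(sigma_2200, res_integral, sigma_fast, therm, res, fast):
    """Fold thermal, resonance, and fast contributions into one group (barns)."""
    return therm * sigma_2200 + res * res_integral + fast * sigma_fast

THERM, RES, FAST = 0.63, 0.33, 2.0        # placeholder PWR-like parameters
sig_eff = effective_xs(sigma_2200=2.68,   # U-238 (n,gamma) at 2200 m/s, barn
                       res_integral=275.0,  # infinite-dilution RI, barn
                       sigma_fast=0.06,     # fast-averaged capture, barn
                       therm=THERM, res=RES, fast=FAST)
print(f"effective U-238 capture cross section ~ {sig_eff:.2f} barn")
```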

A user's guide for the STARTER computer program. Consolidated Fuel Reprocessing Program

Description: The STARTER computer code is used to prepare a starting source distribution for the criticality computer code, KENO V.a. The input description and theoretical basis of the STARTER code are described in this user's guide.
Date: December 1, 1991
Creator: Childs, R. L.; Petrie, L. M. & Landers, N. F.
Partner: UNT Libraries Government Documents Department

Sample problems for the novice user of the AMPX-II system. [For generating coupled multigroup neutron--gamma libraries, in FORTRAN IV for IBM 360/91]

Description: Contents of the IBM version of the AMPX system distributed by the Radiation Shielding Information Center (AMPX-II) are described. Sample problems are detailed which demonstrate the procedure for implementing AMPX-II modules to generate point cross sections; generate multigroup neutron, photon production, and photon interaction cross sections for various transport codes; collapse multigroup cross sections; check, edit, and punch multigroup cross sections; and execute a one-dimensional discrete ordinates transport calculation. 25 figures, 9 tables.
Date: January 1, 1979
Creator: Ford, W.E. III; Roussin, R.W.; Petrie, L.M.; Diggs, B.R. & Comolander, H.E.
Partner: UNT Libraries Government Documents Department

CSRL-V ENDF/B-V library and thermal reactor and criticality safety benchmarks

Description: CSRL-V, an ENDF/B-V 227-group neutron cross-section library which has recently been expanded to include Bondarenko factor data for unresolved resonance processing, was used to calculate performance parameters for a series of thermal reactor and criticality safety benchmarks. Among the thermal benchmarks calculated were the Babcock and Wilcox lattice critical experiments B&W-XIII and B&W-XX. These two slightly enriched (2.46%) UO₂, water-moderated, tight-pitch lattice experiments were chosen because (a) they have ²³⁸U resonance shielding characteristics similar to those of power reactor cores, and (b) they provide benchmark results representative of high-leakage and low-leakage lattices, respectively. Among the criticality safety benchmarks calculated were homogeneous, highly enriched (93.2%) uranyl fluoride spheres with hydrogen-to-uranium ratios varying from 76 to 972.
Date: January 1, 1982
Creator: Ford, W.E. III; Diggs, B.R.; Knight, J.R.; Greene, N.M.; Petrie, L.M. & Williams, M.L.
Partner: UNT Libraries Government Documents Department

SCALE Graphical Developments for Improved Criticality Safety Analyses

Description: New computer graphic developments at Oak Ridge National Laboratory (ORNL) are being used to provide visualization of criticality safety models and calculational results, as well as tools for criticality safety analysis input preparation. The purpose of this paper is to present the status of current development efforts to continue to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system. Applications for criticality safety analysis in the areas of three-dimensional (3-D) model visualization, input preparation and execution via a graphical user interface (GUI), and two-dimensional (2-D) plotting of results are discussed.
Date: September 20, 1999
Creator: Barnett, D. L.; Bowman, S. M.; Horwedel, J. E. & Petrie, L. M.
Partner: UNT Libraries Government Documents Department

Recent validation experience with multigroup cross-section libraries and scale

Description: This paper discusses the results obtained and lessons learned from an extensive validation of new ENDF/B-V and ENDF/B-VI multigroup cross-section libraries using analyses of critical experiments. The KENO V.a Monte Carlo code in version 4.3 of the SCALE computer code system was used to perform the critical benchmark calculations via the automated SCALE sequence CSAS25. The cross-section data were processed by the SCALE automated problem-dependent resonance-processing procedure included in this sequence. Prior to calling KENO V.a, CSAS25 accesses BONAMI to perform resonance self-shielding for nuclides with Bondarenko factors and NITAWL-II to process nuclides with resonance parameter data via the Nordheim Integral Treatment.
Date: December 1, 1995
Creator: Bowman, S.M.; Wright, R.Q.; DeHart, M.D.; Parks, C.V. & Petrie, L.M.
Partner: UNT Libraries Government Documents Department
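
A sketch of the self-shielding step BONAMI performs: the shielded group cross section is the infinite-dilution value times a factor f(sigma_0) interpolated from tabulated Bondarenko factors, where sigma_0 is the background cross section per atom of the resonance nuclide. The factor table, the mixture, and the group value below are invented for illustration.

```python
# Bondarenko f-factor self-shielding for one group (illustrative sketch).
import bisect
import math

# tabulated (sigma_0 [barn], f-factor) pairs for one group -- invented
table_s0 = [1.0, 10.0, 100.0, 1000.0, 1.0e4, 1.0e10]
table_f  = [0.30, 0.45, 0.70, 0.90, 0.98, 1.00]

def f_factor(sigma0):
    """Log-linear interpolation of the Bondarenko factor in sigma_0."""
    i = min(max(bisect.bisect(table_s0, sigma0), 1), len(table_s0) - 1)
    x0, x1 = math.log(table_s0[i - 1]), math.log(table_s0[i])
    t = (math.log(sigma0) - x0) / (x1 - x0)
    return table_f[i - 1] + t * (table_f[i] - table_f[i - 1])

# background per U-238 atom from the other nuclides' potential scattering
N_u238, N_h = 2.2e22, 4.4e22      # atoms/cm^3 (illustrative)
sigma0 = (N_h * 20.4) / N_u238    # hydrogen sigma_p ~ 20.4 barn
sigma_inf = 275.0                 # infinite-dilution group value, barn (invented)
print(f"sigma_0 = {sigma0:.1f} b, "
      f"shielded sigma = {sigma_inf * f_factor(sigma0):.1f} b")
```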

Benchmarking of the 99-group ANSL-V library

Description: The purpose of this paper is to present thermal benchmark data testing results for the BAPL-1, TRX-1, and SEEP-1 lattices, using selected processed cross-sections from the ANSL-V 99-group library. 7 refs., 1 tab.
Date: January 1, 1987
Creator: Wright, R.Q.; Ford, W.E. III; Greene, N.M.; Petrie, L.M.; Primm, R.T. III & Westfall, R.M.
Partner: UNT Libraries Government Documents Department