244 Matching Results

Search Results


Standards Yearbook 1932

Description: Outlines of the activities and accomplishments of the national and international standardization agencies.
Date: 1932
Creator: United States. National Bureau of Standards.
Partner: UNT Libraries Government Documents Department

Global Standards: Building Blocks for the Future

Description: There are standards to protect the environment and human health and safety, and to mediate commercial transactions. Other standards ensure that different products are compatible when hooked together. Looking across industry sectors, this report evaluates the U.S. standards-setting process in light of its changing economic and technological environment and compares it to processes in other countries.
Date: March 1992
Creator: United States. Congress. Office of Technology Assessment.
Partner: UNT Libraries Government Documents Department

Towards richer descriptions of our collection of genomes and metagenomes

Description: In this commentary, we advocate building a richer set of descriptions about our invaluable and exponentially growing collection of genomes and metagenomic datasets through the construction of consensus-driven data capture and exchange mechanisms. Standardization activities must proceed within the auspices of open-access and international working bodies, and to tackle the issues surrounding the development of better descriptions of genomic investigations we have formed the Genomic Standards Consortium (GSC). Here, we introduce the 'Minimum Information about a Genome Sequence' specification in the hopes of gaining wider participation in its development and discuss the resources that will be required to support it (standardization of annotations through the use of ontologies and mechanisms of metadata capture, exchange). As part of its wider goals, the GSC also strongly supports improving the 'transparency' of the information contained in existing genomic databases that contain calculated analyses and genomic annotations.
Date: June 1, 2006
Creator: Field, D.; Garrity, G.; Selengut, J.; Sterk, P.; Tatusova, T.; Thomson, N. et al.
Partner: UNT Libraries Government Documents Department


Description: ISO/IEC-11179 is an international standard that documents the standardization and registration of metadata to make data understandable and shareable. This standardization and registration makes it easier to locate, retrieve, and transmit data from disparate databases. The standard defines how metadata are conceptually modeled and how they are shared among parties, but does not define how data are physically represented as bits and bytes. The standard consists of six parts. Part 1 provides a high-level overview of the standard and defines the basic element of a metadata registry - a data element. Part 2 defines the procedures for registering classification schemes and classifying administered items in a metadata registry (MDR). Part 3 specifies the structure of an MDR. Part 4 specifies requirements and recommendations for constructing definitions for data and metadata. Part 5 defines how administered items are named and identified. Part 6 defines how administered items are registered and assigned an identifier.
Date: January 3, 2008
Creator: Pon, R. K. & Buttler, D. J.
Partner: UNT Libraries Government Documents Department
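The six-part structure summarized in this record lends itself to a small illustration. Below is a minimal, hypothetical Python sketch of an ISO/IEC 11179-style registry; the class and field names are our own shorthand for the concepts the parts define, not identifiers from the standard itself:

```python
# Toy model of an ISO/IEC 11179-style metadata registry (MDR).
# All names here are illustrative shorthand, not taken from the standard.
from dataclasses import dataclass

@dataclass
class DataElement:
    """The basic element of a metadata registry (Part 1)."""
    identifier: str           # registration identifier (Part 6)
    name: str                 # naming and identification rules (Part 5)
    definition: str           # definition requirements (Part 4)
    classification: str = ""  # classification scheme entry (Part 2)

class MetadataRegistry:
    """A trivial in-memory MDR structure (Part 3): register and look up items."""
    def __init__(self):
        self._items = {}

    def register(self, item: DataElement) -> str:
        self._items[item.identifier] = item
        return item.identifier

    def lookup(self, identifier: str) -> DataElement:
        return self._items[identifier]

# Registering a data element makes it locatable and shareable by identifier,
# which is the point of the standard's registration machinery.
mdr = MetadataRegistry()
mdr.register(DataElement(
    identifier="EX-0001",
    name="PersonBirthDate",
    definition="The date on which a person was born.",
    classification="demographics",
))
print(mdr.lookup("EX-0001").name)  # → PersonBirthDate
```

The sketch shows only the conceptual model; as the abstract notes, the standard deliberately says nothing about how such records are physically represented as bits and bytes.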

Standards Committee Activities of the National Bureau of Standards: 1982 Highlights

Description: Abstract: This report summarizes NBS standards committee activities and accomplishments during calendar year 1982. It describes the management of standards activities at NBS, profiles NBS staff participation on outside standards committees, and highlights significant technical and individual contributions made by NBS staff. In 1982, 457 staff members (or 29% of NBS' professional, scientific, and technical staff) participated in 1,046 outside standards committees of 97 national and international standards organizations.
Date: March 1983
Creator: Newell, Karl G., Jr.
Partner: UNT Libraries Government Documents Department

Standards Committee Activities of the National Bureau of Standards: 1983 Highlights

Description: Abstract: This report summarizes NBS standards committee activities and accomplishments during calendar year 1983. It profiles NBS staff participation on outside standards committees and highlights significant technical and individual contributions made by NBS staff. In 1983, 446 staff members (or 28% of NBS' professional, scientific, and technical staff) participated in 989 standards committees of 87 national and international standards organizations.
Date: April 1984
Creator: Newell, Karl G., Jr.
Partner: UNT Libraries Government Documents Department

New Security Results on Encrypted Key Exchange

Description: Schemes for encrypted key exchange are designed to provide two entities communicating over a public network, and sharing only a (short) password, with a session key to be used to achieve data integrity and/or message confidentiality. An example of a very efficient and "elegant" scheme for encrypted key exchange considered for standardization by the IEEE P1363 Standard working group is AuthA. This scheme was conjectured secure when the symmetric-encryption primitive is instantiated via either a cipher that closely behaves like an "ideal cipher," or a mask generation function that is the product of the message with a hash of the password. While the security of this scheme in the former case has recently been proven, the latter case was still an open problem. In this paper we prove for the first time that this scheme is secure under the assumptions that the hash function closely behaves like a random oracle and that the computational Diffie-Hellman problem is difficult. Furthermore, since Denial-of-Service (DoS) attacks have become a common threat, we enhance AuthA with a mechanism to protect against them.
Date: December 15, 2003
Creator: Bresson, Emmanuel; Chevassut, Olivier & Pointcheval, David
Partner: UNT Libraries Government Documents Department
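The mask-generation variant this paper proves secure, where a Diffie-Hellman value is encrypted by multiplying it with a hash of the password, can be sketched in a few lines. This is a toy illustration only, not a faithful AuthA implementation: the group is far too small for real security, only one direction of the exchange is masked, and all names are our own:

```python
# Toy sketch of password-masked Diffie-Hellman in the style described above.
# NOT a secure or complete AuthA implementation; parameters are illustrative.
import hashlib
import secrets

p = 2**127 - 1  # a Mersenne prime; real schemes use much larger groups
g = 3

def H(data: bytes) -> int:
    """Hash the password into a group element used as the multiplicative mask."""
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % p

password = b"correct horse"

# Client: choose an ephemeral exponent and mask g^x with the password hash.
x = secrets.randbelow(p - 2) + 1
mask = H(password)
masked = (pow(g, x, p) * mask) % p

# Server: knowing the password, strip the mask with a modular inverse,
# then complete the Diffie-Hellman exchange as usual.
y = secrets.randbelow(p - 2) + 1
g_x = (masked * pow(mask, -1, p)) % p
server_key = pow(g_x, y, p)

# Client: derive the same key from the server's value g^y (sent unmasked
# here for brevity; AuthA-style schemes also authenticate this direction).
client_key = pow(pow(g, y, p), x, p)
assert client_key == server_key  # both sides hold the same session key
```

The intuition behind the security result: an attacker who guesses the wrong password strips the mask incorrectly and recovers a random-looking group element, which reveals nothing under the random-oracle and computational Diffie-Hellman assumptions named in the abstract.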


Description: The CEBAF 12 GeV upgrade project includes 80 new 7-cell cavities to form 10 cryomodules. Each cavity underwent RF qualification at 2.07 K using a high-power accelerating gradient test and an HOM survey in Jefferson Lab's Vertical Testing Area (VTA) before cavity string assembly. In order to ensure consistently high-quality data, updated cavity testing procedures and analyses were implemented and used by a group of VTA operators. For high-power tests, a cavity testing procedure was developed and used in conjunction with a LabVIEW program to collect the test data. Additionally, while the cavity was at 2.07 K, an HOM survey was performed using a network analyzer and a combination of Excel and Mathematica programs. Data analysis was standardized, and an online logbook, Pansophy, was used for data storage and mining. The Pansophy system allowed test results to be easily summarized and searched across all cavity tests. In this presentation, the CEBAF 12 GeV upgrade cavity testing procedure, the method for data analysis, and the reporting of results will be discussed.
Date: July 1, 2012
Creator: Bass, Tiffany; Davis, G.; Wilson, Christiana & Stirbet, Mircea
Partner: UNT Libraries Government Documents Department

Towards a standards-compliant genomic and metagenomic publication record

Description: Increasingly we are aware as a community of the growing need to manage the avalanche of genomic and metagenomic data, in addition to related data types like ribosomal RNA and barcode sequences, in a way that tightly integrates contextual data with traditional literature in a machine-readable way. It is for this reason that the Genomic Standards Consortium (GSC) formed in 2005. Here we suggest that we move beyond the development of standards and tackle standards-compliance and improved data capture at the level of the scientific publication. We are supported in this goal by the fact that the scientific community is in the midst of a publishing revolution. This revolution is marked by a growing shift away from a traditional dichotomy between 'journal articles' and 'database entries' and an increasing adoption of hybrid models of collecting and disseminating scientific information. With respect to genomes and metagenomes and related data types, we feel the scientific community would be best served by the immediate launch of a central repository of short, highly structured 'Genome Notes' that must be standards-compliant. This could be done in the context of an existing journal, but we also suggest the more radical solution of launching a new journal. Such a journal could be designed to cater to a wide range of standards-related content types that are not currently centralized in the published literature. It could also support the demand for centralizing aspects of the 'gray literature' (documents developed by institutions or communities), such as the call by the GSC for a central repository of Standard Operating Procedures describing the genomic annotation pipelines of the major sequencing centers. We argue that such an 'eJournal', published under the Open Access paradigm by the GSC, could be an attractive publishing forum for a broader range of standardization initiatives within, and beyond, ...
Date: April 1, 2008
Creator: Fenner, Marsha W.; Garrity, George M.; Field, Dawn; Kyrpides, Nikos; Hirschman, Lynette; Sansone, Susanna-Assunta et al.
Partner: UNT Libraries Government Documents Department

Lab-Based Measurement of Remediation Techniques for Radiation Portal Monitors (Initial Report)

Description: Radiation Portal Monitors (RPM) deployed by the Second Line of Defense (SLD) are known to be sensitive to the natural environmental radioactive background. There are several techniques used to mitigate the effects of background on the monitors, but since installation environments can vary significantly from one another, a standardized, systematic study of remediation techniques was proposed and carried out. This study is not meant to serve as the absolute last word on the subject. The data collected are, however, intelligible and useful. Some compromises were made, each of which will be described in detail. The hope of this initial report is to familiarize the SLD science teams with ORNL's effort to model the effect of various remediation techniques on simple, static backgrounds. This study provides a good start toward benchmarking the model, and each additional increment of data will serve to make the model more robust. The scope of this initial study is limited to a few basic cases. Its purpose is to prove the utility of lab-based study of remediation techniques and serve as a standard data set for future use. The importance of this first step of standardization will become obvious when science teams are working in parallel on issues of remediation; having a common starting point will do away with one category of difference, thereby making it easier to determine the sources of disagreement. Further measurements will augment this data set, allowing for further constraint of the universe of possible situations. As will be discussed in the 'Going Forward' section, more data will be included in the final report of this work. Of particular interest will be the data taken with the official TSA lead collimators, which will provide more direct results for comparison with installation data.
Date: February 1, 2012
Creator: Livesay, Jake; Guzzardo, Tyler & Lousteau, Angela L
Partner: UNT Libraries Government Documents Department

Precision Measurement and Calibration: Electricity: Selected Papers on the Realization and Maintenance of the Fundamental Electrical Units and Related Topics

Description: Abstract: Selected publications of the National Bureau of Standards technical staff in the field of electricity were first compiled in 1962 as a volume of the NBS Precision Measurement and Calibration Series (Electricity and Electronics, Handbook 77, Volume I); this compilation was extended in 1968 by an additional volume in the Precision Measurement and Calibration Series (Electricity-Low Frequency, NBS Special Publication 300, Volume 3). The present volume, a further extension of these earlier compilations of selected publications in the field of electricity, includes 66 more recent papers by NBS authors and 16 abstracts of closely related papers by authors from other organizations. In view of the expansion of measurement technologies used in electricity and electromagnetism, it has been necessary to reduce the range of topics for the selection of papers in the new compilation. Accordingly, an emphasis has been placed upon the realization and maintenance of the fundamental electrical units and the related scientific advances, particularly in quantum physics. However, in the interest of completeness, three appendices also provide up-to-date bibliographies of publications by NBS authors in different areas of electromagnetism. (This book is a sequel to NBS Handbook 77-Vol. 1 (1961) and NBS SP 300-Vol. 3 (1968).)
Date: October 1985
Creator: McCoubrey, Arthur O.
Partner: UNT Libraries Government Documents Department

Teaching Past the Test: a Pedagogy of Critical Pragmatism

Description: Existent scholarship in communication studies has failed to adequately address the particular pedagogical context of current public secondary education within the United States. While communication studies has produced a great deal of scholarship centered within the framework of critical pedagogy, these efforts fail to offer public high school teachers in the U.S. a tenable alternative to standardized constructs of educational communication. This thesis addresses the need for a workable, critical pedagogy in this particular educational context as a specific question of educational communication. A theorization of pedagogical action drawing from critical pedagogy, pragmatism, and communication studies, termed “critical pragmatism,” is offered as an effective, critical counterpoint to current neoliberal classroom practices in U.S. public secondary schools.
Date: May 2012
Creator: Jordan, Jason
Partner: UNT Libraries

Implications of intelligent, integrated microsystems for product design and development

Description: Intelligent, integrated microsystems combine some or all of the functions of sensing, processing information, actuation, and communication within a single integrated package, and preferably upon a single silicon chip. Because the elements of these highly integrated solutions interact strongly with each other, the microsystem can be neither designed nor fabricated piecemeal, in contrast to more familiar assembled products. Driven by technological imperatives, microsystems will best be developed by multi-disciplinary teams, most likely within flatter, less hierarchical organizations. Standardization of design and process tools around a single, dominant technology will expedite economically viable operation under a common production infrastructure. The production base for intelligent, integrated microsystems has elements in common with the mathematical theory of chaos: the development of microsystems technology will be strongly dependent on, and optimized to, the initial product requirements that drive standardization, thereby further rewarding early entrants to integrated microsystem technology.
Date: April 19, 2000
Partner: UNT Libraries Government Documents Department

Nuclear Powerplant Standardization: Light Water Reactors

Description: A report by the Office of Technology Assessment (OTA) that looks at four approaches to the standardization of light water reactors and "considers these four representative approaches to standardization and examines the major advantages and disadvantages of each concept" (p. 3).
Date: April 1981
Creator: United States. Congress. Office of Technology Assessment.
Partner: UNT Libraries Government Documents Department


Description: The International Organization for Standardization (ISO) has published a Guide to the Expression of Uncertainty in Measurement (GUM). The IUPAC Commission on Isotopic Abundance and Atomic Weight (CIAAW) began attaching uncertainty limits to their recommended values about forty years ago. CIAAW's method for determining and assigning uncertainties has evolved over time. We trace this evolution to their present method and their effort to incorporate the basic ISO/GUM procedures into evaluations of these uncertainties. We discuss some dilemmas the CIAAW faces with their present method and whether it is consistent with the application of the ISO/GUM rules. We discuss the attempt to incorporate variations in measured isotope ratios, due to natural fractionation, into the ISO/GUM system. We make some observations about the inconsistent treatment of natural variations in recommended data and uncertainties. A recommendation for expressing atomic weight values using a tabulated range of values for various chemical elements is discussed.
Date: July 23, 2007
Creator: Holden, N. E.
Partner: UNT Libraries Government Documents Department

Definition and implementation of a SAML-XACML profile for authorization interoperability across grid middleware in OSG and EGEE

Description: In order to ensure interoperability between middleware and authorization infrastructures used in the Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) projects, an Authorization Interoperability activity was initiated in 2006. The interoperability goal was met in two phases: first, agreeing on a common authorization query interface and protocol with an associated profile that ensures standardized use of attributes and obligations; and second, implementing, testing, and deploying, on OSG and EGEE, middleware that supports the interoperability protocol and profile. The activity has involved people from OSG, EGEE, the Globus Toolkit project, and the Condor project. This paper presents a summary of the agreed-upon protocol, profile and the software components involved.
Date: April 1, 2009
Creator: Garzoglio, Gabriele; Alderman, Ian; Altunay, Mine; Anathakrishnan, Rachana; Bester, Joe; Chadwick, Keith et al.
Partner: UNT Libraries Government Documents Department
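To make the idea of a common authorization query with standardized attributes and obligations concrete, here is a hypothetical Python sketch of a policy decision point. The attribute names and the account-mapping obligation are invented for illustration; they are not the actual identifiers defined by the OSG/EGEE profile:

```python
# Toy authorization query/decision flow in the spirit of a SAML-XACML profile.
# Attribute and obligation names below are hypothetical, not the profile's own.

def build_request(subject_dn, vo, resource, action):
    """Assemble a request with standardized subject/resource/action attributes."""
    return {
        "subject": {"dn": subject_dn, "virtual-organization": vo},
        "resource": {"id": resource},
        "action": {"id": action},
    }

def evaluate(request, policy):
    """A trivial policy decision point (PDP): permit members of allowed VOs
    and attach an obligation telling the enforcement point (PEP) which local
    account to map the user onto."""
    if request["subject"]["virtual-organization"] in policy["allowed-vos"]:
        return {"decision": "Permit",
                "obligations": [{"map-to-account": policy["account"]}]}
    return {"decision": "Deny", "obligations": []}

policy = {"allowed-vos": {"cms", "atlas"}, "account": "cms001"}
req = build_request("/DC=org/CN=Alice", "cms", "se.example.org", "access")
print(evaluate(req, policy)["decision"])  # prints Permit
```

Agreeing on the shape of the request and the vocabulary of obligations, rather than on any one implementation, is what lets middleware from different projects interoperate against the same decision point.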

Security Implications of OPC, OLE, DCOM, and RPC in Control Systems

Description: OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors, and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.
Date: January 1, 2006
Partner: UNT Libraries Government Documents Department

Beryllium Sampling and Analysis Within the DOE Complex and Opportunities for Standardization

Description: Since the U. S. Department of Energy (DOE) published the DOE Beryllium Rule (10 CFR 850) in 1999, DOE sites have been required to measure beryllium on air filters and wipes for worker protection and for release of materials from beryllium-controlled areas. Measurements in the nanogram range on a filter or wipe are typically required. Industrial hygiene laboratories have applied methods from various analytical compendia, and a number of issues have emerged with sampling and analysis practices. As a result, a committee of analytical chemists, industrial hygienists, and laboratory managers was formed in November 2003 to address the issues. The committee developed a baseline questionnaire and distributed it to DOE sites and other agencies in the U.S. and U.K. The results of the questionnaire are presented in this paper. These results confirmed that a wide variety of practices were in use in the areas of sampling, sample preparation, and analysis. Additionally, although these laboratories are generally accredited by the American Industrial Hygiene Association (AIHA), there are inconsistencies in performance among accredited labs. As a result, there are significant opportunities for development of standard methods that could improve consistency. The current availability of, and need for, standard methods are further discussed in a companion paper.
Date: January 25, 2005
Partner: UNT Libraries Government Documents Department

National Incident Management System (NIMS) Standards Review Panel Workshop Summary Report

Description: The importance and need for fully compliant implementation of NIMS nationwide was clearly demonstrated during the Hurricane Katrina event, and was expressed in Secretary Chertoff's October 4, 2005 letter addressed to the states' governors. It states, "Hurricane Katrina was a stark reminder of how critical it is for our nation to approach incident management in a coordinated, consistent, and efficient manner. We must be able to come together, at all levels of government, to prevent, prepare for, respond to, and recover from any emergency or disaster. Our operations must be seamless and based on common incident management doctrine, because the challenges we face as a nation are far greater than capabilities of any one jurisdiction." The NIMS is a system/architecture for organizing response on a "national" level. It incorporates ICS as a main component of that structure (i.e., it institutionalizes ICS in NIMS). In a paper published on the NIMS website, the following statements were made: "NIMS represents a core set of doctrine, principles, terminology, and organizational processes to enable effective, efficient and collaborative incident management at all levels. To provide the framework for interoperability and compatibility, the NIMS is based on a balance between flexibility and standardization." Thus the NIC is challenged with the need to adopt quality SDO-generated standards to support NIMS compliance, while maintaining the flexibility necessary so that response operations can be tailored to the specific jurisdictional and geographical needs across the nation. In support of this large and complex challenge facing the NIC, the Pacific Northwest National Laboratory (PNNL) was asked to provide technical support to the NIC, through its DHS Science and Technology - Standards Portfolio Contract, to help identify, review, and develop key standards for NIMS compliance.
Upon examining the challenge, the following general process appears to be ...
Date: February 7, 2006
Creator: Stenner, Robert D.; Kirk, Jennifer L.; Stanton, James R.; Shebell, Peter; Schwartz, Deborah S.; Judd, Kathleen S. et al.
Partner: UNT Libraries Government Documents Department

Mapping Physical Formats to Logical Models to Extract Data and Metadata: The Defuddle Parsing Engine

Description: Scientists, fueled by the desire for systems-level understanding of phenomena, increasingly need to share their results across multiple disciplines. Accomplishing this requires data to be annotated, contextualized, and readily searchable and translated into other formats. While these requirements can be addressed by custom programming or obviated by community standardization, neither approach has ‘solved’ the problem. In this paper, we describe a complementary approach – a general capability for articulating the format of arbitrary textual and binary data using a logical data model, expressed in XML-Schema, which can be used to provide annotation and context, extract metadata, and enable translation. This work is based on the draft specification for the Data Format Description Language and our open source “Defuddle” parser. We present an overview of the specification, detail the design of Defuddle, and discuss the benefits and challenges of this general approach to enabling discovery and sharing of diverse data sets.
Date: July 25, 2006
Creator: Talbott, Tara D.; Schuchardt, Karen L.; Stephan, Eric G. & Myers, James D.
Partner: UNT Libraries Government Documents Department

Digital Preservation of Newspapers: Findings of the Chronicles in Preservation Project

Description: In this paper, the authors describe research led by Educopia Institute regarding the preservation needs for digitized and born-digital newspapers. The 'Chronicles in Preservation' project builds upon previous efforts (e.g. the U.S. National Digital Newspaper Program) to look more broadly at the needs of digital newspapers in all of their diverse and challenging forms. This paper conveys the findings of the first research phase, including substantive survey results regarding digital newspaper curation practices.
Date: October 2012
Creator: Skinner, Katherine; Schultz, Matt; Halbert, Martin & Phillips, Mark Edward
Partner: UNT Libraries