3 Matching Results


Atmospheric Radiation Measurement Program Climate Research Facility Operations Quarterly Report October 1–December 31, 2009

Description: Individual raw datastreams from instrumentation at the Atmospheric Radiation Measurement (ARM) Climate Research Facility fixed and mobile sites are collected and sent to the Data Management Facility (DMF) at Pacific Northwest National Laboratory (PNNL) for processing in near real-time. Raw and processed data are then sent approximately daily to the ARM Data Archive, where they are made available to users. For each instrument, we calculate the ratio of the actual number of data records received daily at the Archive to the expected number of data records. The results are tabulated by (1) individual datastream, site, and month for the current year and (2) site and fiscal year (FY) dating back to 1998.
Date: January 15, 2010
Creator: Sisterson, D. L.
Partner: UNT Libraries Government Documents Department
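The completeness metric described above — the ratio of actual to expected daily data records, tabulated per site — can be sketched in a few lines of Python. This is an illustrative sketch only; the site names, datastream names, and record counts below are hypothetical placeholders, not values from the ARM report.

```python
from collections import defaultdict

# Hypothetical daily counts per (site, datastream): (actual, expected).
# All names and numbers are illustrative, not taken from the ARM report.
records = [
    ("SGP", "sirs",  1440, 1440),
    ("SGP", "mfrsr", 1380, 1440),
    ("NSA", "sirs",  1440, 1440),
]

def completeness(rows):
    """Ratio of actual to expected records, aggregated per site."""
    actual = defaultdict(int)
    expected = defaultdict(int)
    for site, stream, act, exp in rows:
        actual[site] += act
        expected[site] += exp
    return {site: actual[site] / expected[site] for site in expected}

ratios = completeness(records)
```

The same aggregation, grouped instead by (datastream, month) or by fiscal year, yields the report's other tabulations.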

A new paradigm for geosciences information management

Description: Over the past two decades, geoscientists have been increasingly engaged in providing answers to complex environmental problems with significant societal, political, and economic consequences. Today, these scientists must perform under ever greater visibility to stakeholders and the general public. Their activities are far more scrutinized with regard to economic pressure, litigation support, and regulatory compliance than in the past. Their current work is built on decades of past work and in many cases will continue for decades to come. Stakeholders are increasingly evaluating raw data rather than just examining summaries in final reports, and they need assurance that proper data control and data quality procedures were followed. Geoscientists are thus faced with a new paradigm: the challenge of cost-effectively collecting, managing, analyzing, and synthesizing enormous volumes of multidisciplinary and complex information. In addition, these data must be processed and disseminated in a way that allows the public to make informed and rational assessments of decisions that are proposed or have been made. The new paradigm is clear: client and stakeholder needs must be better met, and the systems used to store and generate data must meet those needs. This paper addresses the challenges and implications of this new paradigm for geosciences information management in the 21st century. It concludes with a case study of a successful implementation of the new paradigm in an environmental restoration project at Los Alamos National Laboratory (LANL), which is operated by the Department of Energy (DOE). LANL is upgrading and reengineering its data and business processes to better address client, user, and stakeholder issues regarding data accessibility, control, and quality.
Date: January 1, 2002
Creator: Bolivar, Stephen L.; Nasser, K. (Khalil); Dorries, A. M. (Alison M.) & Canepa, Julie Ann
Partner: UNT Libraries Government Documents Department

Scientific Data Management Center for Enabling Technologies

Description: Managing scientific data has been identified by the scientific community as one of the most important emerging needs because of the sheer volume and increasing complexity of data being collected. Effectively generating, managing, and analyzing this information requires a comprehensive, end-to-end approach to data management that encompasses all stages from initial data acquisition to final analysis of the data. Fortunately, the data management problems encountered by most scientific domains are common enough to be addressed through shared technology solutions. Based on community input, we have identified three significant requirements. First, more efficient access to storage systems is needed. In particular, parallel file system and I/O system improvements are needed to write and read large volumes of data without slowing a simulation, analysis, or visualization engine. These processes are complicated by the fact that scientific data are structured differently for specific application domains, and are stored in specialized file formats. Second, scientists require technologies to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis and searches over extremely large data sets. Specialized feature discovery and statistical analysis techniques are needed before the data can be understood or visualized. Furthermore, interactive analysis requires techniques for efficiently selecting subsets of the data. Finally, generating the data, collecting and storing the results, keeping track of data provenance, post-processing the data, and analyzing the results together form a tedious, fragmented process. Tools for automating this process in a robust, tractable, and recoverable fashion are required to enhance scientific exploration. The SDM center was established under the SciDAC program to address these issues.
The SciDAC-1 Scientific Data Management (SDM) Center succeeded in bringing an initial set of advanced data management technologies to DOE application scientists in astrophysics, climate, fusion, and biology. Equally important, it established collaborations ...
Date: January 15, 2013
Creator: Vouk, Mladen A.
Partner: UNT Libraries Government Documents Department
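The second requirement above — efficiently selecting subsets of extremely large data sets for interactive analysis — can be illustrated with a toy inverted index: build the index once, then answer selection queries without rescanning the full data. This is a generic sketch of the idea, not the SDM Center's actual software; all field names and values are hypothetical.

```python
from collections import defaultdict

# Hypothetical records standing in for a large scientific data set.
data = [
    {"id": 0, "temp": 15.2, "region": "arctic"},
    {"id": 1, "temp": 22.8, "region": "tropics"},
    {"id": 2, "temp": 14.9, "region": "arctic"},
]

# Build the index once; subsequent subset queries avoid a full scan.
index = defaultdict(list)
for row in data:
    index[row["region"]].append(row["id"])

arctic_ids = index["arctic"]
```

Production systems use far more compact structures (e.g. compressed bitmap indexes) for the same purpose, but the access pattern — precompute once, select interactively — is the same.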