Adaptation of a cubic smoothing spline algorithm for
multi-channel data stitching at the National Ignition Facility
Charles G. Brown Jr.*a, Aaron B. Adcocka,b, Stephen G. Azevedoa, Judith A. Liebmana, and
Essex J. Bonda
aLawrence Livermore National Laboratory, 7000 East Ave., Livermore, CA, USA 94550-9234;
bStanford University, 450 Serra Mall, Stanford, CA, USA 94305
ABSTRACT
Some diagnostics at the National Ignition Facility (NIF), including the Gamma Reaction History (GRH) diagnostic, require multiple channels of data to achieve the required dynamic range. These channels need to be stitched together into a single time series, and they may have non-uniform and redundant time samples. We chose to apply the popular cubic smoothing spline technique to our stitching problem because we needed a general non-parametric method. We adapted one of the algorithms in the literature, by Hutchinson and de Hoog, to our needs. The modified algorithm and the resulting code perform a cubic smoothing spline fit to multiple data channels with redundant time samples and missing data points. The data channels can have different, time-varying, zero-mean white noise characteristics. The method we employ automatically determines an optimal smoothing level by minimizing the Generalized Cross Validation (GCV) score. To validate the smoothing level selection automatically, the Weighted Sum-Squared Residual (WSSR) and zero-mean tests are performed on the residuals. Confidence intervals, both analytical and Monte Carlo, are also calculated. In this paper, we describe the derivation of our cubic smoothing spline algorithm. We outline the algorithm and test it with simulated and experimental data.
Keywords: Cubic spline, smoothing
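The pipeline the abstract describes can be sketched in Python using SciPy's GCV-based smoothing spline (scipy.interpolate.make_smoothing_spline, which selects the smoothing parameter by GCV when lam is omitted) as a stand-in for the adapted Hutchinson/de Hoog algorithm. Note one difference: SciPy requires strictly increasing abscissae, so redundant time samples are collapsed by inverse-variance weighted averaging here, whereas the paper's adaptation handles them directly. The channel layouts, noise levels, and test thresholds below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

rng = np.random.default_rng(0)

def true_signal(t):
    # Stand-in "true" waveform: a Gaussian pulse peaking at t = 10 ns.
    return np.exp(-0.5 * ((t - 10.0) / 2.0) ** 2)

# Two hypothetical channels observing the same event with different,
# zero-mean white noise levels; their time bases partially overlap,
# producing redundant samples.
t1 = np.linspace(0.0, 20.0, 200)   # channel 1 time base (ns)
t2 = np.linspace(5.0, 20.0, 120)   # channel 2 overlaps channel 1
sigma1, sigma2 = 0.02, 0.05        # assumed per-channel noise levels
y1 = true_signal(t1) + rng.normal(0.0, sigma1, t1.size)
y2 = true_signal(t2) + rng.normal(0.0, sigma2, t2.size)

# Merge channels into one series with inverse-variance weights.
t = np.concatenate([t1, t2])
y = np.concatenate([y1, y2])
w = np.concatenate([np.full(t1.size, sigma1 ** -2),
                    np.full(t2.size, sigma2 ** -2)])
order = np.argsort(t)
t, y, w = t[order], y[order], w[order]

# Collapse exactly repeated time samples by weighted averaging, since
# make_smoothing_spline needs strictly increasing abscissae.
tu, inv = np.unique(t, return_inverse=True)
wsum = np.bincount(inv, weights=w)
yavg = np.bincount(inv, weights=w * y) / wsum

# Fit; with lam omitted, SciPy picks the smoothing level by GCV.
spline = make_smoothing_spline(tu, yavg, w=wsum)

# Residual checks in the spirit of the paper's validation step:
# the weighted mean residual should be near zero, and for correctly
# specified noise levels WSSR/n should be near 1.
resid = y - spline(t)
zr = np.average(resid, weights=w)
wssr = np.sum(w * resid ** 2)

# Monte Carlo confidence band via a parametric bootstrap: refit on
# synthetic noise realizations around the fitted curve.
tt = np.linspace(0.0, 20.0, 400)
fits = [make_smoothing_spline(
            tu, spline(tu) + rng.normal(0.0, 1.0 / np.sqrt(wsum)),
            w=wsum)(tt)
        for _ in range(100)]
lo, hi = np.percentile(fits, [2.5, 97.5], axis=0)
```

This sketch captures the structure of the method (weighted fit over merged redundant samples, GCV smoothing selection, residual tests, Monte Carlo intervals) but not the paper's specific algorithmic adaptation, which also accommodates missing data and time-varying noise within the spline solver itself.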
1. INTRODUCTION
The National Ignition Facility (NIF) is a 192-beam pulsed laser system completed in May 2009 at the Lawrence Livermore National Laboratory (LLNL) and now producing experimental results for the study of inertial confinement fusion and the physics of extreme energy densities and pressures [1]. The initial goals of NIF include demonstration of thermonuclear burn (ignition) of deuterium and tritium (D-T) fuel in a laboratory setting. One of the ways to measure the energy yield over very short time scales (20 ns) and large dynamic range is with carefully timed measurements of gamma rays emitted by the imploding D-T target [2]. The Gamma Reaction History (GRH) diagnostic has been successfully deployed for this purpose at the OMEGA laser [3] and is now operational at NIF, producing many time waveforms for each imploded target shot. To obtain the required orders of magnitude of dynamic range for GRH, experimenters record multiplexed channels of the same event, each with different voltage offsets, attenuation, and even time scales. The absolute noise levels or uncertainties differ for each channel, and some channels may have saturated regions that cannot be used. The goal of this paper is to describe a method that combines all data from the multiplexed noisy waveforms into a single composite, or "stitched", signal and accurately estimates the "true" signal with error bars.
A "stitching" method for NIF data must be automatic, robust, and yield high-quality results. One approach to spline smoothing with repeated time samples is addressed in [4], which models the true signal as a stochastic process and solves for its estimate using a Kalman filter approach. However, based on past work by other researchers on OMEGA GRH data and ease of implementation, we chose to adapt the cubic smoothing spline technique in [5] to our needs. The modified algorithm and the resulting code perform a cubic smoothing spline fit to multiple data channels with redundant time samples and missing data points. The data channels can have different, time-varying, zero-mean white noise characteristics. The method we employ automatically
*brown207@llnl.gov
Brown, C., Adcock, A., Azevedo, S., Liebman, J., and Bond, E. Adaptation of a cubic smoothing spline algorithm for multi-channel data stitching at the National Ignition Facility, article, December 28, 2010; Livermore, California. University of North Texas Libraries, UNT Digital Library (https://digital.library.unt.edu/ark:/67531/metadc829500/m1/3/), crediting UNT Libraries Government Documents Department.