S41B-4465:
The Use of Signal Dimensionality for Automatic QC of Seismic Array Data

Thursday, 18 December 2014
Charlotte A Rowe1, Richard J Stead1, Michael L Begnaud2, Deyan Draganov3, Monica Maceira2 and Martin Gomez4, (1)Los Alamos National Laboratory - LANL, Earth and Environmental Science, Los Alamos, NM, United States, (2)Los Alamos National Laboratory, Los Alamos, NM, United States, (3)Delft University of Technology, Delft, Netherlands, (4)Argentina National Atomic Energy Commission, Bariloche, Argentina
Abstract:
A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beam-forming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in identifying poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic across enormous numbers of nodes is impractical on a node-by-node basis; instead, the dimensionality of the aggregate node traffic is monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the subspace dimensionality, or principal components, of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms against an a priori collection of templates to detect highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the method used to address node-traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect to identify bad array elements. We show preliminary results for arrays in Kazakhstan (Makanchi) and Argentina (Malargüe).
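To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of one way such a dimensionality-based QC test could work: within a beam window, coherent signal across an array is approximately low-rank, so a channel whose energy falls largely outside a low-dimensional principal-component subspace is a candidate bad channel. The function name, the rank-one default, and the robust-outlier threshold are illustrative assumptions.

```python
import numpy as np

def flag_bad_channels(X, k=1, z_thresh=3.0):
    """Flag anomalous array channels via principal-component residuals.

    X        : (n_channels, n_samples) matrix of windowed waveforms.
    k        : assumed rank of the coherent array signal (1 for a single
               plane wave crossing the array; an illustrative default).
    z_thresh : robust z-score cutoff for declaring a channel anomalous.

    Returns the indices of channels whose residual energy outside the
    k-dimensional dominant subspace is an outlier among the channels.
    """
    # Remove each channel's mean, then decompose the channel matrix.
    Xc = X - X.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    # Reconstruct from the k strongest principal components only.
    recon = (U[:, :k] * s[:k]) @ Vt[:k]
    # Per-channel residual energy outside the coherent subspace.
    resid = np.linalg.norm(Xc - recon, axis=1)
    # Robust z-score using the median absolute deviation (MAD).
    med = np.median(resid)
    mad = np.median(np.abs(resid - med)) + 1e-12
    z = 0.6745 * (resid - med) / mad
    return np.where(z > z_thresh)[0]
```

A dead or noise-dominated channel contributes little to the shared signal subspace, so its residual stands out even when its raw amplitude looks plausible, which is what makes the test usable on-the-fly without per-channel inspection.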