Navigating a Sea of Big Data

Monday, 15 December 2014
Danie Kinkade, Cynthia L Chandler, Robert C Groman, Adam Shepherd, Molly D Allison, Shannon Rauch, Peter H Wiebe and David M Glover, Woods Hole Oceanographic Institution, Woods Hole, MA, United States
Oceanographic research is evolving rapidly. New technologies, strategies, and related infrastructures have catalyzed a change in the nature of oceanographic data: heterogeneous and complex data types can now be produced and transferred at high speed. This shift in the volume, variety, and velocity of the data produced has made managing such Big Data increasingly challenging. In addition, distributed research communities have greater needs for data quality control, discovery and public accessibility, and seamless integration for interdisciplinary study. Organizations charged with curating oceanographic data must also evolve to meet these needs and challenges by employing new technologies and strategies.

The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in 2006 to fulfill the data management needs of investigators funded by the NSF Ocean Sciences Biological and Chemical Sections and the Polar Programs Antarctic Organisms and Ecosystems Program. Since its inception, the Office has had to modify its internal systems and operations to address Big Data challenges and meet the needs of an ever-evolving oceanographic research community. Enhancements include automated procedures that replace labor-intensive manual tasks, the adoption of metadata standards that facilitate machine-client access, a geospatial interface, and the use of Semantic Web technologies to increase data discovery and interoperability. This presentation will highlight some of the BCO-DMO advances that enable us to successfully fulfill our mission in a Big Data world.
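To make the idea of standards-based metadata for machine-client access concrete, the sketch below shows the general shape of a schema.org "Dataset" record serialized as JSON-LD, a common Semantic Web pattern for dataset discovery. This is a hypothetical illustration only: the dataset name, bounding box, and variables are invented and do not describe an actual BCO-DMO record or its schema.

```python
import json

# Hypothetical schema.org "Dataset" description in JSON-LD. A record like
# this lets a machine client (e.g., a search-engine crawler or data
# aggregator) discover and interpret a dataset without scraping a
# human-readable landing page. All field values below are illustrative.
record = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "Example CTD profiles",  # invented dataset name
    "description": "Illustrative CTD cast data from a research cruise.",
    "spatialCoverage": {
        "@type": "Place",
        "geo": {
            # GeoShape box: "south west north east" (illustrative coords)
            "@type": "GeoShape",
            "box": "39.0 -71.0 42.0 -69.0",
        },
    },
    "variableMeasured": ["temperature", "salinity", "depth"],
}

# Serialize and re-parse, as a client consuming the record would.
serialized = json.dumps(record)
parsed = json.loads(serialized)
print(parsed["@type"], parsed["variableMeasured"])
```

Because the vocabulary (`@type`, `variableMeasured`, `spatialCoverage`) is shared across repositories, the same client code can interpret records from many data providers, which is what enables the interoperability the abstract describes.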