Quality Assessment and Enhancement of Crowdsourced Video Annotations

Caitlin Ruby, Cooperative Institute for Research in Environmental Sciences (CIRES), University of Colorado / NOAA NCEI Affiliate, Boulder, CO, United States; Megan Cromwell, NOAA, Stennis Space Center, MS, United States; Susan Talley Gottfried, Mississippi State University, Northern Gulf Institute, NOAA National Centers for Environmental Information (NCEI) Affiliate, Stennis Space Center, MS, United States; Mashkoor Malik, NOAA Office of Ocean Exploration and Research, Silver Spring, MD, United States
Abstract:
Textual observations of underwater video, more commonly referred to as video annotations, are essential to expanding current knowledge of complex marine ecosystems, seafloor processes, and exploitable or vulnerable resources. Although extracting knowledge from video is laborious, the knowledge gained provides insight unattainable from other data sources alone. It is therefore vital that video annotations support optimal discoverability and use of the data for generations to come. NOAA's Office of Ocean Exploration and Research (OER) and NOAA's National Centers for Environmental Information (NCEI) partnered with Ocean Networks Canada (ONC) to employ ONC's online video annotation interface, SeaTube, to improve the accessibility of video collected by ROV Deep Discoverer aboard NOAA Ship Okeanos Explorer. SeaTube allows registered users (e.g., scientists and researchers) to annotate underwater video both in real time and retroactively, using a combination of suggested textual tags and free-text descriptors. The resulting video annotations have been strengthened through crowdsourcing and the integration of WoRMS (World Register of Marine Species) and CMECS (Coastal and Marine Ecological Classification Standard) terminology. Expanding the annotation vocabulary enriches the data by allowing users to search the video using relatable identifiers (e.g., Mollusca vs. mollusk). Analysis of available annotation records reveals that video annotations can vary greatly in completeness and accuracy. The goals of this work are to establish measures of, and analyze, the completeness and accuracy of current annotation methods.
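
As a rough illustration of how an expanded vocabulary can make annotations searchable by relatable identifiers, the sketch below maps common names onto WoRMS-style scientific names before matching. This is a minimal sketch under assumed inputs: the SYNONYMS table, the record layout, and the search_annotations function are hypothetical illustrations, not SeaTube's actual schema or API.

```python
# Minimal sketch of vocabulary-expanded annotation search.
# The SYNONYMS table and record layout are hypothetical illustrations.

# Map common names to the scientific (WoRMS-style) taxon they resolve to.
SYNONYMS = {
    "mollusk": "Mollusca",
    "mollusc": "Mollusca",
    "sea star": "Asteroidea",
}

def expand_query(term: str) -> set[str]:
    """Return the query term plus any scientific-name equivalents."""
    term = term.strip().lower()
    expanded = {term}
    if term in SYNONYMS:
        expanded.add(SYNONYMS[term].lower())
    return expanded

def search_annotations(records: list[dict], term: str) -> list[dict]:
    """Match records whose taxon or free text contains any expanded form."""
    terms = expand_query(term)
    hits = []
    for rec in records:
        haystack = f"{rec.get('taxon', '')} {rec.get('comment', '')}".lower()
        if any(t in haystack for t in terms):
            hits.append(rec)
    return hits

records = [
    {"taxon": "Mollusca", "comment": "unidentified gastropod on ledge"},
    {"taxon": "", "comment": "possible mollusk shell fragment"},
]
print(len(search_annotations(records, "mollusk")))  # both records match -> 2
```

A query for "mollusk" thus also retrieves records tagged only with the taxon Mollusca, which is the enrichment the abstract describes.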
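
The completeness analysis mentioned above could be approached along the lines of the following sketch, which scores each annotation record by the fraction of required fields it fills and reports per-field fill rates. The REQUIRED_FIELDS list and field names are assumptions for illustration, not NCEI's actual archive schema.

```python
# Hypothetical completeness check for annotation records.
# REQUIRED_FIELDS and the field names are illustrative assumptions.
from collections import Counter

REQUIRED_FIELDS = ["timestamp", "taxon", "annotator", "dive_id"]

def completeness(record: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f))
    return filled / len(REQUIRED_FIELDS)

def summarize(records: list[dict]) -> dict:
    """Per-field fill rates plus the mean record completeness."""
    fill = Counter()
    for rec in records:
        for f in REQUIRED_FIELDS:
            if rec.get(f):
                fill[f] += 1
    n = len(records) or 1  # guard against an empty record set
    return {
        "mean_completeness": sum(completeness(r) for r in records) / n,
        "field_fill_rates": {f: fill[f] / n for f in REQUIRED_FIELDS},
    }
```

Aggregating such scores across dives would give a first-order view of how completeness varies among annotation methods and annotators.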