Advanced Camera Technologies and Artificial Intelligence to Improve Marine Resource Surveys

Benjamin Richards1, Anthony Hoogs2, Matthew David Dawkins3, Jeremy Taylor4, Steven G Smith5, Jerald S Ault5 and Michael P Seki6, (1)NOAA, Honolulu, HI, United States, (2)Kitware, Clifton Park, NY, United States, (3)Kitware, Saratoga Springs, NY, United States, (4)Joint Institute for Marine and Atmospheric Research, Honolulu, HI, United States, (5)University of Miami, Rosenstiel School of Marine and Atmospheric Science, Miami, FL, United States, (6)US Dept Commerce/NOAA/NMFS, Honolulu, HI, United States
Abstract:
Marine scientists are increasingly pairing advanced camera technologies with artificial intelligence to improve fishery-independent surveys. In the US, the NOAA Pacific Islands Fisheries Science Center began developing a multi-gear survey for the Hawaii “Deep7” bottomfish stock in 2011. The survey became operational in 2016, incorporating cooperative research fishing and a new modular stereo-camera system. Although the cameras provide accurate and precise species-specific, size-structured abundance estimates for the “Deep7” stock assessment, a team of video analysts needs more than three months to process the nearly 9 million images produced per survey. To accelerate this image-analysis pipeline, the NOAA Fisheries Office of Science and Technology began developing machine learning capabilities in 2015 to automate image annotation. In early 2019, the Automated Image Analysis Strategic Initiative team released two open-source software products, CoralNet and VIAME, which automate the annotation of benthic photoquadrat imagery and of in situ images and video, respectively. Both tools can greatly reduce the time and human effort required to convert survey images into actionable data.
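The abstract does not detail how the stereo cameras yield size-structured estimates, but the underlying step is standard stereo photogrammetry: matched annotations of a fish's snout and tail in a rectified image pair are triangulated into 3-D points, and the distance between them gives length. The Python sketch below illustrates that geometry only; the calibration values, pixel coordinates, and the triangulate helper are hypothetical and do not represent the survey's actual software or the VIAME API.

```python
import numpy as np

def triangulate(pt_left, pt_right, focal_px, baseline_m, cx, cy):
    """Back-project a matched point from a rectified stereo pair
    into 3-D left-camera coordinates (metres).

    Hypothetical helper: assumes a calibrated, rectified rig with
    focal length focal_px (pixels), baseline baseline_m (metres),
    and principal point (cx, cy)."""
    disparity = pt_left[0] - pt_right[0]       # horizontal pixel offset
    z = focal_px * baseline_m / disparity      # depth from disparity
    x = (pt_left[0] - cx) * z / focal_px
    y = (pt_left[1] - cy) * z / focal_px
    return np.array([x, y, z])

# Invented calibration values and snout/tail annotations for one fish.
focal_px, baseline_m, cx, cy = 1400.0, 0.30, 960.0, 540.0
snout = triangulate((900, 540), (848, 540), focal_px, baseline_m, cx, cy)
tail = triangulate((980, 545), (929, 545), focal_px, baseline_m, cx, cy)

# Euclidean distance between the two 3-D points approximates fork length.
fork_length_m = np.linalg.norm(snout - tail)
print(f"Estimated fork length: {fork_length_m:.2f} m")  # ~0.49 m here
```

Repeating this measurement across every detected fish is what makes the survey's length-composition data possible, and it is exactly this per-fish annotation step that automated tools such as VIAME aim to take over from human analysts.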