New open-source image database unlocks the power of AI for ocean exploration — ScienceDaily

A new collaboration between MBARI and other research institutions is harnessing the power of artificial intelligence and machine learning to accelerate ocean exploration efforts.

To manage the impacts of climate change and other threats, researchers urgently need to learn more about the ocean’s inhabitants, ecosystems and processes. As scientists and engineers develop advanced robotics that can visualize marine life and environments to monitor changes in ocean health, they face a fundamental problem: the collection of images, videos and other visual data far exceeds researchers’ capacity to analyze it.

FathomNet is an open-source image database that uses state-of-the-art data-processing algorithms to work through this backlog of visual data. Applying artificial intelligence and machine learning will remove the bottleneck in analyzing underwater imagery and accelerate important research into ocean health.

“A big ocean needs big data. Researchers collect large amounts of visual data to observe ocean life. How can we process all this information without automation? Machine learning offers a way forward, but these approaches rely on huge datasets for training. FathomNet was built to fill this gap,” said MBARI Chief Engineer Kakani Katija.

Project co-founders Katija, Katy Croff Bell (Ocean Discovery League) and Ben Woodward (CVISION AI), along with members of the extended FathomNet team, detailed the development of the new image database in a recent article in the journal Scientific Reports.

Recent advances in machine learning allow for rapid, sophisticated analysis of visual data, but the use of artificial intelligence in marine research has been limited by the lack of a standard set of existing images that could be used to train the machines to recognize and catalog underwater objects and life. FathomNet meets this need by aggregating imagery from multiple sources to create a publicly available, expert-curated underwater imagery training database.

“Over the past five years, machine learning has revolutionized the automated visual analysis landscape, driven largely by vast collections of labeled data. ImageNet and Microsoft COCO are benchmark datasets for terrestrial applications that machine learning and computer vision researchers are flocking to, but we haven’t even begun to scratch the surface of machine learning capabilities for underwater visual analysis,” said Ben Woodward, Co-Founder and CEO of CVISION AI and Co-Founder of FathomNet. “With FathomNet, we aim to provide a rich, interesting benchmark to engage the machine learning community in a new realm.”
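Benchmark datasets like Microsoft COCO store bounding-box localizations in a simple JSON layout that pairs images, annotations and category labels. As a minimal sketch of how such COCO-style records are read (the annotation below is invented for illustration and is not real FathomNet data):

```python
# Parse a tiny, invented COCO-style annotation record. Field names follow
# the standard COCO layout: "images", "annotations" and "categories".
coco = {
    "images": [{"id": 1, "file_name": "dive_frame_0101.png",
                "width": 1920, "height": 1080}],
    "annotations": [{"id": 7, "image_id": 1, "category_id": 3,
                     "bbox": [412.0, 230.5, 310.0, 198.0]}],  # [x, y, w, h]
    "categories": [{"id": 3, "name": "Nanomia bijuga"}],
}

def boxes_by_image(dataset):
    """Map each image file name to its list of (concept, bbox) localizations."""
    cats = {c["id"]: c["name"] for c in dataset["categories"]}
    imgs = {i["id"]: i["file_name"] for i in dataset["images"]}
    out = {}
    for ann in dataset["annotations"]:
        out.setdefault(imgs[ann["image_id"]], []).append(
            (cats[ann["category_id"]], ann["bbox"]))
    return out

print(boxes_by_image(coco))
```

The same three-table structure scales from this toy record to the millions of labeled localizations that object-detection models are trained on.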

Over the past 35 years, MBARI has recorded nearly 28,000 hours of deep-sea video and collected more than 1 million deep-sea images. This treasure trove of visual data was annotated in detail by research technicians in MBARI’s video lab. MBARI’s video archive includes approximately 8.2 million annotations recording observations of animals, habitats and objects. This rich dataset is an invaluable resource for researchers at the institute and collaborators around the world.

FathomNet includes a subset of the MBARI dataset, as well as assets from National Geographic and NOAA.

The National Geographic Society’s Exploration Technology Lab has deployed versions of its autonomous benthic lander platform, the Deep Sea Camera System, since 2010, collecting more than 1,000 hours of video data from locations in all ocean basins and across a variety of marine habitats. These videos were then ingested into CVISION AI’s cloud-based collaborative analytics platform and annotated by subject-matter experts from the University of Hawaii and OceansTurn.

NOAA Ocean Exploration began collecting video data with a dual remotely operated vehicle system aboard the NOAA Ship Okeanos Explorer in 2010. More than 271 terabytes of this footage are archived by the NOAA National Centers for Environmental Information (NCEI) and publicly available. NOAA Ocean Exploration originally collected annotations from volunteer participating scientists and in 2015 began supporting senior taxonomists to annotate the collected videos more thoroughly.

“FathomNet is a great example of how collaboration and community science can drive breakthroughs in the way we learn about the ocean. With data from MBARI and the other collaborators as the backbone, we hope that FathomNet can help accelerate ocean research at a time when understanding the ocean is more important than ever,” said Lonny Lundsten, Senior Research Technician at MBARI, co-author and member of the FathomNet team.

Because FathomNet is an open-source, web-based resource, other institutions can contribute to and use it instead of mounting traditional, resource-intensive efforts to process and analyze visual data. MBARI launched a pilot program to use machine learning models trained on FathomNet data to annotate video captured by remotely operated vehicles (ROVs). Using AI algorithms reduced human effort by 81 percent and increased the labeling rate tenfold.

Machine learning models trained on FathomNet data also have the potential to revolutionize ocean exploration and monitoring. For example, equipping robotic vehicles with cameras and improved machine learning algorithms may eventually enable automated search and tracking of marine life and other underwater objects.

“Four years ago we envisioned using machine learning to analyze thousands of hours of ocean video, but at the time this was not possible, largely due to the lack of annotated images. FathomNet fills that gap, enabling tools that researchers, scientists and the public can use to accelerate the pace of ocean discovery,” said Katy Croff Bell, Founder and President of the Ocean Discovery League and co-founder of FathomNet.

As of September 2022, FathomNet contained 84,454 images representing 175,875 localizations from 81 separate collections for 2,243 concepts, with more contributions ongoing. FathomNet aims to gather 1,000 independent observations for each of more than 200,000 animal species in various poses and imaging conditions, eventually totaling more than 200 million observations. Achieving these goals will require significant community engagement, including quality contributions from a wide range of groups and individuals, and widespread use of the database.

“While FathomNet is a web-based platform built on top of an API where people can download labeled data to train novel algorithms, we want it to also serve as a community where marine explorers and enthusiasts from diverse backgrounds can bring their knowledge and expertise to help solve challenges related to visual ocean data that would be impossible without broad engagement,” said Katija.
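As a sketch of what API-driven access to labeled data could look like, the snippet below builds a query URL and parses a JSON response. The endpoint path and the response fields are assumptions made for illustration; they are not the documented FathomNet API, only the base site URL is real.

```python
import json
from urllib.parse import quote, urljoin

BASE = "https://fathomnet.org/api/"  # real site; the route below is hypothetical

def image_query_url(concept):
    """Build a URL to query images labeled with a concept (hypothetical route)."""
    return urljoin(BASE, "images/query/concept/" + quote(concept))

def extract_image_urls(response_text):
    """Pull image URLs out of a JSON list of records with a 'url' field
    (an assumed response shape, for illustration only)."""
    return [rec["url"] for rec in json.loads(response_text)]

# Example response body, invented for illustration:
sample = '[{"url": "https://example.org/img/123.png", "concept": "Aurelia aurita"}]'
print(image_query_url("Aurelia aurita"))
print(extract_image_urls(sample))
```

In practice a client library or documented REST routes would replace the hand-built URL, but the pattern of query-then-download is how labeled data typically reaches a model-training pipeline.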

FathomNet: fathomnet.org

Seed funding for FathomNet was provided by the National Geographic Society (#518018), the National Oceanic and Atmospheric Administration (NA18OAR4170105), and MBARI with generous support from the David and Lucile Packard Foundation. Additional financial support was provided by the National Geographic Society (NGS-86951T-21) and the National Science Foundation (OTIC #1812535 & Convergence Accelerator #2137977).