“Automatic detection and counting of fish saves us a lot of time and tedious manual work,” says researcher Vaneeda Allken at the Institute of Marine Research (IMR).
With the help of a camera system that photographs the trawl catch during fishing, scientists can get a lot of information they would not otherwise have. Among other things, they can see where in the water column, or at which point along the trawl's path, the various fish species enter the trawl.
But if a human being were to go through all the pictures and count all the fish manually, it would require an enormous amount of work.
Allken and her colleagues have now tried to solve this challenge with the help of so-called deep learning.
Deep learning means that a computer “learns” to recognize patterns in large amounts of data through trial and error. This is a central method in artificial intelligence.
The researchers used a set of thousands of trawl images, both real and artificially composed, to train the artificial intelligence.
The task was to identify the fish in the pictures as either blue whiting, herring, mackerel, a mesopelagic fish or a mixture of different species. (As of today, the system is unable to distinguish between the mesopelagic species Mueller’s pearlside and spotted lanternfish.)
After the “training session”, the machine was tested on a new and unknown image set from a trawl catch. It managed to determine the species of the catch with a precision of about 85 percent.
The machine was also able to estimate how many fish were in the catch in total.
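In outline, evaluating such a classifier on a held-out image set comes down to comparing the predicted species label for each image with the true one, and tallying the predictions to estimate the catch composition. A minimal sketch in Python, where the labels are invented for illustration and the predictions stand in for the output of the trained network (none of the numbers below are from the study):

```python
from collections import Counter

# The five catch categories used in the study.
SPECIES = ["blue whiting", "herring", "mackerel", "mesopelagic", "mixture"]

def evaluate(true_labels, predicted_labels):
    """Return overall accuracy on the held-out set, plus a
    per-species count estimated from the predictions."""
    correct = sum(t == p for t, p in zip(true_labels, predicted_labels))
    accuracy = correct / len(true_labels)
    counts = Counter(predicted_labels)  # estimated catch composition
    return accuracy, counts

# Made-up example: eight images from a held-out trawl catch.
truth = ["herring", "herring", "mackerel", "blue whiting",
         "herring", "mesopelagic", "mackerel", "herring"]
preds = ["herring", "herring", "mackerel", "blue whiting",
         "mackerel", "mesopelagic", "mackerel", "herring"]

acc, catch = evaluate(truth, preds)
print(acc)                # 7 of 8 correct -> 0.875
print(catch["mackerel"])  # 3 images predicted as mackerel
```

The same tally that scores the classifier also yields the count estimate: summing predictions per species gives the machine's estimate of how many fish of each kind were in the catch.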
The results, recently published in the ICES Journal of Marine Science, are a step in the development of new and effective methods in marine research and fisheries management based on digital technology.
Images from the trawl, automatically interpreted by artificial intelligence, could reduce the need for physical samples from the ecosystem.
“This makes it possible to envision a future where we could use open-ended trawls and extract the information we need from images rather than by catching fish,” says Allken.
For both researchers and fishermen, technology can make it easier to sort fish of unwanted species or size out of the catch.
“This would lead to less by-catch, which is good for the ecosystem,” says Allken.
Researcher Shale Pettit Rosen, who also contributed to the study, highlights that the research is part of a larger effort at the innovation center CRIMAC, which aims to improve and automate the interpretation of data from echo sounders (acoustics) on research vessels and fishing boats.
“Here, catch data is needed to classify what you see with tomorrow’s broadband echo sounders,” Rosen says.
“Such echo sounders can be used to conduct more sustainable fishing by targeting and catching the right species, and to improve fisheries management through a more complete understanding of the marine ecosystem.”
Although artificial intelligence relieves marine scientists of some manual labor, cameras and computers cannot replace rubber gloves and oilskins.
“Physical tests are still needed to get information about diet, growth and sex for most fish species,” Rosen says.
The researcher also points out that artificial intelligence has its challenges, including encounters with rare species and other unexpected issues that appear in the trawl.
“Artificially intelligent systems are often trained with the species composition that you expect to get in the catch. But unexpected objects, from alien fish species to marine waste, can be important to identify correctly in order to get a better overview of the ecosystem and changes in the environment,” says Rosen.
“We will still need humans at different stages to make sure the algorithms are working properly and the predictions are not off, especially when deployed in a new context,” says colleague Vaneeda Allken.
Allken, Vaneeda, Shale Rosen, Nils Olav Handegard and Ketil Malde. “A deep learning-based method to identify and count pelagic and mesopelagic fishes from trawl camera images.” ICES Journal of Marine Science 78, no. 10 (2021). https://doi.org/10.1093/icesjms/fsab227