Could We Use Facial Recognition for Whales?
Have you ever been asked if you would like to tag someone in a photo on Facebook? That suggestion comes from ingenious artificial intelligence that detects faces in your photos. Could it be used to identify whales?
The facial recognition process uses convolutional neural networks (CNNs): a type of computational model, inspired by the human brain, that consists of several layers that filter inputs to extract the desired information. Similar neural networks form the basis of Apple's Siri, Amazon's Alexa and other voice-recognizing virtual assistants as well.
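At its core, a CNN layer slides a small filter over an image and responds strongly wherever the filter's pattern appears. The toy Python sketch below is an illustration of that basic convolution operation only, not an actual CNN; the filter shown is a hypothetical vertical-edge detector, not one learned from whale photos:

```python
# Toy 2D convolution: slide a small kernel over an image (no padding)
# and sum the element-wise products at each position. CNN layers apply
# many such filters, with learned weights, to pick out features.

def convolve2d(image, kernel):
    """Return the valid (no-padding) convolution of `image` with `kernel`."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]
```

Applied to a tiny image whose left half is dark (0) and right half is bright (1), a `[[1, -1], [1, -1]]` kernel responds strongly exactly at the boundary between the two halves. Stacks of learned filters like this are what let a CNN pick out edges, textures, and eventually patterns such as callosities.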
North Atlantic right whales are an endangered species, and researchers use photo-identification to study them. Photo-identification involves matching the patterns of callosities on the whales' heads. Identifying individual whales can be a tedious and time-consuming process, creating backlogs for conservation organizations with thousands of photos to match. But there must be a more efficient way…
What if you could find a match in real-time?
Deepsense.ai is a Poland-based data science company that has developed a fully automated solution for lightning-fast photo-identification of right whales. Once launched, users will be able to simply upload their photos and receive results within tens of seconds with over 87% accuracy, the highest rate of any available tool. The rate increases to 95% if the algorithm is allowed to propose its top five matches instead of just one.
Deepsense’s solution acts as a facial recognition tool for right whales. It does so by using three different CNNs:
- CNN 1: Identifies the region of interest (here, the right whale's rostrum) in the photo and draws a "bounding box" around it. It does so by analyzing each pixel of the image and finding the edges of the region of interest.
- CNN 2: Creates a standardized "passport photo" by cropping and rotating the original so that only the area from the whale's bonnet to its blowhole remains, then identifies the callosities on the whale's head.
- CNN 3: Based on the pattern of callosities, finds a match in its catalogue of known individuals and outputs the result almost instantly.
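The three stages above can be sketched as a simple pipeline. The code below is a conceptual illustration only: the function names are made up, toy binary grids stand in for real images, and each stage is a hand-written stand-in for what is, in Deepsense's actual system, a trained CNN:

```python
# Conceptual three-stage pipeline (not Deepsense's code): locate the whale,
# normalize to a "passport photo", then match the callosity pattern against
# a catalogue of known individuals.

def locate_whale(image):
    """Stage 1 stand-in: find the bounding box of non-zero pixels."""
    rows = [i for i, row in enumerate(image) if any(row)]
    cols = [j for j in range(len(image[0])) if any(row[j] for row in image)]
    return min(rows), min(cols), max(rows), max(cols)

def passport_photo(image):
    """Stage 2 stand-in: crop to the bounding box (the real CNN also rotates)."""
    r0, c0, r1, c1 = locate_whale(image)
    return [row[c0:c1 + 1] for row in image[r0:r1 + 1]]

def top_matches(pattern, catalogue, k=5):
    """Stage 3 stand-in: rank catalogued whales by pattern similarity."""
    def similarity(a, b):
        flat_a = [px for row in a for px in row]
        flat_b = [px for row in b for px in row]
        return sum(x == y for x, y in zip(flat_a, flat_b)) / len(flat_a)
    ranked = sorted(catalogue.items(), key=lambda kv: -similarity(pattern, kv[1]))
    return [name for name, _ in ranked[:k]]
```

Returning the top five candidates instead of a single best match is what lifts the reported accuracy from 87% to 95%: a human expert then only needs to confirm which of a handful of suggestions is correct.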
This unique collaboration between data scientists and North Atlantic right whale researchers began after Deepsense won a Kaggle challenge hosted by the National Oceanic and Atmospheric Administration (NOAA) in August 2015. Since then, the team has been working on turning the algorithm into user-friendly software.
When Deepsense started working on the initial project, the team faced several challenges posed by aerial photos of right whales: varying photo quality, glare, whitecaps, overexposure and underexposure, and white water around the whale. Fieldwork conditions are rarely ideal, and researchers needed a solution that could accommodate photos from a range of situations. Despite these challenges, accurately mapping the callosity pattern from aerial photos is crucial for reliable photo-identification.
Deep learning is at the core of Deepsense's winning solution: after initial manual inputs from the team, the networks learned the steps and became fully automated. As a result, the versatile solution can compensate for variations in the colour and texture of aerial photos while still delivering results just as quickly.
Why is it better than conventional methods?
Other platforms, such as D.I.G.I.T.S. and Flukebook, require manual input at various steps. D.I.G.I.T.S. is a search-based platform where users can filter photos from the Anderson Cabot Center for Ocean Life's catalogue by attributes such as scars and sex. Flukebook requires users to submit photos, which must be approved by a local researcher before being processed by its software, resulting in delays.
With only 411 right whales remaining and conservation efforts at an all-time high, Deepsense's solution will allow scientists to speed up photo-identification, often the foundation of projects on wild populations. It will increase efficiency not only in photo-identification analysis but also in cross-matching individual whales between existing catalogues. This will be especially useful for migratory species such as the North Atlantic right whale. With future advancements, the system could allow matching of individuals across aerial photos and videos as well as images from submersibles or camera traps. The possibilities are endless.
Launching Next Year
In an email, Christin B. Khan, a fishery biologist at NOAA, revealed that the software could launch as early as next summer as part of the existing Flukebook platform, in collaboration with Wild Me. The team would also like to extend the software to the closely related southern right whale by making it available to researchers in Australia, South Africa, and other countries with right whale conservation efforts.
Bogucki, R., Cygan, M., Khan, C.B., Klimek, M., Milczek, J.K., & Mucha, M. (2018). Applying deep learning to right whale photo identification. Conservation Biology, 00(00): 1–9.
After spending the summer with whales off the west coast of Canada, Jasspreet Sahib is delighted to join the GREMM team as a Writing Intern this fall, through the Canadian Conservation Corps program. With a background in Marine Biology and Journalism Studies from Dalhousie University, she loves to share her passion for marine mammals and science communication with the readers of Whales Online.