The Conversation
Chris Oosthuizen, Research Fellow at the Centre for Statistics in Ecology, the Environment and Conservation (SEEC), University of Cape Town

Diving with penguins: tech gives ocean scientists a bird’s-eye view of foraging in Antarctic waters

Chinstrap penguins are members of Antarctica’s brush-tailed group of penguins. They’re easily identified by the feature that gives them their name – a black strap that runs from ear to ear below the chin. The species is found mostly in the Western Antarctic Peninsula region, on far-flung isles such as the South Shetland Islands, the South Sandwich Islands and the South Orkney Islands.

Chinstrap penguins are highly specialised predators, feeding on marine crustaceans called Antarctic krill. The birds are still very abundant (estimates suggest there are between 3 million and 4 million breeding pairs). But many of their colonies are unfortunately experiencing population declines. The trend may be linked to krill becoming less available because of climate change, increasing populations of other marine predators (like baleen whales, which also eat krill) and commercial krill fishing.

So, it’s important to understand how much krill chinstrap penguins and other marine predators are consuming. This can help scientists to predict future population trends and inform conservation and ecosystem management strategies.

It’s challenging to directly observe how penguins catch their prey underwater in remote ocean habitats. However, innovations in ever more powerful remote monitoring technology have rapidly advanced our understanding of their foraging behaviour over recent decades.


Read more: Tracking technology gives new insights into the behaviour of migrating birds


We are part of a team of researchers that recently published a study underpinned by just such a technological innovation. Using animal-borne video and movement sensor data to train machine learning algorithms, we were able to quantify how much krill chinstrap penguins catch. We used “deep learning”, a subset of machine learning, to detect the penguins’ feeding events. In our study, these algorithms not only performed classification tasks faster than human observers could, but also detected patterns in the data that were difficult to observe visually.

To date, estimates of krill consumption by penguins have typically been derived from bio-energetic models, which are based on physiological principles such as metabolic rates and how energy is assimilated from food. These estimates often can’t be empirically validated. An older method, stomach flushing, is highly invasive.

Animal-borne sensors provide continuous, high-resolution data on movement and behaviours, allowing large amounts of data to be recorded. But all that data needs to be analysed, which is not an easy task for humans. Machine learning algorithms can rapidly process these large datasets.

Decoding penguin foraging

We camped at the South Orkney Islands in January 2022 and January 2023 to collect data for this study. We used waterproof tape to attach miniature video cameras and tags with acceleration and pressure sensors to the backs of penguins breeding on the islands. Each penguin collected data for a single foraging trip at sea, usually lasting less than a day. We removed the loggers when the penguins returned to their nests to feed their chicks.


Read more: New discovery: penguins vocalise under water when they hunt


To attach and remove the devices, we caught the nesting penguins by hand, blindfolded them with a soft cloth hood and restrained them (in our hands) for a few minutes. The short handling times and the small size of the tags make any potential negative effects from this process unlikely.

A video captured by one of the chinstrap penguins as it dove for prey.

The video footage allowed us to visually confirm each instance where the penguins were catching krill. The other sensors measured the penguins’ diving depths and the dynamics of movement (acceleration in three axes – surge, sway and heave, which allows body pitch and roll to be identified – at 25 data points per second).
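The body pitch and roll mentioned above are typically estimated from the gravity (static) component of the tri-axial acceleration signal. The sketch below, in Python with NumPy, illustrates one common bio-logging convention for doing this; the function name, the 2-second smoothing window and the axis conventions are illustrative assumptions, not details taken from the study.

```python
import numpy as np

def static_angles(accel, fs=25, win_s=2):
    """Estimate body pitch and roll from the static (gravity) component
    of tri-axial acceleration.

    accel: (n, 3) array of surge, sway and heave acceleration, in g.
    fs: sampling rate in Hz (the study recorded 25 points per second).
    win_s: moving-average window (seconds) used to isolate gravity.
    """
    k = int(fs * win_s)
    kernel = np.ones(k) / k
    # moving average per axis removes the fast "dynamic" movement,
    # leaving the slowly varying gravity component
    static = np.column_stack(
        [np.convolve(accel[:, i], kernel, mode="same") for i in range(3)]
    )
    surge, sway, heave = static.T
    # one common convention: pitch from surge vs the sway/heave plane,
    # roll from sway vs heave
    pitch = np.degrees(np.arctan2(surge, np.hypot(sway, heave)))
    roll = np.degrees(np.arctan2(sway, heave))
    return pitch, roll

# toy example: a bird holding a steep head-down posture, so gravity
# loads the surge axis negatively
accel = np.tile([-0.7, 0.0, 0.714], (100, 1))
pitch, roll = static_angles(accel)
```

Because the smoothing scales all three axes identically for a constant posture, the angle estimates are unaffected by the filter edges in this toy case; with real dive data the window length trades off responsiveness against noise.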

Because the video, acceleration and depth data were time-synchronised, we could use the video observations to identify and label the snippets of acceleration and depth data that corresponded to prey captures. Machine learning models were then trained on the accelerometer data, using these labels to mark the instances where prey captures occurred.
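The labelling step described here — cutting the time-synchronised sensor streams into windows and marking each one according to whether a video-confirmed capture falls inside it — can be sketched as follows. This is a minimal Python/NumPy illustration, not the study's code: the 2-second window, the hand-crafted summary features and the function name are assumptions, and the study itself fed the data to deep learning models rather than simple features.

```python
import numpy as np

FS = 25        # sampling rate, Hz (25 data points per second, as in the study)
WIN = 2 * FS   # an assumed 2-second analysis window

def make_windows(accel, depth, capture_times):
    """Cut time-synchronised accelerometer/depth streams into fixed windows
    and label each window 1 if a video-confirmed prey capture falls inside it.

    accel: (n, 3) surge/sway/heave acceleration
    depth: (n,) dive depth
    capture_times: sample indices of video-confirmed krill captures
    """
    captures = np.zeros(len(depth), dtype=bool)
    captures[capture_times] = True
    X, y = [], []
    for start in range(0, len(depth) - WIN + 1, WIN):
        seg = slice(start, start + WIN)
        # simple per-window summary features for illustration only
        feats = np.concatenate([
            accel[seg].mean(axis=0),
            accel[seg].std(axis=0),
            [depth[seg].mean(), depth[seg].max() - depth[seg].min()],
        ])
        X.append(feats)
        y.append(int(captures[seg].any()))
    return np.array(X), np.array(y)

# toy example: one minute of synthetic data with two labelled captures
rng = np.random.default_rng(0)
accel = rng.normal(size=(60 * FS, 3))
depth = np.abs(rng.normal(20.0, 2.0, size=60 * FS))
X, y = make_windows(accel, depth, capture_times=[300, 900])
# X and y can now be fed to any supervised classifier
```

The labelled windows are what make supervised training possible: once a model has learned the association, the video channel is no longer needed at prediction time.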

The results showed that the machine learning models we trained with labelled data could identify prey capture events from new acceleration and depth data with high accuracy.

What’s exciting is that the trained model can now work without video data, identifying prey capture events from acceleration and depth measurements alone. In future, a single acceleration-and-depth bio-logging tag per bird can therefore provide information on prey captures in this species.

Obtaining foraging information without video is preferable for monitoring purposes, since video cameras only record a few hours before depleting their batteries, whereas acceleration and depth can be measured over many days.

Guiding conservation

Our hope is that the method we developed can be used to monitor temporal and spatial changes in how much chinstrap penguins eat, to help guide conservation and ecosystem management in the Southern Ocean around Antarctica.

This research was led by Stefan Schoombie and Chris Oosthuizen from the University of Cape Town in South Africa, in collaboration with experts based at the African Institute for Mathematical Sciences (Lorène Jeantet and Emmanuel Dufourq) and Nelson Mandela University (Pierre Pistorius) in South Africa, the Centre D’Études Biologiques de Chizé in France (Marianna Chimienti), La Trobe University in Australia (Grace Sutton) and the Norwegian Polar Institute in Norway (Andrew Lowther).


This study was funded by the Antarctic Wildlife Research Fund (project no. 18/2021 to Chris Oosthuizen) and the Research Council of Norway (project no. 310028 to Andrew Lowther).

Emmanuel Dufourq is funded by a grant from Carnegie Corporation of New York (provided through the AIMS Research and Innovation Centre).

Lorène Jeantet was funded by a grant from the International Development Research Centre, Ottawa, Canada, www.idrc.ca, and with financial support from the Government of Canada, provided through Global Affairs Canada (GAC), www.international.gc.ca. Both sources were provided through the African Institute for Mathematical Sciences.

Stefan Schoombie received funding from the National Institute for Theoretical and Computational Sciences (NITheCS).

Pierre Pistorius does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

This article was originally published on The Conversation. Read the original article.
