The Conversation
Kim Munro, Lecturer, University of South Australia

A new exhibition explores invisible data, from facial algorithms to satellite tracking as a return to Country


Review: Invisibility, MOD museum, Adelaide

Disinformation, algorithms, big data, care work, climate change, cultural knowledge: they can all be invisible.

In her New York Times bestseller, Weapons of Math Destruction (2016), subtitled “how big data increases inequality and threatens democracy”, mathematician and data scientist Cathy O’Neil unpacks the elusive algorithms of our everyday lives and how accurate, fair or biased they might be.

Algorithms hide behind the assumed objectivity of maths, but they very much contain the biases, subjective decisions and cultural frameworks of those who design them. With scant detail on how these algorithms are created, O’Neil describes them as “inscrutable black boxes”.

This opacity is intentional.

In one of the upstairs galleries at the spacious MOD, we are greeted in large text as we enter: “what do algorithms think about you?”

Can an algorithm think? we ask. And, if so, what informs the decisions it makes about us?

Biometric Mirror was created by science fiction artist Lucy McRae and computer scientists Niels Wouters and Frank Vetere, who built an algorithm to judge our personalities by asking 33,000 people to look at photographs of faces and suggest possible personality traits.

A man looks at his reflection.
Can an algorithm tell you who you really are? Topbunk

We don’t see who the photos are of or who is doing the evaluating – and therefore we don’t know what biases might be reproduced.

You are invited to gaze into a mirror which scans your face. From this scan, the algorithm creates a profile of your age, gender and personality, which appears as statistical data overlaid on your reflection.

When I look into the mirror, I am told I am neither trustworthy nor emotionally stable. The algorithm under-guesses my age by a few years, and I score highly for both intelligence and uncertainty – an unhelpful combination.

Despite my doubts about the algorithm, I notice myself focusing on the more favourable data.

In this context, the data is benign. But facial recognition technology has been used to surveil and monitor activists, and has been responsible for thousands of inaccurate identifications by police in the UK.




Using data to illuminate cultural knowledge

In one of the more impressive works in the exhibition, contemporary data visualisation is used to illustrate Aboriginal forms of knowing and the intrinsic relationship between spatial awareness, Country and kinship.

Ngapulara Ngarngarnyi Wirra (Our Family Tree) is a collaboration between Angie Abdilla from design agency Old Ways, New, Adnyamathanha and Narungga footballer Adam Goodes and contemporary artist Baden Pailthorpe.

In every AFL game Goodes played, his on-field movements were recorded via satellite through a tracking device in the back of his jumper. These 20 million data points were then fused with data scans of a River Red Gum, or Wirra, to form an impressive data visualisation projected onto two large screens in a darkened gallery.

A large screen with swirling earthy colours.
In Ngapulara Ngarngarnyi Wirra (Our Family Tree), data from Adam Goodes’ football games is returned to Country. Topbunk

Here, Goodes’s data is returned to Country to form part of the roots of the tree as well as the swirling North and South Winds of his ancestors. The data is also translated into sound and amplified, inviting us to listen to what would otherwise be inaudible.

In a small room between the screens – or within the tree – drone footage of the Adnyamathanha Country (Flinders Ranges) plays against the retelling of the creation story in Adnyamathanha language.

What results is the synthesis of traditional Aboriginal knowledge with cutting edge technology, revealing different ways of sensing space and time.




The power of the invisible

While it’s easy to focus on how technology is used and exposed in the works in Invisibility, down the corridors and hanging from the ceiling in MOD are a few other exhibits that flesh out the concept of invisibility.

Black and white portraits
Women’s Work recognises the leadership of Indigenous women. Topbunk

Women’s Work celebrates the leadership of South Australian Aboriginal women with striking black and white photographs. Tucked away down the hall on the second level is Fostering Ties, a series of images drawing attention to children in foster care.

This exhibition foregrounds invisibility as a way to contend with our own blind-spots, knowledge systems, biases and cultural frameworks.

What is invisible to us may not be to those from demographic, cultural or language groups that differ from ours.

Drawing attention to the invisible encourages us to shift our perspective. If we don’t have the answer to solve a problem, maybe another cultural perspective – or life form – does.

Invisibility is at MOD until November 2022.


Kim Munro does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

This article was originally published on The Conversation. Read the original article.
