To most people, leaves are green and oranges are orange. But if our pets could speak, they’d disagree.
We know there are many different ways to ‘see’ the world because of the diversity we have found in animals. Organisms with the ability to see have eyes that capture light reflected by different surfaces in their surroundings and turn it into visual cues. But while all eyes share this common purpose, the specialised cells that respond to the light, called photoreceptors, differ from one animal to the next.
For instance, human eyes can detect only wavelengths of light between 380 and 700 nanometres (nm); this is the visible range. Honey bees and many birds, on the other hand, can also ‘see’ ultraviolet light (10-400 nm).
While the human visual range is relatively limited, that hasn’t dampened humans’ curiosity about how animals see the world.
Thankfully, we don’t have to imagine too much. Researchers at the University of Sussex and George Mason University (GMU) in the U.S. have put together a new camera with the ability to view the world the way animals do. In a paper published in PLoS Biology, the team wrote that its device can even reveal what colours different animals see in motion, which hadn’t been possible until now.
Making the invisible visible
Animals use colours to intimidate predators, entice mates, or conceal themselves. Detecting variations in colour is thus essential to an animal’s survival. Many animals have evolved highly sensitive photoreceptors that can detect light at ultraviolet and infrared wavelengths; many can even detect polarised light as part of their Umwelt – the unique sensory world through which a particular organism perceives, and makes meaning of, its surroundings.
Neither human eyes nor most commercial cameras have been able to tap into this uncharted territory of animal vision. In the new study, experts in biology, computer vision, and programming came together to create a tool that could record and track the complexity of animal visual signalling.
The tool combined existing multispectral photography techniques with a new camera setup and a beam-splitter (to separate ultraviolet and visible light), all encased in a custom 3D-printed unit. The system recorded videos simultaneously in the visible and ultraviolet channels in natural lighting. The researchers fed the camera output through code (written in Python) that converted the visual data into estimates of the physical signals an animal’s photoreceptor cells would produce.
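The paper’s actual code isn’t reproduced here, but the conversion it describes is, at its core, a per-pixel transform from camera channels to photoreceptor responses. Here is a minimal sketch of that idea in Python (the team’s language), assuming a hypothetical bee-like viewer and made-up coefficients standing in for fitted spectral sensitivities:

```python
import numpy as np

# Hypothetical 3x4 matrix mapping four camera channels
# (UV, blue, green, red) to a bee's three photoreceptor types
# (UV-, blue- and green-sensitive). Real values would be fitted
# from measured spectral sensitivities; these are placeholders.
CAMERA_TO_BEE = np.array([
    [0.9, 0.1, 0.0, 0.0],  # UV receptor: driven mostly by the UV channel
    [0.1, 0.8, 0.1, 0.0],  # blue receptor
    [0.0, 0.1, 0.7, 0.2],  # green receptor
])

def camera_to_photoreceptor(frame_4ch: np.ndarray) -> np.ndarray:
    """Convert an (H, W, 4) camera frame, with channels ordered
    (UV, blue, green, red), into (H, W, 3) estimated photoreceptor
    stimulation values for the hypothetical bee."""
    h, w, _ = frame_4ch.shape
    flat = frame_4ch.reshape(-1, 4)      # one pixel per row
    responses = flat @ CAMERA_TO_BEE.T   # linear transform per pixel
    return responses.reshape(h, w, 3)
```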
Finally, the researchers adjusted these signals based on what they already knew about how a given animal’s photoreceptors work, and produced videos true to what that animal might see. They used false colours in these videos so that, for example, a particular colour could stand in for ultraviolet imagery.
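The false-colour step can then be as simple as reassigning those receptor responses to display channels. A sketch, assuming the common convention of shifting each bee channel ‘down’ so the ultraviolet signal becomes visible blue (the exact mapping is a design choice, not the paper’s prescription):

```python
import numpy as np

def bee_false_colour(receptor: np.ndarray) -> np.ndarray:
    """Render (H, W, 3) bee receptor responses, ordered
    (UV, blue, green), as a human-viewable RGB image:
    green -> red, blue -> green, UV -> blue."""
    uv, blue, green = receptor[..., 0], receptor[..., 1], receptor[..., 2]
    rgb = np.stack([green, blue, uv], axis=-1)  # (R, G, B)
    return np.clip(rgb, 0.0, 1.0)
```

Under this mapping, a surface that stimulates only the green receptor renders red, while one that stimulates both the ultraviolet and green receptors renders magenta – a point the black-eyed Susan example further below illustrates.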
In sum, the camera system translated what animals see in visible and non-visible light into colours compatible with the human eye.
The time challenge
You may have already seen false-colour images – like the Hubble Space Telescope’s iconic snap of the ‘Pillars of Creation’. The stars and nebulae don’t actually look that resplendent to human eyes. They are coloured that way to show what the telescope saw in, say, infrared or ultraviolet wavelengths. Scientists have also used false-colour images to understand how flowers reflect ultraviolet light to influence the behaviour of insects nearby.
But false colours can only stand in for so much. According to the researchers, existing techniques to visualise the colours animals see either measure the light an object reflects to predict how an animal’s photoreceptors would respond, or stitch together a series of photographs taken at wavelengths beyond human vision (with the help of bandpass optical filters). Both approaches require the subject to be motionless. The new system, however, can visualise free-living organisms in their natural settings.
In addition, Pavan Kumar Reddy Katta, a graduate teaching assistant at GMU and one of the study’s authors, said the team wrote a program that could accept both ultraviolet- and visible-light data and spit out complete videos. “We made use of a continuous stream which allowed us to resolve our data at various points of space and time and produce real-time visualisations in animal-vision,” he told this author.
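Katta’s description suggests a frame-by-frame loop over the two synchronised streams. A simplified sketch, assuming OpenCV, one ultraviolet and one visible camera behind the beam-splitter, and the helper functions sketched above (device indices and synchronisation details are placeholders):

```python
import cv2
import numpy as np

uv_cam = cv2.VideoCapture(0)   # UV camera; device index is a placeholder
vis_cam = cv2.VideoCapture(1)  # visible-light camera

while True:
    ok_uv, uv_frame = uv_cam.read()
    ok_vis, vis_frame = vis_cam.read()
    if not (ok_uv and ok_vis):
        break
    # Collapse the UV frame to one channel and stack it with the
    # visible frame (OpenCV orders colours B, G, R), giving a
    # 4-channel frame ordered (UV, blue, green, red).
    uv_gray = cv2.cvtColor(uv_frame, cv2.COLOR_BGR2GRAY)
    frame4 = np.dstack([uv_gray, vis_frame]).astype(np.float32) / 255.0
    receptor = camera_to_photoreceptor(frame4)
    false_col = bee_false_colour(receptor)
    # Flip RGB back to BGR for display.
    cv2.imshow('bee view', (false_col[..., ::-1] * 255).astype(np.uint8))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

uv_cam.release()
vis_cam.release()
cv2.destroyAllWindows()
```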
The next big thing in animal vision
Equipped with the new camera, the research team examined what the black-eyed Susan flower (Rudbeckia hirta) looks like to honey bees (Apis mellifera).
“To our eye, the black-eyed Susan appears entirely yellow because in the human-visible range, it reflects primarily long wavelength light,” the team wrote in its paper. “Whereas in the bee false colour image, the distal petals appear magenta because they also reflect ultraviolet, stimulating both the ultraviolet-sensitive photoreceptors … and those sensitive to green light … By contrast, the central portion of the petals does not reflect ultraviolet and therefore appears red.”
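Using the hypothetical channel mapping from the earlier sketch, the quote’s two petal regions fall out directly as sample pixels:

```python
import numpy as np

# (UV, blue, green) receptor stimulation for two sample pixels
distal_petal = np.array([[[0.8, 0.0, 0.9]]])   # reflects UV and green
central_petal = np.array([[[0.0, 0.0, 0.9]]])  # reflects green only

print(bee_false_colour(distal_petal))   # strong red + blue -> magenta
print(bee_false_colour(central_petal))  # strong red only -> red
```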
According to the paper, the visual mechanisms animals have evolved to communicate and protect themselves could help solve many human detection problems. For example, animal-vision video could help people navigate wild landscapes better, and without hurting camouflaged animals. It could also help farmers spot fruit pests that are invisible to the human eye but readily visible to animals that have evolved to eat those fruits.
Daniel Hanley, assistant professor at GMU and the study’s corresponding author, said their invention could even transform the way wildlife documentary films are made. The camera system could allow filmmakers and ecologists to record the animal world through a new lens and create new visual experiences. He also said the platform’s striking images could be used to communicate the science of the living world to young audiences.
“We are thinking of creating a science exhibit for children using our setup, flowers, and live animals,” Dr. Hanley said. “Where children can just click a button to experience what a snake might see or a honeybee might see.”
Sanjukta Mondal is a chemist-turned-science-writer with experience in writing popular science articles and scripts for STEM YouTube channels.