There are some questions in science that can only be answered by strapping a pair of 3D glasses to an unsuspecting cuttlefish and setting it loose in an underwater movie theatre.
That, at least, was the thinking of a team of researchers who set themselves the task of working out how the marine molluscs know how far away prey is before launching their explosive, tentacled attacks.
The puzzle has occupied an admittedly select group of scientists because of the cuttlefish’s unusual ability to scope out a 360-degree field of vision by moving its eyes independently.
Dr Trevor Wardill, who led the work at the University of Minnesota, said that given the complexities of cuttlefish vision, it was considered unlikely that the animals judged distance in the same way as humans. That process, known as stereopsis, computes distance by comparing how each eye sees objects in slightly different positions.
Wardill and his colleagues realised they could test whether cuttlefish use stereopsis by getting them to wear 3D glasses and playing them some juicy 3D shrimp movies. “A lot of people said it wasn’t going to work,” Wardill said. “They said they’d rip the glasses off. They said there’d be ink in the tank.”
But Wardill and his co-workers found a way. In experiments at the Marine Biological Laboratory in Woods Hole, Massachusetts, the researchers found that careful treatment, distractions, and a copious supply of shrimp were rewarded with cooperation. “You’ve got to get in the mind of the cuttlefish and make them happy,” Wardill explained.
Not that every step went smoothly. Attempts to glue the glasses directly on to the molluscs left some at jaunty angles and risked skin damage when the cuttlefish reached up with their arms – of which they have eight – to pull them off. The solution was a strip of Velcro superglued to the skin, to which the glasses could then be attached.
The glasses posed another hurdle, however. “The first ones that wrapped around caught too much water, so if the cuttlefish swam backwards, the glasses would fly off,” Wardill said.
But with tenacity, the scientists overcame the problems. The cuttlefish – whose names included Supersandy, Long Arms, Inky and Sylvester Stallone – were ready to be trained and tested.
In training sessions, the cuttlefish familiarised themselves with their new home: a tank with an underwater movie screen on which the scientists played moving images of shrimp, the molluscs’ favourite snack. Once the animals were used to the tank and stopped playing with their glasses, the researchers set a high-speed video camera recording and the experiments began.
The scientists arranged the equipment so that the cuttlefish saw shrimp moving along in slightly different positions with each eye. The same technique gives 3D films their illusion of depth: the brain combines the two images into one and uses triangulation to work out how close objects are.
When the researchers varied the spacing between the shrimp images – making them seem nearer or further away to a human wearing 3D glasses – the cuttlefish adjusted their striking distance before lunging at the virtual prey.
“If you have the images a long way apart, the cuttlefish think the shrimp is really close and they back up and try to shoot their tentacles right in front of them. But if you flip the images around and make the shrimp look like it’s behind the screen, they’ll swim right into it,” said Wardill. The study, published in Science Advances, concludes that cuttlefish use stereopsis after all, though further tests showed the animals must rely on different neural circuitry from that found in human brains.
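The geometry behind the experiment is simple: the further apart the two eyes' images of an object sit (the disparity), the closer the object must be. A minimal sketch of that relationship, using the standard pinhole-stereo formula with made-up numbers (not values from the study):

```python
# Pinhole stereo model: an object at depth Z, viewed by two eyes a
# baseline B apart, produces a disparity d between the views such that
# Z = f * B / d (f is the focal length). Large disparity = near prey.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Estimate depth in metres from binocular disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: object at or beyond infinity")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical numbers: eyes 1 cm apart, 500 px focal length.
near = depth_from_disparity(500, 0.01, 50)  # big disparity -> 0.1 m away
far = depth_from_disparity(500, 0.01, 5)    # small disparity -> 1.0 m away
print(near, far)
```

Widening the on-screen spacing of the shrimp images increases the disparity, which is why the cuttlefish judged the virtual prey to be nearer and shortened their strike.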
“As neuroscientists, we’re looking to understand the principles underlying brains,” said Rachael Feord, a co-author of the study. “The process of stereopsis has cropped up multiple times throughout evolution, but each time the neural circuitry and its capabilities are a little different.”
Jenny Read, professor of vision science at Newcastle University, who last year discovered evidence for stereopsis in praying mantises, said the work showed there are different ways of achieving stereopsis: “Creatures like cuttlefish or mantises may seem outlandish, but understanding them will help us come up with varieties of machine vision which are most appropriate for different situations, say for a flying drone versus a robot vacuum cleaner versus a security camera. They are amazing examples of evolved engineering, and we have so much to learn from them.”