By sensing how rapidly their destination ‘zooms in’ as they fly towards it, honeybees can control their flight speed in time for a perfect touchdown without needing to know how fast they’re flying or how far away the destination is.
This discovery may advance the design of cheaper, lighter robot aircraft that need only a video camera to land safely on surfaces of any orientation, says Professor Mandyam Srinivasan of The Vision Centre (VC) and the Queensland Brain Institute at The University of Queensland.
“Orchestrating a safe landing is one of the greatest challenges for flying animals and airborne vehicles,” says Prof. Srinivasan. “To achieve a smooth landing, it’s essential to slow down in time for the speed to be close to zero at the time of touchdown.”
Humans can gauge their distance to an object using stereo vision – because their two eyes, which are separated by about 65 mm, capture different views of the object. However, insects can’t do the same because their close-set eyes see nearly identical views, Prof. Srinivasan explains.
“So in order to land on the ground, they use their eyes to sense the speed of the image of the ground beneath them,” he says. “By keeping the speed of this image constant, they slow down automatically as they approach the ground, stopping just in time for touchdown.
“However, in the natural world, bees would only occasionally land on flat, horizontal surfaces. So it’s important to know how they land on rough terrain, ridges, vertical surfaces or flowers with the same delicacy and grace.”
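The ground-landing strategy Prof. Srinivasan describes can be sketched in a few lines of simulation. This is our own illustration, not the researchers’ model, and the numbers are assumed: the bee holds the image velocity of the ground beneath it (forward speed divided by height) at a fixed target while descending at a fixed glide angle, and its speed falls towards zero just as it reaches the ground.

```python
# Illustrative sketch (assumed parameter values, not from the study):
# holding the ground-image velocity v/h constant during descent.
OMEGA = 2.0   # target image angular velocity in rad/s (assumed)
GLIDE = 0.5   # descent rate as a fraction of forward speed (assumed)
DT = 0.01     # simulation time step in seconds

h, t = 2.0, 0.0            # start 2 m above the ground
while h > 0.001:           # stop when effectively touching down
    v = OMEGA * h          # the forward speed that keeps v/h constant
    h -= GLIDE * v * DT    # descend at the fixed glide angle
    t += DT
print(f"touchdown after ~{t:.2f} s; final forward speed {OMEGA*h:.4f} m/s")
```

Because the forward speed is always proportional to the height, both decay exponentially together, so the bee arrives at the ground with almost no speed left – without ever measuring its height or speed directly.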
In the study, the VC researchers trained honeybees to land on vertically mounted discs and filmed them using high-speed video cameras.
“The discs carried spiral patterns that could be rotated at various speeds by a motor,” says Prof. Srinivasan. “When we spun the spiral to make it appear to expand, the bees ‘hit the brakes’ because they thought they were approaching the disc much faster than they really were.
“When we spun the spiral the other way to make it appear to contract, the bees sped up, sometimes crashing into the disc. This shows that landing bees keep track of how rapidly the image ‘zooms in’, and they adjust their flight speed to keep this ‘zooming rate’ constant.”
“Imagine you’re in space and you don’t know how far away you are from a star,” Prof. Srinivasan says. “As you fly towards it, the other stars ‘move away’ and it becomes the focus. Then when the star starts to ‘zoom in’ faster than the regular rate, you’ll slow down to keep the ‘zooming rate’ constant.
“It’s the same for bees – when they’re about to reach a flower, the image of the flower will expand faster than usual. This causes them to slow down more and more as they get closer, eventually stopping when they reach it.”
The VC researchers also developed a mathematical model for guiding landings, based on the bees’ strategy. Prof. Srinivasan says that, unlike current engineering-based methods, this visually guided technique requires no knowledge of the distance to the surface or the speed of approach.
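A minimal sketch of such a landing law (our own illustration, with assumed parameter values, not the authors’ code): the controller holds the image’s relative expansion rate – approach speed divided by distance, which a camera can measure directly from how fast the image ‘zooms in’ – at a constant value, so distance and speed shrink together without either being known individually.

```python
# Illustrative sketch of a constant-expansion-rate landing law.
# R is the target relative expansion rate (1/d)(dd/dt); value assumed.
R = 1.5       # target expansion rate in 1/s (assumed)
DT = 0.01     # simulation time step in seconds

d, t = 1.0, 0.0      # start 1 m from the surface
speeds = []
while d > 0.001:     # stop when effectively touching the surface
    v = R * d        # the approach speed that keeps v/d constant
    d -= v * DT      # close the remaining distance
    t += DT
    speeds.append(v)
print(f"stopped ~{t:.2f} s after start; final speed {speeds[-1]:.4f} m/s")
```

Holding v/d constant makes the distance decay exponentially, d(t) = d₀e^(−Rt), and the speed decays in lockstep, reaching essentially zero at contact – the mathematical core of the bee-inspired strategy.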
“The problem with current robot aircraft technology is that it needs radar, sonar or laser beams to work out how far away the surface is,” Prof. Srinivasan says. “Not only is the equipment expensive and cumbersome, but using active radiation can also give the aircraft away.
“On the other hand, this vision-based system only requires a simple video camera, like those found in smartphones. By ‘seeing’ how rapidly the image expands, the camera allows the aircraft to land smoothly and undetected on a wide range of surfaces with the precision of a honeybee.”