Robots imitate honeybees for aircraft aerobatics
By imitating how honeybees see, letting aircraft quickly sense which way is “up”, engineers and researchers at The Vision Centre, the Queensland Brain Institute and the School of Information Technology and Electrical Engineering at The University of Queensland have enabled planes to guide themselves through extreme maneuvers, including the loop, the barrel roll and the Immelmann turn, with speed, deftness and precision.
“Current aircraft use gyroscopes to work out their orientation, but they are not always reliable, as the errors accumulate over long distances”, said Vision Centre researcher Saul Thurrowgood. “Our system, which takes thousandths of a second to directly measure the position of the horizon, is much faster at calculating position, and more accurate.”
The group first “trained” the system to recognize the sky and the ground by feeding hundreds of different landscape images to it and teaching it to compare the blue color of the sky with the red-green colors of the ground. Simple, low-resolution cameras similar to a bee’s visual system are then attached to the aircraft, allowing the plane to take its own landscape pictures and identify the horizon while flying.
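The basic idea can be illustrated with a minimal sketch. This is not the researchers’ actual algorithm (which was trained on hundreds of landscape images); it simply labels a pixel as “sky” when its blue channel dominates red and green, finds the sky/ground boundary in each image column, and fits a straight line to that boundary, whose slope gives a roll angle. All function names and the toy frame below are illustrative assumptions.

```python
import math

def classify_sky(pixel):
    # Crude stand-in for the trained classifier: a pixel counts as "sky"
    # when its blue channel dominates both red and green.
    r, g, b = pixel
    return b > r and b > g

def horizon_roll(image):
    """Estimate a roll angle (degrees) from the sky/ground boundary.

    `image` is a list of rows, each row a list of (r, g, b) tuples.
    For each column, find the first non-sky row (the boundary), then
    fit a straight line to those points by least squares.
    """
    h, w = len(image), len(image[0])
    xs, ys = [], []
    for x in range(w):
        for y in range(h):
            if not classify_sky(image[y][x]):
                xs.append(x)
                ys.append(y)
                break
    # Least-squares slope of the boundary; roll = atan(slope).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.degrees(math.atan(slope))

# Toy 4x4 frame with a tilted horizon: sky in the upper-left region.
sky, ground = (40, 80, 200), (120, 90, 30)
frame = [
    [sky,    sky,    sky,    sky],
    [sky,    sky,    sky,    ground],
    [sky,    ground, ground, ground],
    [ground, ground, ground, ground],
]
print(round(horizon_roll(frame), 1))  # → -31.0
```

A real system would, of course, have to cope with haze, clutter and lighting changes, which is why a trained classifier is used rather than a fixed color rule.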
“The measurement process can certainly be quickened – we only have to adjust the cameras to take images with a smaller resolution”, said Thurrowgood. “However, it won’t produce the same quality of data, so the key is to find an optimal resolution where you have both speed and quality.”
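The speed/quality tradeoff Thurrowgood describes follows directly from pixel count: halving the resolution in each direction quarters the number of pixels the classifier must touch per frame, at the cost of a coarser horizon estimate. A hedged sketch, with an illustrative `downsample` helper that is not from the paper:

```python
def downsample(image, k):
    # Keep every k-th pixel in each direction; pixel count (and thus
    # per-frame classification work) drops by a factor of k * k.
    return [row[::k] for row in image[::k]]

frame = [[(0, 0, 0)] * 64 for _ in range(48)]  # toy 64x48 frame
small = downsample(frame, 4)                   # 16x12 frame

full_px = len(frame) * len(frame[0])   # 3072 pixels
small_px = len(small) * len(small[0])  # 192 pixels
print(full_px // small_px)             # → 16 (16x less work per frame)
```

Picking the optimal resolution then amounts to finding the smallest frame at which the fitted horizon is still accurate enough for attitude control.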
“We have created an autopilot that overcomes the errors generated by gyroscopes by imitating a biological system – the honeybee”, said Professor Mandyam Srinivasan, a co-author of the paper “UAV attitude control using the visual horizon”, which was presented at the Eleventh Australasian Conference on Robotics and Automation. “Although we don’t fully understand how these insects work, we know that they are good at stabilizing themselves during complicated flight maneuvers by watching the horizon. This project required tremendous effort, as visually separating the sky from the ground is not always as easy as we imagine – it can be difficult to pick out the horizon, so my hat’s off to Mr Thurrowgood for achieving this.”
The researchers claim that the system can potentially be adapted for all types of aircraft – including military, sporting and commercial planes. However, there are situations in which it cannot yet be applied: in heavy rain, fog or night flight, the current algorithms are not as reliable as they were in the tests. On the other hand, further development of visual navigation algorithms could eventually support navigation in zero-gravity environments.