Have you ever wondered how tiny insects manage to navigate reliably through cluttered landscapes without bumping into obstacles? Or how they can walk or fly hundreds of meters and still find their way back to their nest, which might be a tiny hole in the ground? If you are interested in insect navigation, you have come to the right webpage, because we share the same interest!
This web facility provides access to panoramic views at any position in a 3D model of a natural environment. More environment models will be added as they become available. These views can be used to test flight control and navigation algorithms, to map the navigational information content of natural environments and to reconstruct views from the viewpoint of navigating animals.
We are interested in reconstructing what freely behaving animals see in real life. We use laser-scanner- and camera-based methods to construct 3D models of natural environments, including their detailed textures. This now allows us to reconstruct the natural visual input (at least as far as luminance and texture are concerned) experienced by freely flying and walking insects performing tasks such as homing under complex natural conditions. Furthermore, these 3D models can be used by the robotics community to test the performance of flight-control, navigation and homing algorithms in natural benchmark environments.
The service also allows users to render the world through the sampling array of a honeybee compound eye!
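To make the compound-eye rendering more concrete, below is a minimal, hypothetical sketch of how a panoramic view could be resampled onto a set of ommatidial viewing directions using a Gaussian acceptance function. It is not the service's actual implementation: the file name, acceptance angle and viewing directions are placeholder assumptions chosen for illustration.

```python
# Hypothetical sketch: resample an equirectangular panorama onto a few
# ommatidial viewing directions, as a simple compound-eye model would.
# File name, acceptance angle and directions are illustrative assumptions.
import numpy as np
from PIL import Image

def panorama_to_ommatidia(panorama, directions_deg, acceptance_deg=2.6):
    """Return one luminance value per ommatidium.

    panorama       : 2D array (H x W) equirectangular luminance image,
                     azimuth 0..360 deg across columns, elevation +90..-90 deg down rows
    directions_deg : list of (azimuth, elevation) viewing directions in degrees
    acceptance_deg : full width at half maximum of the Gaussian acceptance function
    """
    h, w = panorama.shape
    az = np.linspace(0.0, 360.0, w, endpoint=False)
    el = np.linspace(90.0, -90.0, h)
    az_grid, el_grid = np.meshgrid(az, el)
    sigma = acceptance_deg / 2.3548          # FWHM -> standard deviation

    responses = []
    for az0, el0 in directions_deg:
        # Great-circle angular distance of every pixel from the ommatidial axis
        cos_d = (np.sin(np.radians(el_grid)) * np.sin(np.radians(el0)) +
                 np.cos(np.radians(el_grid)) * np.cos(np.radians(el0)) *
                 np.cos(np.radians(az_grid - az0)))
        dist = np.degrees(np.arccos(np.clip(cos_d, -1.0, 1.0)))
        weight = np.exp(-0.5 * (dist / sigma) ** 2)
        weight *= np.cos(np.radians(el_grid))  # pixel solid angle shrinks towards the poles
        responses.append(float((weight * panorama).sum() / weight.sum()))
    return np.array(responses)

if __name__ == "__main__":
    # 'panorama.png' and the three directions below are placeholders.
    img = np.asarray(Image.open("panorama.png").convert("L"), dtype=float)
    dirs = [(0.0, 0.0), (30.0, 10.0), (-30.0, 10.0)]
    print(panorama_to_ommatidia(img, dirs))
```

In practice, the viewing directions and acceptance angles would come from a measured honeybee eye map rather than the few arbitrary directions used here.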
The service is still in its beta phase! Please do not hesitate to contact us with any kind of feedback.
For access to the view render engine, please click here.
Why is the Institute of Robotics and Mechatronics interested in insect navigation?
While insects reliably navigate in complex environments, navigation by state-of-the-art mobile robots is still slow and error-prone. Our aim is not to mimic insects, but to take inspiration from their navigation strategies, which are arguably very robust, flexible and efficient. In the Mobile Robots group we focus on the autonomy and cooperation of robotic platforms, and we are interested in deriving efficient navigation concepts by analysing insect behaviour and through discussions with biologists. Implementing insect-inspired algorithms on technical platforms also allows us to test models of route-following and homing proposed by biologists. We thus hope to serve complementary interests: biologists attempt to identify the cues and mechanisms underlying the navigational competence of animals, while roboticists aim to achieve in their navigation algorithms the same efficiency and reliability that we observe in biological systems.
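As an illustration of how rendered panoramic views can be used to test a homing model proposed by biologists, here is a minimal sketch of an image-difference homing strategy (in the spirit of snapshot and image-difference models from the literature, not our institute's actual algorithm). The render_view(x, y) function is an assumed placeholder for whatever returns a panoramic luminance image at a given position, for example views obtained through this service.

```python
# Hypothetical sketch of image-difference homing. render_view(x, y) is an
# assumed placeholder returning a panoramic luminance image (2D numpy array)
# at position (x, y), e.g. obtained from the view render engine.
import numpy as np

def image_difference(view_a, view_b):
    """Root-mean-square pixel difference between two panoramic views."""
    diff = view_a.astype(float) - view_b.astype(float)
    return float(np.sqrt(np.mean(diff ** 2)))

def homing_step(render_view, position, goal_view, step=0.1):
    """Probe small moves in four directions and take the one whose rendered
    view best matches the stored goal view."""
    x, y = position
    candidates = [(x + step, y), (x - step, y), (x, y + step), (x, y - step)]
    diffs = [image_difference(render_view(cx, cy), goal_view) for cx, cy in candidates]
    return candidates[int(np.argmin(diffs))]

def home(render_view, start, goal, max_steps=200, step=0.1, tolerance=1.0):
    """Iterate homing steps until the current view matches the goal view."""
    goal_view = render_view(*goal)
    position = start
    for _ in range(max_steps):
        if image_difference(render_view(*position), goal_view) < tolerance:
            break
        position = homing_step(render_view, position, goal_view, step)
    return position
```

The naive four-candidate search stands in for a gradient descent on the image-difference surface; a more realistic test would render the views through the compound-eye sampling sketched above.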