Flies get a go on flight simulator

Posted by Emma Woollacott

Munich, Germany - Brain researchers have built a flight simulator for flies. While it may be great fun for the little blighters (who knows?), the real aim is to improve the image processing capabilities of robots.
    
Scientists from the Max Planck Institute of Neurobiology have created a wraparound display to present diverse patterns, movements, and sensory stimuli to blowflies. The insect is held in place by a teeny-tiny halter, so that electrodes can register the reactions of its brain cells.


The first results show one thing very clearly: the way flies process the images from their immobile eyes is completely different from the way the human brain processes visual signals.


Movements in space produce so-called "optic flow fields" that distinctively characterize each kind of motion. In forward motion, for example, objects rush past on the sides and foreground objects appear to get bigger, while nearby objects appear to sweep past faster than distant ones.
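
To make that concrete, here is a minimal numerical sketch, assuming a standard pinhole-camera model rather than anything fly-specific: under pure forward motion, the flow vector at each image point scales with the point's position and inversely with its depth, so the whole field streams outward and nearby objects race past faster than distant ones.

```python
import numpy as np

# Minimal sketch of optic flow under pure forward translation, assuming a
# standard pinhole-camera model (an illustration, not the fly's optics).
# A point at image coordinates (x, y) with scene depth Z, viewed while
# moving forward at speed tz, produces the flow vector (x, y) * tz / Z:
# everything streams outward from the centre, and nearer points move faster.

def forward_flow(x, y, depth, tz=1.0):
    """Flow vector (u, v) at image point (x, y) for forward speed tz."""
    return x * tz / depth, y * tz / depth

# Sample a few image points, with the left half of the scene nearby
# (depth 2) and the right half far away (depth 10).
for x in (-1.0, 1.0):
    for y in (-1.0, 1.0):
        depth = 2.0 if x < 0 else 10.0
        u, v = forward_flow(x, y, depth)
        print(f"point ({x:+.0f}, {y:+.0f}) at depth {depth:4.1f} -> flow ({u:+.2f}, {v:+.2f})")
```

Running this shows the nearby points on the left streaming outward five times faster than the distant points on the right, which is exactly the depth cue the article describes.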


The first step for the fly is to construct a model of these movements in its tiny brain. The speed and direction with which objects appear to move past the fly's eyes generate, moment by moment, a typical pattern of motion vectors, the flow field, which in a second step is assessed by the so-called "lobula plate," a higher level of the brain's vision center. In each hemisphere, just 60 nerve cells are responsible for this; each reacts with particular intensity when presented with the pattern that matches it.
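
A rough way to picture what such a cell might compute is a matched filter: each model cell stores a preferred flow pattern and responds in proportion to how well the observed flow matches it. The two cell types, the templates, and the response rule below are entirely illustrative assumptions, not the fly's real circuitry.

```python
import numpy as np

# Toy "matched filter" sketch of the lobula-plate idea described above.
# Each model cell stores a preferred flow-field template; its response is
# the normalized dot product between the template and the observed flow.

rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 2))     # sampled image locations

def expansion(p):      # forward motion: flow radiates outward
    return p.copy()

def yaw_rotation(p):   # turning: flow circulates around the centre
    return np.stack([-p[:, 1], p[:, 0]], axis=1)

templates = {"expansion cell": expansion(pts),
             "rotation cell": yaw_rotation(pts)}

def response(observed, template):
    """Normalized dot product; near 1 when the observed flow matches."""
    return np.sum(observed * template) / (
        np.linalg.norm(observed) * np.linalg.norm(template))

observed = expansion(pts)                  # the fly is flying straight ahead
for name, tmpl in templates.items():
    print(f"{name}: response {response(observed, tmpl):+.2f}")
# The "expansion cell" responds maximally (+1.00), while the "rotation
# cell" sees an orthogonal pattern and does not respond (0.00).
```

Each hypothetical cell thus fires hardest for "its" pattern and stays quiet for others, which is the division of labor the article ascribes to the 60 neurons per hemisphere.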


Flow fields are presented to the fly


For the analysis of the optic flow fields, it's important that motion information from both eyes be brought together. This happens via direct connections between specialized neurons known as VS cells. In this way, the fly gets a precise fix on its own position and movement.
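
One toy illustration, with made-up numbers, of why pooling the two eyes pays off: a roll rotation moves the image upward on one eye and downward on the other, while a straight vertical drop moves it the same way on both, so comparing the eyes disambiguates what one eye alone cannot. The actual VS-cell wiring is far richer than this arithmetic.

```python
# Hedged sketch of binocular disambiguation (illustrative numbers only).
# Sign convention: positive = upward image flow on that eye.

def classify(left_vertical, right_vertical):
    if left_vertical * right_vertical < 0:
        return "rotation (roll)"       # eyes disagree -> self-rotation
    return "translation (lift/drop)"   # eyes agree -> vertical translation

print(classify(+1.0, -1.0))  # rotation (roll)
print(classify(-1.0, -1.0))  # translation (lift/drop)
```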


According to project leader Professor Alexander Borst, the results are of particular interest to the engineers of the academic chair for guidance and control at the Technische Universität München (TUM), with whom he collaborates.


The TUM researchers are working to develop intelligent machines that can observe their environment through cameras, learn from what they see, and react appropriately to the current situation.


For example, they're developing small flying robots whose position and movement in flight will be controlled by a visual-analysis computer system modeled on the fly's brain. One mobile robot, the Autonomous City Explorer (ACE), was challenged to find its way from the institute to Marienplatz in the heart of Munich, a distance of about a mile, by stopping passers-by and asking for directions. To do this, ACE had to interpret the gestures of people pointing the way and negotiate sidewalks and traffic crossings safely.