
Clever as a blind fish

New senses for autonomous robots

The Astyanax fish: model for the underwater robot "Snookie".

29.03.2010, Press releases

The underwater robot "Snookie" can orient itself in murky waters with an artificial sensory organ inspired by the so-called lateral-line system, found in fish and some amphibians. The experimental vehicle was developed by researchers at the Technische Universität München within the framework of the CoTeSys (Cognition for Technical Systems) excellence cluster. In the future, the researchers expect such capabilities to enable underwater robots to work autonomously in operations ranging from deep sea exploration to inspection of sewer pipes.

Conventional robots are tough: Hostile environments, toxic and corrosive gases, low light levels, moisture, dirt and disease mean nothing to them, unlike humans, for whom such conditions are generally unbearable. However, these robots – the ones typically in use today – can only do their job if they have been precisely programmed for every step.

Autonomous robots, on the other hand, will in the future be able to react intelligently to their surroundings and perform their tasks largely independently. Rather than being rigidly programmed, they rely on their own sensory perceptions; this is, after all, the only way they can recognize the situation they are in and still fulfill their tasks. In harsh environments, however, their senses often fail them, laid low by fumes, dust, water, or high temperatures. New senses are called for – perhaps even sensory organs of a kind that humans lack altogether.

A new research project undertaken by the CoTeSys (Cognition for Technical Systems) excellence cluster in Munich aims to develop the technology to master such new senses. Biophysicist Prof. Leo van Hemmen of the Technische Universität München (TUM) has high hopes that the animal kingdom will provide the means to allow robots to perceive their environment. Fish, scorpions and even frogs, for example, perceive things that remain hidden to the human senses. Not only are they able to detect minute pressure differences and vibrations and recognize threats; they use these senses to form an exact picture of their surroundings, enabling them at any moment to decide, for example, how best to seize their prey or how to conceal themselves behind a protective obstacle. Prof. van Hemmen and his colleagues are studying just how animals do this, researching the algorithms with which their brains record their environment and developing hardware and computer programs that allow robots to imitate them.

Fish and amphibians, for example, possess an organ that land animals lack: the lateral line. With this sensory organ, which extends along both sides of the body, they are able to perceive minute variations in pressure and current flow. As a result they can, even in murky water, form a very detailed picture of their immediate surroundings at a range of about one body length. They know where obstacles lie, where dangers lurk, and where their prey is to be found. Lateral lines consist of hundreds or even thousands of fine sensory hairs that sit in tiny ducts beneath the skin and register even tiny changes in flow velocity. The African clawed frog Xenopus laevis, for example, distinguishes between edible and inedible insects on the basis of water-borne vibrations alone. In terms of precision, these sensors are comparable with the human inner ear, where hundreds of thousands of fine sensory hairs enable us to distinguish between sounds - from the sigh of the wind to a symphony.

However, the complicated part is not the sensor itself but how the signals it sends are processed to create a complete picture of the surrounding area. Differences in pressure are much harder to pin down accurately than light waves. We humans notice this problem when a sound catches our attention – our eyes automatically seek out the source of the sound to confirm its location. Scorpions, on the other hand, use tiny vibrations transmitted through the ground to find their prey, even in the dark of night: these arachnids have sensory hairs on their eight legs, and their brains analyze the tiniest differences in the arrival times of vibration waves in the sand to work out where their prey is located. Similar algorithms can be used to analyze the lateral-line perceptions of fish.
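In essence, this is a time-difference-of-arrival problem. The following minimal sketch is purely illustrative and is not the CoTeSys code: it shows how the bearing of a surface wave source can be recovered from arrival times at a ring of eight sensors, where the sensor layout, the wave speed in sand and the least-squares plane-wave fit are all assumptions made for the example.

```python
import numpy as np

# Minimal sketch (not the CoTeSys code): estimating the bearing of a surface
# wave from arrival-time differences at several sensors - the same principle a
# scorpion is thought to use with the hairs on its eight legs. Sensor layout
# and the wave speed in sand are illustrative assumptions.

def estimate_bearing(sensor_xy, arrival_times):
    """Fit a plane wave t = t0 + r . s (slowness vector s) to the arrival
    times and return the bearing toward the source in radians."""
    A = np.column_stack([sensor_xy, np.ones(len(arrival_times))])
    (sx, sy, _t0), *_ = np.linalg.lstsq(A, arrival_times, rcond=None)
    # The wave propagates along s, so the source lies in the opposite direction.
    return np.arctan2(-sy, -sx)

# Eight "legs" on a 3 cm circle; assumed wave speed in sand of ~50 m/s.
angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
legs = 0.03 * np.column_stack([np.cos(angles), np.sin(angles)])
true_bearing = np.deg2rad(40.0)
wave_speed = 50.0
propagation = -np.array([np.cos(true_bearing), np.sin(true_bearing)])  # away from source
arrival_times = legs @ propagation / wave_speed
print(np.rad2deg(estimate_bearing(legs, arrival_times)))  # ~40.0 degrees
```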

A favorite example studied by the researchers in Munich is the blind Mexican cave fish Astyanax. As a cave-dweller it has no need of sight in the darkness, and as the fish matures its eyes degenerate. Nevertheless, it has no difficulty in navigating its pitch-black habitat, reacting flexibly to changes and adapting quickly to new environments. The fact that robots can learn to do so too is demonstrated by "Snookie," an underwater robot built by an interdisciplinary team of scientists and technical specialists headed by Prof. van Hemmen. “Snookie” – named after a species of perch with a distinctive lateral line – is a robot fish made of Plexiglas and aluminum, about 80 centimeters long and 30 centimeters in diameter, stuffed to the gills with an electronic control system and a power supply. Among its striking external features are six propeller gondolas that drive and position the robot, and a yellow hemispherical nose to which the sensors that guide the underwater vehicle are secured.

The TUM scientists intentionally chose an underwater vehicle to test their technology, as such vehicles face a very particular set of challenges not experienced by autonomous robots on land:

  • Visibility under water is often limited to just a few centimeters. The infrared detectors commonly used by land robots alongside cameras to identify their surroundings do not work under water.
  • Wireless communication is restricted under water due to poor propagation.
  • Energy supplies are limited to the capacity of the batteries, so all systems must operate with extreme efficiency.
  • Maximum reliability is also essential, because if something goes wrong, an underwater robot can quickly be lost for ever.

“An underwater robot is as much on its own as a vehicle on Mars,” says electrical engineer Stefan Sosnowski. He works in the Department of Robotics headed by Professor Sandra Hirche and is responsible for the design of the underwater craft. His colleague, biophysicist Dr. Jan-Moritz Franosch, aided by a group of students, has developed an artificial lateral line for the robot, enabling “Snookie” to detect obstacles and movements in the water a hand’s breadth in front of its nose and on either side. This artificial organ measures changes in pressure and flow around the robot not with conventional dynamic indicators, which would be far too large and imprecise, but with thermistors. Any change in flow velocity immediately changes the amount of heat dissipated by a heated wire, which the sensor elements can measure electronically with great speed and in a minimum of space. At intervals of a tenth of a second and using only a tiny amount of electrical energy, the sensors register pressure fluctuations of less than one percent over an area of just a few square millimeters.
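As a rough illustration of this measuring principle – and no more than that, since the calibration constants, temperatures and the King's-law model below are assumptions for the example rather than Snookie's actual firmware – the power needed to keep such a wire at constant temperature can be inverted into a flow speed:

```python
# Minimal sketch of the measuring principle, not Snookie's firmware: inverting
# a constant-temperature heated-wire reading into a flow speed via King's law,
# P = (A + B * sqrt(v)) * (T_wire - T_water). Calibration constants and
# temperatures below are assumed values for illustration.

A_CAL, B_CAL = 1.2e-3, 3.5e-3   # assumed calibration constants
T_WIRE, T_WATER = 40.0, 15.0    # assumed wire / water temperatures in deg C

def flow_speed_from_power(power_w):
    """Return the flow speed (m/s) implied by the electrical power needed
    to hold the wire at its set temperature."""
    delta_t = T_WIRE - T_WATER
    root_v = (power_w / delta_t - A_CAL) / B_CAL
    return max(root_v, 0.0) ** 2

# Sampled ten times per second, as in the article: rising power means faster
# flow past the sensor (for example, water deflected by a nearby obstacle).
for p in (0.06, 0.08, 0.11):    # example power readings in watts
    print(f"{p:.2f} W -> {flow_speed_from_power(p):.2f} m/s")
```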

The two young scientists see "Snookie" as more than just an experiment. They expect autonomous underwater robots to find a broad range of applications - from investigating shipwrecks to carrying out deep-sea search missions, for example to locate flight recorders after air disasters. More mundanely, they could also be used to inspect tanks and sewer pipes. Prof. van Hemmen also expects that robots with even more sensitive lateral-line systems will have considerable potential uses on land, since variations in pressure and flow can of course be detected in air just as well as in water; a separate project is already working on this. Man-made lateral lines might, for example, offer a cheaper alternative to the laser scanners currently used by robots to feel their way about their immediate surroundings – with the advantage that, unlike laser scanners, lateral lines won’t be blinded by other robots. This would allow autonomous robots to be deployed in swarms, opening the way for entirely new applications.

Biophysicist Prof. van Hemmen has more on his mind than just autonomous underwater robots. His goal is to develop and combine new forms of technological sensory perception, as he is convinced that in this way machines can perceive their environment with much greater accuracy. “The key here is 'multimodal sensing',” he explains. “Humans, too, don’t rely on a single sense. Our brains combine the input from a variety of senses to create an overall image of our surroundings. It is not until one of our senses fails us that we appreciate how important this combination is.” Prof. van Hemmen graphically demonstrates this using the following example: “It normally takes maybe ten seconds to strike a match. But if you put on thin gloves to take away the sense of touch, it becomes much harder. Often the task then takes more than a minute.”
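One simple way to picture this kind of multimodal combination is inverse-variance weighting of two independent estimates of the same quantity. The sketch below is purely illustrative; the sensor pairing and the noise figures are assumptions, not data from the project.

```python
# Purely illustrative sketch of multimodal fusion: two independent, noisy
# estimates of the same quantity (say, distance to an obstacle from the
# artificial lateral line and from a sonar ping) combined by inverse-variance
# weighting. Sensor pairing and noise figures are assumptions, not project data.

def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two scalar measurements; the fused
    variance is always smaller than either input variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Lateral line: 9 cm with 2 cm std; sonar: 12 cm with 4 cm std (assumed values).
dist, var = fuse(0.09, 0.02 ** 2, 0.12, 0.04 ** 2)
print(f"fused distance: {dist * 100:.1f} cm, std: {var ** 0.5 * 100:.1f} cm")
```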

The professor is also convinced that robot intelligence benefits little from installing even more cameras to supply even more images. He believes that it is more important for robots to perceive different aspects of their environment with a variety of sensors. However, when it comes to combining these different perceptions, he has to delve deep into the secrets of brain research: How do animals sift through a mass of data to filter out what is really relevant? How do humans manage this? The CoTeSys excellence cluster, he believes, presents an opportunity not just to answer these questions, but, through interdisciplinary cooperation among physiologists, information technologists and engineers, to transfer the new-found principles to the world of technology: “To be alert means reducing data to its essentials. Robots must learn to do this too, even when faced with a wide variety of sensor information."

In fact, CoTeSys specializes in just this kind of interdisciplinary cooperation. The research cluster brings together around 100 scientists working in widely differing fields at five universities and research institutes in the Munich area, in the interests of developing better cognitive capabilities for technical systems. The goal is to make robots more self-sufficient, able to analyze for themselves and flexibly respond to the situations in which they find themselves - from recognizing their surroundings through to independently performing their allotted tasks. As part of the Excellence Initiative, the Federal and state governments have set aside a total of 28 million euros in funding for the joint project coordinated by the Technische Universität München (TUM).

Free photographic material

http://www.cotesys.de/media/pictures-for-press.html
Copyright to all images is held by the Institut für Theoretische Biophysik/TUM

Further information available from:

CoTeSys: www.cotesys.org
Theoretical Biophysics at the TU Munich: http://www.t35.ph.tum.de 

For inquiries please contact:

Dr. Uwe Haass, Managing Director of CoTeSys, CCRL – CoTeSys Central Robotics Laboratory 
Technische Universität München 
Barer Straße 21  
80290 München

CoTeSys – Cognition for Technical Systems 
Tel. +49 89 289 25 723, Fax +49 89 289 25 724 
E-Mail: gst@cotesys.org

Contact: presse@tum.de

More Information

100329_Fisch_UBoot2.pdf – Print version of the press release (German) (Type: application/pdf, Size: 82.6 kB)
100329_Fisch_UBoot_EN.pdf – Print version of this press release (English) (Type: application/pdf, Size: 80.5 kB)

Corporate Communications Center

Media Relations Team
Arcisstr. 19
80333 München

Tel.: +49.89.289.22778
Fax: +49.89.289.23388

 presse@tum.de
