The locust ear inside the chip.
Source: Tel Aviv University

An ear-bot 'hears' through the ear of a locust

Tel Aviv University researchers have opened the door to sensory integration between robots and insects: for the first time, the ear of a dead locust was connected to a robot that receives the ear's electrical signals and responds accordingly. The result is extraordinary: when the researchers clap once, the locust's ear hears the sound and the robot moves forward; when the researchers clap twice, the robot moves backward.

In general, biological systems have a huge advantage over technological systems, both in sensitivity and in energy consumption. The Tel Aviv University researchers' approach may eventually render far more cumbersome and expensive developments in robotic sensing redundant.

The interdisciplinary study was led by Idan Fishel, a master's student under the joint supervision of Dr. Ben M. Maoz of The Iby and Aladar Fleischman Faculty of Engineering and the Sagol School of Neuroscience, and Prof. Yossi Yovel and Prof. Amir Ayali, experts from the School of Zoology and the Sagol School of Neuroscience, together with Dr. Anton Sheinin, Yoni Amit, and Neta Shavil.

The researchers explain that at the beginning of the study, they sought to examine how the advantages of biological systems could be integrated into technological systems, and how the sensory organs of a dead locust could be used as sensors for a robot. "We chose the sense of hearing, because it can be easily compared to existing technologies, in contrast to the sense of smell, for example, where the challenge is much greater," says Dr. Maoz. "Our task was to replace the robot's electronic microphone with a dead insect's ear, use the ear's ability to detect acoustic signals from the environment, in this case vibrations in the air, and, using a special chip, convert the insect's input into input the robot can use."

To carry out this unique and unconventional task, the interdisciplinary team (Maoz, Yovel and Ayali) first built a robot capable of responding to signals it receives from the environment. Subsequently, the researchers were able to isolate and characterize the dead locust ear and keep it functional long enough to successfully connect it to the robot. In the final stage, the team succeeded in finding a way to pick up the signals received by the locust’s ear in a way that could be received and responded to by the robot.
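The pipeline described above, amplified electrical signals from the ear are picked up and translated into movement commands, can be illustrated with a minimal sketch. Note that this is an assumption-laden illustration, not the published implementation: the function names, threshold values, and the burst-counting strategy are all hypothetical, chosen only to show how a clap-count could plausibly be mapped to "forward" and "backward" commands.

```python
# Hypothetical sketch of the clap-to-command logic. The threshold,
# refractory window, and command mapping are illustrative assumptions,
# not values from the study.

def count_claps(samples, threshold=0.5, refractory=200):
    """Count distinct above-threshold bursts in an amplified signal trace."""
    claps = 0
    cooldown = 0
    for s in samples:
        if cooldown > 0:
            cooldown -= 1          # still inside the previous burst's window
        elif abs(s) > threshold:
            claps += 1
            cooldown = refractory  # ignore ringing from the same clap
    return claps

def command_for(claps):
    """Map the number of detected claps to a robot movement command."""
    if claps == 1:
        return "forward"
    if claps == 2:
        return "backward"
    return "stop"
```

A trace containing one burst would yield `command_for(count_claps(trace)) == "forward"`; two well-separated bursts would yield `"backward"`. The refractory window is the key design choice here: without it, the oscillations within a single clap would register as many separate events.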

“Prof. Ayali’s laboratory has extensive experience working with locusts, and they have developed the skills to isolate and characterize the ear,” explains Dr. Maoz. “Prof. Yovel's laboratory built the robot and developed code that enables the robot to respond to electrical auditory signals. And my laboratory has developed a special device - Ear-on-a-Chip - that allows the ear to be kept alive throughout the experiment by supplying oxygen and food to the organ, while allowing the electrical signals from the locust’s ear to be amplified and transmitted to the robot.”

Biological systems expend negligible energy compared to electronic systems. They are miniature, and therefore also extremely economical and efficient. For comparison, a laptop consumes about 100 watts, while the human brain consumes only about 20 watts.

In addition, “Nature is much more advanced than we are, so we should use it,” urges Dr. Maoz. “The principle we have demonstrated can be used and applied to other senses, such as smell, sight and touch. For example, some animals have amazing abilities to detect explosives or drugs; the creation of a robot with a biological nose could help us preserve human life and identify criminals in a way that is not possible today. Some animals know how to detect diseases. Others can sense earthquakes. The sky is the limit.”

The research was published in the journal Sensors.
