Robot manufacturers have long wanted another way for robots to take in information from their surroundings, and have been working to develop electronic skin (e-skin) that behaves like human skin. Doing so requires a highly sensitive electronic device that covers a large area and can relay information in the blink of an eye. Engineers at the University of Glasgow believe they can build just that.
A computational e-skin prototype that lets robots register pain is introduced in the journal Science Robotics. According to the report, the technology could make it possible to build touch-sensitive robots and to give prosthetic limbs a sensitivity close to that of human touch.
Previous attempts to develop touch-sensitive robots have run into a bottleneck: central processors become overwhelmed when large numbers of distributed sensors send in raw data, so it could take a computer a minute or so to translate the readings into something meaningful.
The new design is inspired by the human nervous system, which begins processing sensations at the point of contact and forwards only the genuinely important signals to the brain. Applying a similar approach in robotics frees up communication channels and keeps the central computer from being bogged down by excessive amounts of sensory data.
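As a rough illustration of that edge-filtering idea (a conceptual sketch, not the researchers' actual circuitry), a sensor node might forward a reading to the central controller only when it changes significantly; the threshold value and function names below are hypothetical.

```python
# Hypothetical sketch: each sensor node forwards a reading upstream only when
# the change in pressure is significant, mimicking how the nervous system
# passes along only salient events instead of a constant stream of raw data.

THRESHOLD = 0.2  # assumed significance threshold (arbitrary units)

def filter_events(readings, threshold=THRESHOLD):
    """Yield (index, value) only for readings that changed significantly."""
    previous = None
    for i, value in enumerate(readings):
        if previous is None or abs(value - previous) >= threshold:
            yield i, value  # salient event: worth sending to the controller
        previous = value

if __name__ == "__main__":
    raw = [0.00, 0.01, 0.02, 0.50, 0.51, 0.05, 0.04]
    # Only the large jumps (touch onset and release) are forwarded.
    for index, value in filter_events(raw):
        print(f"event at sample {index}: {value:.2f}")
```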
The information-processing layer is a grid of 168 synaptic transistors made from zinc oxide nanowires, which can be spread across a flexible surface. When this grid was connected to the skin sensors of a human-shaped “hand”, the hand was able to distinguish between a light and a heavy touch.
Making a robot feel pain may sound pointless, but the goal is to improve sensitivity in a way that supports trial-and-error learning. A robot learning from external stimuli could go through the same kind of process a child does when it learns that touching a hot iron is bad.
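As a toy illustration of that trial-and-error idea (purely hypothetical, and not a method described in the paper), a pain-like signal can act as a penalty that makes the robot less likely to repeat the action that caused it; the threshold, learning rate, and action names are made up for the example.

```python
# Toy illustration (hypothetical): a pain-like signal acts as a penalty that
# lowers the value of the action that produced it, so the robot becomes less
# likely to repeat it, much like a child learning not to touch a hot iron.

PAIN_THRESHOLD = 0.8   # assumed stimulus level treated as "pain"
LEARNING_RATE = 0.5

action_value = {"touch_hot_iron": 0.0, "touch_cool_mug": 0.0}

def update_from_experience(action, stimulus):
    """Penalize actions that produced a painful stimulus."""
    reward = -1.0 if stimulus >= PAIN_THRESHOLD else 0.1
    action_value[action] += LEARNING_RATE * (reward - action_value[action])

update_from_experience("touch_hot_iron", stimulus=0.95)  # painful
update_from_experience("touch_cool_mug", stimulus=0.10)  # harmless
print(action_value)  # the hot-iron action now carries a negative value
```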