Researchers design artificial skin to help robots “feel”

Researchers at the National University of Singapore (NUS) said on Wednesday that they are working to give robots a sense of touch through artificial skin. The two researchers, who are also members of the Intel Neuromorphic Research Community (INRC), presented research demonstrating the promise of event-based vision and tactile sensing, combined with Intel’s neuromorphic processing, for robotics.

Most of today’s robots operate solely on visual processing and lack the human sense of touch.

The researchers hope to change that with their artificial skin, which NUS says can detect contacts more than 1,000 times faster than the human sensory nervous system. According to the researchers, the skin can also identify the shape, texture and hardness of objects “10 times faster than the blink of an eye.”

Bringing robotics closer to humans

Mike Davies, director of Intel’s Neuromorphic Computing Lab, said this research provides a glimpse into a future of robotics in which information is both sensed and processed in an event-driven manner. “This work adds to a growing body of evidence showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is reconfigured to an event-based paradigm, covering sensors, data formats, algorithms and hardware architecture,” he said.

NUS said that giving robots a human-like sense of touch could significantly improve their current capabilities. For example, robotic arms fitted with artificial skin could easily adapt to changes in goods manufactured in a factory, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping.

“The ability to sense and better perceive the environment could also enable closer and safer interaction between humans and robots, as in the care professions, or bring us closer to automating surgical tasks by giving surgical robots the sense of touch that they lack today,” the university said.


Intel’s neuromorphic chips

Intel is supporting the researchers by providing a chip, deployed inside the robot, that draws precise conclusions from the skin’s sensory data in real time. “Making an ultra-fast artificial skin sensor solves about half of the puzzle of making robots smarter,” said Assistant Professor Benjamin Tee of the Department of Materials Science and Engineering and the Institute for Health Innovation & Technology at the National University of Singapore.

“They also need an artificial brain that can ultimately achieve perception and learning, another essential piece of the puzzle. Our unique demonstration of an AI skin system with neuromorphic chips such as the Intel Loihi is a big step towards energy efficiency and scalability.”

In their first experiment using Intel’s Loihi neuromorphic research chip, the researchers fitted a robotic hand with the artificial skin and used it to read Braille, passing the tactile data to Loihi via the cloud to convert the micro-bumps felt by the hand into “semantic meaning.” According to Intel, Loihi achieved over 92% accuracy in classifying Braille letters while using 20 times less power than a standard von Neumann processor.
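
To make the event-driven idea concrete, here is a minimal Python sketch of how tactile readings can be turned into events and classified. It is an illustration only, not the researchers’ actual pipeline: the taxel count, threshold and Braille templates are all invented for the example, and the real system ran on Loihi rather than the simple template match shown here.

```python
# Minimal sketch of event-based tactile encoding (illustrative assumptions
# throughout): a taxel emits an event only when its pressure changes beyond
# a threshold, and the resulting event counts are matched against templates.
import numpy as np

THRESHOLD = 0.05  # minimum pressure change (arbitrary units) that counts as an event

def encode_events(pressure_frames: np.ndarray) -> np.ndarray:
    """Convert a (time, taxels) pressure stream into per-taxel event counts.

    An 'event' is any frame-to-frame change larger than THRESHOLD: the core
    idea of event-based sensing is to transmit changes, not raw frames.
    """
    deltas = np.abs(np.diff(pressure_frames, axis=0))
    return (deltas > THRESHOLD).sum(axis=0)

def classify(event_counts: np.ndarray, templates: dict) -> str:
    """Nearest-template classification of an event-count signature."""
    return min(templates, key=lambda label: np.linalg.norm(event_counts - templates[label]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical event-count templates for two Braille letters on a 6-taxel pad.
    templates = {"A": np.array([8.0, 0, 0, 0, 0, 0]),
                 "B": np.array([8.0, 8, 0, 0, 0, 0])}
    # Simulate a swipe over a letter-'B'-like bump pattern plus sensor noise.
    frames = rng.normal(0, 0.01, size=(50, 6))
    frames[::6, [0, 1]] += 0.2  # periodic contact on taxels 0 and 1
    print(classify(encode_events(frames), templates))  # expected: B
```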

Combination of visual and tactile data

Building on this work, the researchers further improved robotic perception by combining visual and tactile data in a spiking neural network. To do this, they tasked a robot with classifying various opaque containers holding different amounts of liquid, using sensory inputs from the artificial skin and an event-based camera. Using the same tactile and vision sensors, they also tested the perception system’s ability to identify rotational slip, which is important for a stable grip.

The captured sensory data was then processed on both a GPU and Loihi to compare their capabilities. The researchers found that combining event-based vision and touch in the spiking neural network yielded 10% better accuracy in object classification than a vision-only system.
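
The intuition behind the accuracy gain can be shown with a small late-fusion sketch: each modality scores the candidate classes, and the fused decision averages those scores, so a confident touch signal can rescue an ambiguous visual one. The class names and scores below are invented for illustration; the published system learned this fusion with a spiking neural network rather than hand-built averaging.

```python
# Minimal sketch of late fusion for visual-tactile classification.
# All classes and scores are hypothetical, not data from the NUS/Intel work.
import numpy as np

CLASSES = ["empty", "half-full", "full"]  # hypothetical container classes

def softmax(x: np.ndarray) -> np.ndarray:
    e = np.exp(x - x.max())
    return e / e.sum()

def fused_prediction(vision_logits: np.ndarray, touch_logits: np.ndarray) -> str:
    """Average the two modalities' class probabilities and pick the argmax.

    When one sensor is ambiguous (e.g. vision cannot see inside an opaque
    container), the other modality's confident scores dominate the average.
    """
    p = 0.5 * softmax(vision_logits) + 0.5 * softmax(touch_logits)
    return CLASSES[int(np.argmax(p))]

if __name__ == "__main__":
    vision = np.array([0.1, 0.0, -0.1])  # vision alone is nearly uninformative
    touch = np.array([-1.0, 2.0, -1.0])  # weight/slip cues favour "half-full"
    print(fused_prediction(vision, touch))  # expected: half-full
```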

“We are thrilled with these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve the perception of robots. It is a step towards building efficient and reliable robots that can react quickly and appropriately in unexpected situations,” added Assistant Professor Harold Soh of the Department of Computer Science, NUS School of Computing.