New Technology Allows Robots to Feel Touch Without Artificial Skin

A team of researchers has developed a breakthrough technology that gives robots a sense of touch without artificial skin or dedicated tactile instrumentation, opening up a wide range of possibilities for physical interaction between humans and machines.

This new approach, whose development was led by Maged Iskandar of the Institute for Robotics and Mechatronics at the German Aerospace Centre (DLR), allows robots to detect and interpret human contact without covering their surfaces with biomimetic skins or specialized, expensive sensors.

‘The intrinsic sense of touch that we propose in this work could serve as the basis for an advanced category of physical human-robot interaction that has not yet been possible, allowing a shift from conventional modalities to adaptability, flexibility and intuitive operation,’ the authors emphasized.

The details of the study were published today in the journal Science Robotics.

The sense of touch is a property that allows humans to interact delicately with their physical environment.

To interact physically with humans, robots must be equipped with sensitive yet durable sensors that detect the applied force, an instrumentation that can be expensive and complicated for large or curved robotic surfaces.

To overcome these challenges, Iskandar’s team used the instrumentation already built into the Safe Autonomous Robotic Assistant, a robotic arm with high-resolution force and torque sensors in its joints that, in addition to registering the applied force, measure position and guide movement.

Thanks to these sensors and artificial intelligence, the robot can detect where and in what order a human touches it, sensitively perceiving its surroundings and precisely localizing the applied tactile trajectories in time and space on its surface.

The researchers combined this ability with various learning algorithms to interpret the applied touch and showed that, using neural networks, the robot could recognize numbers or letters traced on its surface.

So if a human draws the number six on the robot’s surface, the technology recognizes that the number is in fact a six.
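
The study used neural networks for this recognition step; as a much simpler stand-in, the idea of classifying a traced stroke can be sketched with nearest-template matching on normalized trajectories. Everything here (the templates, the normalization, the strokes) is a hypothetical illustration, not the authors' implementation.

```python
import math

def normalize(stroke):
    """Translate a 2-D stroke to its centroid and scale it to unit size,
    so position and size on the surface do not affect matching."""
    xs = [p[0] for p in stroke]
    ys = [p[1] for p in stroke]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in stroke]

def stroke_distance(a, b):
    """Mean point-to-point distance between two equal-length strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(stroke, templates):
    """Return the label of the template stroke closest to the input.

    Assumes each template has the same number of points as the input;
    a real system would resample strokes to a fixed length first.
    """
    s = normalize(stroke)
    return min(templates,
               key=lambda label: stroke_distance(s, normalize(templates[label])))

# Hypothetical templates: an "L" shape and a horizontal bar.
templates = {"L": [(0, 0), (0, 1), (0, 2), (1, 2)],
             "-": [(0, 0), (1, 0), (2, 0), (3, 0)]}

# The same "L" shape, traced elsewhere on the surface, still matches.
print(classify([(5, 5), (5, 6), (5, 7), (6, 7)], templates))  # L
```

A neural network, as used in the study, learns a far more robust version of this mapping from many example traces instead of fixed templates.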

In addition, the team expanded this mechanism to include ‘virtual buttons’ or sliders on the robot’s surfaces that could be used to activate specific commands or movements.
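
The mapping from a touch location to a command can be sketched as a simple lookup over surface regions. The layout and command names below are invented for illustration; the article does not specify how the team's virtual buttons are defined.

```python
# Hypothetical virtual-button layout: each region of the arm's surface
# (here a 1-D coordinate along the link, in metres) maps to a command.
BUTTONS = {
    (0.00, 0.10): "stop",
    (0.10, 0.20): "open_gripper",
    (0.20, 0.30): "close_gripper",
}

def command_for_touch(position):
    """Return the command whose region contains the touch position,
    or None if the touch falls outside every virtual button."""
    for (lo, hi), command in BUTTONS.items():
        if lo <= position < hi:
            return command
    return None

print(command_for_touch(0.15))  # open_gripper
print(command_for_touch(0.50))  # None
```

A slider would extend this idea by mapping the continuous touch position within a region to a continuous parameter, such as movement speed.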

The authors suggest that this approach provides the system with an intuitive and precise sense of touch and increases the range of possible physical interactions between humans and robots, opening up ‘unexplored opportunities in terms of intuitive and flexible interaction’.
