This is the extraordinary moment a man was able to pick up a water bottle and pour himself a drink remotely using a cutting edge robotic hand.
The gadget works using a series of sensors which are attached to the base of the arm or stump, in the case of amputees.
They read muscle movements and send signals to the prosthetic hand, giving users control of every finger of the machine, as well as the ability to grasp and pick up items.
It is hoped the technology, which reacts to user movements within 0.4 seconds, could end the daily struggle of millions of amputees.
The robotic hand works using a series of sensors which are attached to the base of the arm below the elbow
Professor Aude Billard is one of the researchers behind the design at the EPFL research centre in Lausanne, Switzerland
The prosthetic hand uses machine learning to become familiar with the user’s muscle movements. The amputee must perform a series of hand gestures in order to train the algorithm.
Sensors placed on the amputee’s stump detect muscular activity, and the algorithm learns which hand movements correspond to which patterns of muscular activity.
Once the user’s intended finger movements are understood, this information can be used to control individual fingers of the prosthetic hand.
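The training step described above can be pictured in miniature: record muscle-activity patterns for each practised gesture, average them, then match new readings to the nearest learned pattern. This is a simplified, hypothetical sketch, not EPFL's actual algorithm; all names, numbers and the nearest-centroid approach are illustrative assumptions.

```python
# Hypothetical sketch: learning which finger movements correspond to
# which patterns of muscular activity, as described in the article.

def train(samples):
    """samples: list of (activity_vector, gesture_label) pairs recorded
    while the user performs training gestures. Learns one mean activity
    pattern (centroid) per gesture."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def decode(centroids, vec):
    """Pick the gesture whose learned pattern is closest to a new reading."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], vec))

# Each gesture is performed a few times while the sensors record activity.
training = [
    ([0.9, 0.1, 0.1], "index_flex"),
    ([0.8, 0.2, 0.0], "index_flex"),
    ([0.1, 0.9, 0.2], "thumb_flex"),
    ([0.2, 0.8, 0.1], "thumb_flex"),
]
model = train(training)
print(decode(model, [0.85, 0.15, 0.05]))  # index_flex
```

In practice the researchers describe using machine learning to filter noisy muscle signals; a real decoder would work on richer sensor features than this toy example.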
Researchers at the EPFL research centre in Lausanne, Switzerland, are behind the design.
They successfully trialled the robotic hand on three amputees and seven healthy participants.
Writing in the study, the research team said the technology ‘merges two concepts from two different fields of neuroprosthetics’.
One of the amputees controls a virtual robotic hand during the study while the sensors are hooked up to his stump
Sensors attached to the arm send signals to the prosthetic hand, giving users control of every individual finger of the machine, as well as the ability to grasp and pick up items
Lead author Katie Zhuang poses with the hand, which can react to user movements in a split second
They added: ‘One concept, from neuroengineering, involves deciphering intended finger movement from muscular activity on the amputee’s stump for individual finger control of the prosthetic hand, which has never before been done.
‘The other, from robotics, allows the robotic hand to help take hold of objects and maintain contact with them for robust grasping.’
Lead author Katie Zhuang said: ‘Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements.’
Equipped with pressure sensors all along the fingers, the robotic hand can react and stabilise an object before the brain realises that it is slipping.
Aude Billard, who leads EPFL’s Learning Algorithms and Systems Laboratory, said: ‘When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react. The robotic hand has the ability to react within 400 milliseconds.’
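The stabilising behaviour quoted above amounts to a fast feedback loop: watch fingertip pressure, and tighten the grip the moment pressure falls sharply. The sketch below is a hypothetical illustration of that idea only; the grip levels, threshold and sensor values are invented for the example.

```python
GRIP_LEVELS = 5  # assumed motor grip strength, 0 (open) to 5 (tightest)
SLIP_DROP = 0.2  # assumed drop in fingertip pressure treated as a slip

def stabilise(pressure_readings, grip_level=2):
    """Scan successive fingertip pressure samples and tighten the grip
    one level each time pressure falls sharply (the object slipping)."""
    last = pressure_readings[0]
    for p in pressure_readings[1:]:
        if last - p > SLIP_DROP:
            grip_level = min(GRIP_LEVELS, grip_level + 1)
        last = p
    return grip_level

# The object starts slipping at the third sample; the grip tightens once.
print(stabilise([0.8, 0.75, 0.4, 0.45]))  # 3
```

On the real hand this loop would have to complete well inside the quoted 400 millisecond window, which is why the reaction is handled by the hand itself rather than waiting on the user.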