The technology also has a motion sensor system to detect arm movements. This enables a user to control how the robotic hand grasps objects of different sizes through a simple sequence of muscle flexes and arm movements.
The research was carried out by Dr Ravi Vaidyanathan and Samuel Wilson, a research postgraduate, who are both from the Department of Mechanical Engineering at Imperial College London. It builds on the work of Dr Richard Woodward, a former student of Dr Vaidyanathan’s, who was a Dyson Foundation scholar at the College.
The team caution that they have not yet carried out patient clinical trials with the technology. They still have a number of refinements to complete before the sensor and motion tracking system can be commercialised. However, the researchers believe their work is a step forward in making robotic prosthetics more robust in their design, which could widen their usefulness for patients.
Dr Vaidyanathan said: “Put your ear against your bicep and use your hand to form a seal between it and your arm. Now flex your arm and you should be able to hear your muscles rumbling and gurgling. We have developed a sensor technology that sits on your arm to detect these rumbles and another device that detects arm motions. This technology takes us a step closer to providing prosthetics that are potentially more robust, accessible and easier for patients to use in the future.”
The team have carried out a preliminary demonstration of the system with Alex Lewis, who lost both legs, his right arm and the use of his other arm following a rare infection.
To operate the robotic hand, Lewis had a small armband placed around his bicep with a muscle sensor and motion-tracking electronics embedded in it. When he flexed his bicep, the vibrations produced were detected by the sensor, interpreted as signals and transmitted to a computer. A program then ran a mathematical algorithm designed to isolate Lewis's muscle signal and filter out other arm motions and sounds, converting it into a command for the robotic hand.
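The article does not publish the team's algorithm, but the general idea it describes (isolating deliberate muscle vibrations from slower arm motion and background noise) can be sketched in a few lines. Everything below is hypothetical and illustrative: the function name, window size and threshold are assumptions, not the researchers' implementation.

```python
import math

def detect_flex(samples, window=50, threshold=0.2):
    """Illustrative sketch of muscle-vibration detection (all names and
    parameters are hypothetical, not the Imperial team's algorithm).

    Slow drift from gross arm motion is removed by subtracting a
    moving-average baseline; a short-window RMS envelope is then
    thresholded, so a sustained rise in vibration energy is treated
    as a deliberate muscle flex that can be turned into a command.
    """
    n = len(samples)
    if n < window:
        return False
    # The moving-average baseline approximates low-frequency arm motion.
    baseline = sum(samples) / n
    highpassed = [s - baseline for s in samples]
    # RMS vibration energy over the most recent window of samples.
    recent = highpassed[-window:]
    rms = math.sqrt(sum(x * x for x in recent) / window)
    return rms > threshold
```

A steady signal (arm at rest) stays below the threshold, while a rapidly oscillating one (muscle "rumble") crosses it; in a real device this decision would then be sent to the prosthetic as a grip command.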
Lewis had the option of activating two different grip modes while the prosthetic lay detached from his arm on a lab bench. The first was a three-fingered pinch that could enable movements such as picking up a small object like a set of keys. The other was a power grip that could enable the prosthetic to grasp a larger object such as a glass of water.
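The two grip modes amount to a small command vocabulary selected by a sequence of flexes. A minimal sketch of that mapping, assuming (hypothetically) that one flex selects the pinch and two flexes the power grip:

```python
from enum import Enum

class Grip(Enum):
    # The two grip modes described in the demonstration.
    PINCH = "three-fingered pinch"   # small objects, e.g. a set of keys
    POWER = "power grip"             # larger objects, e.g. a glass of water

def select_grip(flex_count):
    """Map a simple sequence of muscle flexes to a grip mode.

    The one-flex / two-flex mapping is an assumption for illustration;
    the article does not specify the actual control sequence.
    """
    if flex_count == 1:
        return Grip.PINCH
    if flex_count == 2:
        return Grip.POWER
    return None  # unrecognised sequence: leave the hand unchanged
```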
Future refinements to the technology will include isolating the range of vibration interference that may make the hand open or close unexpectedly. The team also plan to refine the device so that it is more portable and enables the user to self-calibrate. This would remove the need for engineers in the set-up process. Adding more sensors would also expand the range of commands, so that the prosthetic can do more complex grasping tasks.
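Self-calibration of the kind the team describe could, in principle, mean learning a per-user detection threshold from the wearer's own resting signal rather than having an engineer tune it. A hedged sketch of that idea, with all names and the factor `k` chosen purely for illustration:

```python
import statistics

def calibrate_threshold(resting_samples, k=3.0):
    """Hypothetical self-calibration sketch (not the team's method).

    Sets the flex-detection threshold a few standard deviations above
    the user's resting-signal noise, so the device adapts to each
    wearer without manual tuning by an engineer.
    """
    mean = statistics.fmean(resting_samples)
    noise = statistics.pstdev(resting_samples)
    return mean + k * noise
```

A noisier resting signal would yield a higher threshold, making the device less prone to the unintended open/close commands the researchers want to eliminate.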
Lewis, who underwent amputation surgery following a very rare and devastating infection called Strep A Toxic Shock Syndrome, said: “It is really exciting to be part of this project to test the robotic hand system. Current prosthetics can be very cumbersome and hard on the body, so any technology that can reduce the burden on users is an important step forward. I look forward to seeing this device developed further.”
In the longer term, the team predict that the sensor system could also be adapted so that it could be used to control other technologies and appliances around the home, to further benefit people living with disabilities. They are also working with Imperial Innovations, which commercialises College research, to establish a start-up company to market the sensor and motion tracking technology.