Research Team: Dr Ian Williams, Adrien Rapeaux, Dr Timothy Constandinou

Collaborators: Dr Kianoush Nazarpour (Newcastle University), Dr Francisco Sepulveda (University of Essex), Dr Liudi Jiang (University of Southampton), Dr Ed Chadwick (Keele University), Dr Paul Steenson and Dr Rory J O'Connor (University of Leeds)

Funding: Engineering and Physical Sciences Research Council (EPSRC) EP/M025977/1

Project website: www.senseback.com


An artificial arm, or prosthesis, is an example of technology that can help somebody perform essential activities of daily living after a serious injury results in the loss of an arm. Such activities might include eating, washing, opening doors, or shaking hands with a friend. Many artificial arms on the market today are highly sophisticated, offering individual finger movement, and even movement of segments within a finger, producing motion that resembles that of the natural arm and hand.

These prosthetic arms are often controlled by sensing the contractions in the muscles of the remaining arm to which the prosthesis is attached, allowing the user to operate the arm by flexing those muscles. However, one key element currently missing from artificial arms is sensory feedback. In other words, the user does not know where the arm is, or how wide open the hand is, without looking at it, and when a delicate object is picked up there is no sense of how hard it is being gripped. This leads to slow and awkward use of the artificial arm and prevents its use from becoming truly natural.

The goal of this project is to develop technologies that will enable the next generation of assistive devices to provide truly natural control through enhanced sensory feedback. Our long-term vision is for artificial arms that provide the user with a sense of feedback that recreates the natural feedback associated with a real arm.

To enable this level of feedback, we must meet two clear objectives: to generate artificial signals that mimic those of the natural arm and hand, and to provide a means of delivering those signals to the nervous system of a prosthesis user. 

These objectives will be achieved by: building new fingertip sensors to give the prosthesis a realistic sense of touch, including pressure, shear and temperature; developing a 'virtual hand' that mimics the nerve impulses that would be produced by a real hand, giving the user a sense of position of an artificial hand; and designing electrodes and a stimulation system that can deliver the simulated nerve impulses directly to the individual's nervous system.

Building this level of feedback into prosthetic devices will enable much higher levels of function than are currently possible. Device users would be able to naturally reach out and pick up a glass, for example, whilst maintaining eye contact in a conversation, or pick up an apple without bruising it. This will advance the field of prosthetics, provide enhanced function to prosthesis users and decrease the learning time involved in acquiring a new device.

Relevant Publications