Recently, a team of researchers developed a new technology that decodes neuromuscular signals to control prosthetic wrists and hands. The work relies on computer models that closely mimic the behavior of the natural structures in the wrist, forearm, and hand. The technology could also be used to create new computer interfaces for applications such as computer-aided design (CAD) and gaming. It has performed well in early testing, but it has yet to enter clinical trials, meaning it is years away from commercial availability. The research was led by a team in the joint biomedical engineering program of the University of North Carolina at Chapel Hill and North Carolina State University.
Existing state-of-the-art prosthetics rely on machine learning (ML) to take a pattern recognition approach to prosthetic control. In this approach, users teach the device to recognize specific patterns of muscle activity, which are then translated into commands such as opening or closing a prosthetic hand. He (Helen) Huang, a professor in the program and senior author of a paper on the work, says that pattern recognition control requires patients to go through a lengthy process of training their prosthesis, which can be both time-consuming and tedious. “We wanted to concentrate on what is already known about the human body. This is not only more instinctive for users, but also more practical and reliable,” she added.
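To make the pattern recognition approach described above concrete, here is a minimal, purely illustrative sketch: a toy nearest-centroid classifier that is calibrated on labeled muscle-activity feature vectors and then maps a new reading to a command. This is not the researchers' actual system; all function names, channel counts, and data values are hypothetical.

```python
# Illustrative sketch of pattern-recognition prosthetic control (hypothetical,
# not the researchers' actual method): calibrate on labeled muscle-activity
# patterns, then map a new reading to the nearest learned pattern's command.

def centroid(samples):
    """Mean feature vector of a list of equal-length feature vectors."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def train(labeled_samples):
    """labeled_samples: {command: [feature_vector, ...]} -> centroid model."""
    return {cmd: centroid(vecs) for cmd, vecs in labeled_samples.items()}

def classify(model, features):
    """Return the command whose learned centroid is closest to the reading."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda cmd: sq_dist(model[cmd], features))

# Hypothetical calibration data: two muscle-signal channels, amplitudes in
# arbitrary units, recorded while the user repeats each intended gesture.
training = {
    "open":  [[0.9, 0.1], [0.8, 0.2]],
    "close": [[0.1, 0.9], [0.2, 0.8]],
}
model = train(training)
print(classify(model, [0.85, 0.15]))  # a reading resembling the "open" pattern
```

The repeated calibration recordings in `training` stand in for the lengthy user-training sessions the article describes; the researchers' alternative replaces this data-driven mapping with models of the body's own musculoskeletal structure.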