Zahabi and her team are using virtual reality as well as a new driving simulator inside the Emerging Technologies Building to test new prosthetic interfaces. | Image: Dharmesh Patel

Reaching for something on the top shelf at the grocery store or brushing one's teeth before bed are tasks many people can do without much thought. But for an upper limb amputee using a prosthetic device, performing these same tasks can require far more mental effort.

Machine learning algorithms and computational models can provide insight into the mental demand that prosthetic devices place on their users. Dr. Maryam Zahabi and her team are using these models to improve current prosthetic interfaces, focusing on devices controlled through an electromyography-based human-machine interface.
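As a rough illustration of the idea (not the team's actual model), a simple regression can relate characteristics of an interface to measured workload scores. The feature names, synthetic data and model choice below are assumptions made for the sketch.

```python
# Illustrative sketch only: synthetic data and a simple linear model standing in
# for computational models of mental demand.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Hypothetical interface characteristics for 60 prototype configurations:
# number of control modes, feedback delay (ms), gestures needed per task.
interface_features = rng.uniform(low=[1, 50, 1], high=[6, 400, 10], size=(60, 3))

# Hypothetical workload ratings (e.g., NASA-TLX-style scores from user trials).
workload = rng.uniform(20, 90, size=60)

# Fit a model that predicts workload from interface characteristics, giving a
# rough sense of which features drive mental demand.
model = LinearRegression().fit(interface_features, workload)
print("estimated effect of each interface feature on workload:", model.coef_)
```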

Over half of upper limb amputees abandon prosthetic devices due to frustration during use. | Image: Getty Images

Electromyography (EMG) is a technique that records the electrical activity produced by muscles. When the user contracts a muscle, the recorded signals are passed to the interface, which translates them into a pattern of commands that move the prosthetic device.
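In broad strokes, an EMG-based interface works like a pattern-recognition pipeline: windows of muscle signal are reduced to features and classified into commands. The sketch below is a minimal, generic version of that idea; the window size, features and classifier are assumptions, not the specific interface Zahabi's team is studying.

```python
# Minimal sketch of an EMG pattern-recognition pipeline (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window):
    """Compute simple time-domain features from one window of EMG samples."""
    mav = np.mean(np.abs(window))                      # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))                # root mean square
    zero_crossings = np.sum(np.diff(np.sign(window)) != 0)
    waveform_length = np.sum(np.abs(np.diff(window)))
    return np.array([mav, rms, zero_crossings, waveform_length])

# Synthetic training data: 200 windows of 8-channel EMG, 4 gesture classes.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 8, 256))               # (windows, channels, samples)
labels = rng.integers(0, 4, size=200)                  # e.g., open, close, pronate, supinate

# Concatenate per-channel features into one vector per window.
X = np.array([np.concatenate([extract_features(ch) for ch in w]) for w in windows])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

# At run time, each new EMG window is mapped to a command for the prosthesis.
new_window = rng.normal(size=(8, 256))
features = np.concatenate([extract_features(ch) for ch in new_window])[None, :]
print("predicted command class:", clf.predict(features)[0])
```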

“There are over 100,000 people with upper limb amputations in the United States,” Zahabi said. “Currently there is very little guidance on which features in EMG-based human-machine interfaces are helpful in reducing the cognitive load of patients while performing different tasks.”

Testing different interface prototypes through virtual reality and driving simulations will allow the researchers to provide guidance to the engineers who design these interfaces. This will lead to better prosthetics for amputees and to other technologies built on EMG-based assistive human-machine interfaces.

This research is a collaboration between Texas A&M, North Carolina State University and the University of Florida and is supported by the National Science Foundation.

One-of-a-kind driving simulator

The Department of Industrial and Systems Engineering at Texas A&M University has installed a new driving simulator for research on driving, autonomous vehicles and projects like Zahabi's. The simulator, one of a kind on campus, can be driven manually or autonomously and offers a 270-degree field of view. Because so many different kinds of research can make use of a driving simulator, projects that incorporate the technology are almost inherently interdisciplinary.