Students Present Research Projects at Northeast Bioengineering Conference

Two student projects will be presented as part of the undergraduate design competition at the 40th Annual Northeast Bioengineering Conference (NEBEC), held April 25 to 27 in Boston, under the guidance of Dr. Jaydip Desai, assistant professor of biomedical engineering.

Human Forearm Myoelectric Signals Used for Robotic Hand Control: Electromyography (EMG) has many applications, one of which is controlling prosthetic devices. Before raw EMG signals can be used effectively, however, they must first pass through a signal-conditioning chain that includes amplification, filtering, rectification, and analog-to-digital conversion. Students Jordan Roell and Jessica Sikula designed a circuit for controlling a robotic hand using surface EMG (sEMG) signals acquired from the forearm.
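
To give a sense of what that conditioning chain accomplishes, the sketch below works through equivalent steps in software rather than in the students' analog circuit: a sampled signal is high-pass filtered to remove baseline drift, full-wave rectified, and smoothed into an envelope that tracks contraction strength. The pin number and filter constants are illustrative assumptions, not details of the actual design.

    // Illustrative software analogue of the EMG conditioning chain
    // (filtering, rectification, smoothing); the students' design performs
    // these steps in analog hardware before the analog-to-digital converter.
    const int EMG_PIN = A0;     // assumed ADC input pin
    float baseline = 0.0f;      // slowly tracked DC offset
    float envelope = 0.0f;      // smoothed, rectified activity level

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      float sample = analogRead(EMG_PIN);   // raw digitized sample

      // High-pass behavior: subtract a slowly updated baseline estimate.
      baseline += 0.01f * (sample - baseline);
      float ac = sample - baseline;

      // Full-wave rectification: keep only the signal magnitude.
      float rectified = fabs(ac);

      // Low-pass smoothing to form an envelope of muscle activity.
      envelope += 0.05f * (rectified - envelope);

      Serial.println(envelope);             // envelope tracks contraction strength
      delay(1);                             // roughly 1 kHz sampling
    }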

During the project, surface electrodes were placed over five muscles of the forearm. As the muscles were contracted, the resulting signals were conditioned by the circuit and converted to digital signals that controlled the servomotors of the robotic hand. The hand has five servomotors, one for each finger and the thumb, and pulse-width modulation (PWM) was used to communicate between the microcontroller and the servomotors.
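
A minimal sketch of that control step is shown below, assuming Arduino-style firmware with the standard Servo library; the pin assignments and the direct mapping from signal level to finger angle are illustrative assumptions rather than the students' actual implementation.

    #include <Servo.h>

    // Illustrative sketch only: five conditioned sEMG channels drive five
    // finger servos over PWM. Pins and the level-to-angle mapping are
    // assumptions, not the students' actual wiring or firmware.
    const int NUM_FINGERS = 5;
    const int EMG_PINS[NUM_FINGERS]   = {A0, A1, A2, A3, A4};  // ADC inputs
    const int SERVO_PINS[NUM_FINGERS] = {3, 5, 6, 9, 10};      // PWM-capable outputs

    Servo fingers[NUM_FINGERS];

    void setup() {
      for (int i = 0; i < NUM_FINGERS; i++) {
        fingers[i].attach(SERVO_PINS[i]);   // Servo library generates the PWM pulses
      }
    }

    void loop() {
      for (int i = 0; i < NUM_FINGERS; i++) {
        int level = analogRead(EMG_PINS[i]);      // 0-1023 conditioned sEMG level
        int angle = map(level, 0, 1023, 0, 180);  // map activity to servo angle
        fingers[i].write(angle);                  // stronger contraction, larger flexion
      }
      delay(20);                                  // update at roughly 50 Hz
    }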

Analysis of the data collected during the project indicates that the design provides a useful, low-cost method for controlling a robotic hand through sEMG signal acquisition.

Emotiv EEG Headset Used for Robotic Hand Control: The Emotiv™ electroencephalogram (EEG) headset is a non-invasive, 14-channel brain-computer interface; in this project it was used to record neural responses to stimuli from five male and five female participants.

Student Adrienne Kline worked on this project, in which the stimuli were instructions of "right" or "left," presented successively in both visual and auditory formats. Participants raised their right or left arm in response to the instruction given. The subjects were stimulated and their data recorded using a scenario generated in OpenViBE™. In addition, EEGLAB™ was used in conjunction with Matlab® to identify the electrodes most informative during visual and auditory stimulation, which aided in developing real-time processing of EEG signals from the Emotiv™ headset. Filtering and feature extraction were accomplished with the Simulink® signal processing toolbox, and an Arduino™ microcontroller acted as the bridge between the Emotiv™ EEG headset and the Mecha TE Hand™.
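
One plausible form of that bridge, sketched below purely as an illustration: after filtering and feature extraction on the host computer, a single-character "left" or "right" decision is sent to the Arduino™ over serial, and the microcontroller translates it into a hand movement. The baud rate, pin choice, and command-to-gesture mapping are assumptions, not a description of the project's firmware.

    #include <Servo.h>

    // Illustrative bridge sketch: the host PC classifies the EEG response as
    // "left" or "right" and sends a single character over serial; the Arduino
    // translates it into a hand gesture. Pin, baud rate, and the mapping of
    // commands to gestures are assumptions for illustration only.
    Servo handServo;          // stand-in for the hand's servo channel(s)
    const int HAND_PIN = 9;   // assumed PWM-capable pin

    void setup() {
      Serial.begin(9600);
      handServo.attach(HAND_PIN);
      handServo.write(90);    // neutral position
    }

    void loop() {
      if (Serial.available() > 0) {
        char cmd = Serial.read();
        if (cmd == 'L') {
          handServo.write(0);     // e.g., close the hand for a "left" decision
        } else if (cmd == 'R') {
          handServo.write(180);   // e.g., open the hand for a "right" decision
        }
      }
    }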