Santa Clara : Santa Clara University, 2018.


Disciplines

Bioengineering; Computer Engineering; Electrical Engineering

First Advisor

Sally Wood

Second Advisor

Ahmed Amer

Third Advisor

Yuling Yan


Abstract

Most products that offer user interaction rely on buttons or switches for the user to select actions to perform. Such designs are typically controlled with direct motions, such as touch or voice, and are seldom designed with consideration for those unable to exercise direct control. In this project, we designed technology that reads naturally occurring biosignals from the body and applies those signals to any interface. For our specific application, we chose to implement a keyboard: instead of teaching the fingers how to type on a mechanical keyboard, the body can designate an action with a more native motion. We aim to take ‘body language’ to the next level. By making the human body the centerpiece, and building the interconnects between people (reading and comprehending EMG signals), we strive to create a more interconnected world. Using our custom implementation of an analog-to-digital converter, the amplitudes of EMG signals at carefully placed muscle probes are collected and translated into a digital signal. The resulting signal values are sent to a remote server, where key characteristics are calculated. The backend of the system consists of a mathematical model that continuously uses these calculated characteristics to re-parameterize itself for recognition. After the signals are recognized, they are assigned an appropriate output at the user’s request. This document includes the requirements, design, use cases, risk tables, workflow, and architecture of the device we developed.
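The abstract does not specify which "key characteristics" the server computes from the digitized EMG signal. As an illustration only, the sketch below computes several time-domain features commonly used in EMG pattern recognition (mean absolute value, root mean square, waveform length, zero crossings); the function name and feature set are assumptions for this example, not the thesis's actual method.

```python
import math

def emg_features(window):
    """Compute common time-domain features from one window of EMG samples.

    These are illustrative stand-ins for the 'key characteristics'
    mentioned in the abstract; the actual features used in the
    project may differ.
    """
    n = len(window)
    # Mean absolute value: average rectified amplitude of the window.
    mav = sum(abs(x) for x in window) / n
    # Root mean square: proxy for signal power / contraction intensity.
    rms = math.sqrt(sum(x * x for x in window) / n)
    # Waveform length: cumulative sample-to-sample variation.
    wl = sum(abs(window[i] - window[i - 1]) for i in range(1, n))
    # Zero crossings: count of sign changes, a crude frequency measure.
    zc = sum(1 for i in range(1, n) if window[i - 1] * window[i] < 0)
    return {"mav": mav, "rms": rms, "wl": wl, "zc": zc}

# Example on a toy alternating signal.
features = emg_features([0.5, -0.5, 0.4, -0.4, 0.3, -0.3])
```

In a pipeline like the one described, each feature vector would be forwarded to the recognition model, which re-parameterizes itself as new vectors arrive.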