
Bio-robotics Labs
In my final semester of college, I had the privilege of taking a fairly new class in Bio-robotics and Cybernetics. The bulk of the class was spent teaching machine learning techniques, algorithms, and applications at the intersection of biology and technology, but what it lacked was hands-on application of those techniques. This is where the lab stepped in and rounded out the whole experience. The following sections detail the four main labs, which covered bioelectrical signal collection, processing, and analysis, primarily using self-developed (or modified) machine learning algorithms for classification.

Myo Armband
EMG Signals
This lab studied how forearm muscle activity could be picked up by a set of surface electrodes and turned into commands for a computer. The electrodes were first housed in the Myo armband for simplicity, followed by the BioRadio device for a more realistic feel of having to place electrodes properly. Using pre-made software, the Myo controlled the cursor on the computer screen through generic hand gestures like wrist twists, flexes, and curled fists. Once comfortable with this, data was collected and processed with bandpass and lowpass filters, along with rectification and an envelope detector, in order to classify the rock, paper, scissors gestures.
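As a rough illustration of that processing chain, the sketch below (written in Python with SciPy, which is an assumption; the lab's actual tooling is not specified here) bandpasses one raw EMG channel, rectifies it, and lowpasses the result to form an envelope. The 200 Hz rate matches the Myo's EMG stream, but the cutoff frequencies are typical textbook choices rather than the lab's exact values.

    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 200  # Hz; the Myo armband streams EMG at roughly this rate

    def emg_envelope(raw, fs=FS):
        # Bandpass 20-90 Hz: keeps muscle activity, drops motion artifact and drift
        b, a = butter(4, [20 / (fs / 2), 90 / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, np.asarray(raw, dtype=float))
        # Full-wave rectification
        rectified = np.abs(filtered)
        # Lowpass at ~5 Hz acts as the envelope detector
        b_env, a_env = butter(2, 5 / (fs / 2), btype="low")
        return filtfilt(b_env, a_env, rectified)

    # One simple feature per gesture window: the mean envelope of each Myo
    # channel, which a classifier can then separate into rock / paper / scissors.
    def window_features(channels, fs=FS):
        return [float(np.mean(emg_envelope(ch, fs))) for ch in channels]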
BioRadio
EOG Signals
This lab, using the same BioRadio as the previous lab, studied the voltage signals produced by eye movements. With an electrode placed on each temple and one above and below the right eye, data was collected and processed to see whether the different eye movements could be tracked. As a proof of operation, and for fun, the signals were used to play a game, steering a spaceship through an asteroid field much like the arcade game Asteroids.
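A minimal sketch of how those electrode pairs might be turned into a direction call, assuming a horizontal differential channel (temple to temple) and a vertical one (above minus below the right eye); the threshold is purely a placeholder that would need tuning to the actual recordings.

    import numpy as np

    def classify_eye_movement(horizontal, vertical, threshold=100e-6):
        # Average each channel over the current window of samples (in volts)
        h = float(np.mean(horizontal))
        v = float(np.mean(vertical))
        if abs(h) < threshold and abs(v) < threshold:
            return "center"  # no clear deflection in this window
        # The stronger deflection decides the axis; its sign decides the direction
        if abs(h) >= abs(v):
            return "left" if h > 0 else "right"
        return "up" if v > 0 else "down"

    # In the game loop, the returned label would map to a spaceship command
    # (strafe left/right, thrust up/down) for each incoming window.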


BioPac BioHarness
ECG Signals
Of all the labs in this class, this one required the most physical activity but the least processing. ECG signals were recorded while running, jogging, and doing pushups, squats, and sit-ups, and the data was processed to see whether each activity produced a distinguishable signal. The results showed minimal differences, other than that the running and jogging portions could be identified separately from the rest of the activities.
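One lightweight way to compare the activity segments is to estimate a mean heart rate per segment from the R peaks, as in the sketch below. This is an illustrative assumption in Python with SciPy, not the lab's actual pipeline; the filter band and peak constraints are generic defaults.

    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def mean_heart_rate(ecg, fs):
        # Bandpass 5-15 Hz emphasizes the QRS complex and removes baseline wander
        b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
        qrs = filtfilt(b, a, np.asarray(ecg, dtype=float))
        # R peaks: at least 0.3 s apart (caps detection at 200 BPM) and taller
        # than twice the filtered signal's standard deviation
        peaks, _ = find_peaks(qrs, distance=int(0.3 * fs), height=2 * np.std(qrs))
        rr = np.diff(peaks) / fs  # R-R intervals in seconds
        return 60.0 / float(np.mean(rr)) if len(rr) else float("nan")

    # Comparing mean_heart_rate() across the run, jog, pushup, squat, and sit-up
    # segments gives a quick sense of which activities separate cleanly.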
Unicorn Brain Cap
EEG Signals

Utilizing the Unicorn Brain Cap, this lab used EEG signals from the brain to track the thought processes behind identifying cat and dog images. EEG was recorded from electrode positions around the scalp (highlighted in yellow above) while cat and dog images flashed on a computer screen. By silently counting the cat images, the brain was expected to produce a measurable electrical response for each cat identification, which should be trackable. The signals were run through intensive processing to eliminate 60 Hz noise and other undesirable artifacts. The results recovered the correct image count with approximately 75% accuracy.
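A sketch of that kind of cleanup and counting logic is shown below: notch out 60 Hz mains noise, bandpass the EEG, then average fixed windows locked to image onsets so that a response tied to the counted cat images stands out above uncorrelated noise. The 250 Hz rate matches the Unicorn cap's published sampling rate; everything else (window length, filter band, the averaging approach itself) is a generic assumption rather than the lab's exact pipeline.

    import numpy as np
    from scipy.signal import iirnotch, butter, filtfilt

    FS = 250  # Hz; the Unicorn cap samples at 250 Hz

    def clean_eeg(raw, fs=FS):
        # Notch out 60 Hz mains interference
        b_n, a_n = iirnotch(60.0, Q=30.0, fs=fs)
        x = filtfilt(b_n, a_n, np.asarray(raw, dtype=float))
        # Bandpass 0.5-30 Hz keeps the slow event-related activity of interest
        b, a = butter(4, [0.5 / (fs / 2), 30 / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    def average_epochs(eeg, onset_samples, fs=FS, window_s=0.8):
        # Average fixed-length windows locked to each image onset; a response
        # tied to the counted cat images should survive the averaging while
        # uncorrelated noise cancels out.
        n = int(window_s * fs)
        epochs = [eeg[i:i + n] for i in onset_samples if i + n <= len(eeg)]
        return np.mean(epochs, axis=0)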