Recommended Citation
January 1, 2019.
Abstract
The JPL BioSleeve is a gesture control interface worn on the forearm. It contains an array of surface electromyogram (sEMG) sensors that read the electrical activity produced by muscle flexion and extension, along with inertial sensors that measure arm motion. An embedded processor uses machine learning algorithms to recognize activity patterns and generate control commands from hand gestures. The goal of this project is to use Matlab and Unreal Engine to convert the muscle activity into inputs for controlling a simulated robot in a virtual environment. To demonstrate viability, we are creating a virtual environment containing a model of the International Space Station (ISS) in which a SPHERES robot is navigated in an Unreal simulation. There are various other means of controlling robots, but many are cumbersome and limit the user’s ability to interact with objects in their vicinity. By developing a wearable gesture control interface, the person controlling the robot can interact with the environment and command the robot more intuitively.
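The pipeline the abstract describes (sEMG windows → learned activity patterns → robot commands) can be illustrated with a minimal sketch. This is not the BioSleeve's actual Matlab implementation; the RMS feature, the nearest-centroid classifier, the gesture labels, and the command mapping below are all assumptions chosen for brevity, written in Python purely for illustration.

```python
import math

# Hypothetical sketch of a gesture-to-command pipeline: each EMG channel
# window is reduced to an RMS amplitude feature, and the feature vector is
# matched to the nearest per-gesture centroid (assumed learned offline).
# All labels, centroids, and commands are illustrative placeholders.

def rms(window):
    """Root-mean-square amplitude of one EMG channel window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

# Assumed per-gesture RMS centroids, one feature per channel (2 channels here).
CENTROIDS = {
    "fist": [0.80, 0.75],  # strong flexion on both channels
    "open": [0.30, 0.25],  # lighter extension
    "rest": [0.05, 0.05],  # baseline activity
}

# Assumed mapping from recognized gesture to a robot control command.
COMMANDS = {"fist": "STOP", "open": "FORWARD", "rest": "HOLD"}

def classify(channels):
    """Return the nearest-centroid gesture label for a multi-channel window."""
    feats = [rms(ch) for ch in channels]

    def dist(label):
        return sum((f - c) ** 2 for f, c in zip(feats, CENTROIDS[label]))

    return min(CENTROIDS, key=dist)

def command_for(channels):
    """Map one multi-channel EMG window to a control command string."""
    return COMMANDS[classify(channels)]
```

In a real system the classifier would also fuse the inertial-sensor stream and run continuously on the embedded processor, emitting commands that the Unreal simulation consumes to drive the SPHERES robot.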
Mentor
Chris Asad
Lab site
NASA Jet Propulsion Laboratory (JPL)
Funding Acknowledgement
The 2018 STEM Teacher and Researcher Program and this project have been made possible through support from Chevron (www.chevron.com), the National Marine Sanctuary Foundation (www.marinesanctuary.org), the National Science Foundation through the Robert Noyce Program under Grant #1836335 and 1340110, the California State University Office of the Chancellor, and California Polytechnic State University in partnership with NASA Jet Propulsion Laboratory. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the funders.
URL: https://digitalcommons.calpoly.edu/star/492