Show simple item record

dc.contributor.advisor: Popa, Dan
dc.contributor.advisor: Alexandrakis, Georgios
dc.creator: Yetkin, Oguz
dc.date.accessioned: 2016-10-25T19:42:20Z
dc.date.available: 2016-10-25T19:42:20Z
dc.date.created: 2016-08
dc.date.issued: 2016-08-19
dc.date.submitted: August 2016
dc.identifier.uri: http://hdl.handle.net/10106/26133
dc.description.abstract: Modern robotic prosthetic devices for upper limb amputees promise to alleviate an important disability, but are underutilized because users cannot properly control them. Specifically, the devices afford more degrees of freedom (DOFs) than are controllable by easily decoded biological signals. These devices, such as the DEKA arm, can have as many as 18 DOFs, although six is a more typical number (control of each finger plus thumb rotation). Unfortunately, the use of these devices remains limited by the inability of users to control more than one degree of freedom at a time with commercially deployed technology. Control of robotic prosthetic devices is typically achieved through electromyogram (EMG) signals read from the residual limb. While several groups have reported using multiple EMG sensors to classify user intent from residual muscle activity, such systems have not proven robust enough to translate to clinical use and are not intuitive. In the first part of this research, the prosthetic control problem is reframed as a Human Robot Interface problem, developing and clinically evaluating several robotic interface methods that can eliminate or complement the use of EMG signals while allowing the user to quickly achieve more grasping patterns, thus allowing the use of all the DOFs available in the prosthetic device. Three healthy-limb-based methods have been developed and evaluated: 1) use of the healthy hand to teleoperate the prosthetic device via a Mirroring Glove; 2) use of the healthy hand to issue pre-programmed commands to the prosthetic device via a Gesture Glove; and 3) use of the healthy hand with extremely light fingernail-worn devices to issue commands to the prosthetic device.
In the second part of this research, a field-deployable, easy-to-use method of training a multiple-input EMG classifier is presented and extended to use Force Myography (FMG) data fused with EMG data. Overall, a number of experiments were conducted with a total of 20 human subjects, including 2 amputees, and the following conclusions were reached: 1) healthy-limb-based prosthetic device control can match the performance speed of EMG-based control with very little training; 2) gesture-based control with the healthy limb is faster than mirrored teleoperation, except for tasks that are mirrored by their nature; and 3) bilateral hand movements combined with kinematic tracking of the healthy limb can be used to train both a Force Myography (FMG) based classifier and an EMG-based classifier, and the combination of the two modalities holds promise to make a readily deployable multi-DOF EMG/FMG classifier system a reality.
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.subject: Prosthetic device control
dc.subject: Fingernail based gesture tracker
dc.subject: Gesture tracking
dc.subject: Prosthetics
dc.subject: Electromyography
dc.subject: Force myography
dc.subject: EMG
dc.subject: FMG
dc.subject: Multi degree of freedom prosthetic device
dc.subject: Neural networks
dc.title: INTUITIVE HUMAN ROBOT INTERFACES FOR UPPER LIMB PROSTHETICS
dc.type: Thesis
dc.contributor.committeeMember: Kim, Young-Tae
dc.contributor.committeeMember: Athitsos, Vassilis
dc.degree.department: Bioengineering
dc.degree.name: Doctor of Philosophy in Biomedical Engineering
dc.date.updated: 2016-10-25T19:44:28Z
thesis.degree.department: Bioengineering
thesis.degree.grantor: The University of Texas at Arlington
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy in Biomedical Engineering
dc.type.material: text
dc.creator.orcid: 0000-0001-8602-2339

