Show simple item record

dc.contributor.advisor: Makedon, Fillia
dc.creator: Lioulemes, Alexandros
dc.date.accessioned: 2017-10-02T13:42:59Z
dc.date.available: 2017-10-02T13:42:59Z
dc.date.created: 2017-08
dc.date.issued: 2017-08-08
dc.date.submitted: August 2017
dc.identifier.uri: http://hdl.handle.net/10106/26945
dc.description.abstract: A traffic accident, a battlefield injury, or a stroke can lead to brain or musculoskeletal injuries that impair motor and cognitive functions and can drastically change a person's life. In such situations, rehabilitation plays a critical role in the patient's ability to partially or totally regain motor function, but the optimal training approach remains unclear. Robotic technologies are recognized as powerful tools to promote neuroplasticity and stimulate motor re-learning. Moreover, they deliver high-intensity, repetitive, active, task-oriented training and provide objective measurements for patient evaluation. The primary focus of this research is the development of a safe human-robot interaction assessment and training system that exploits physiological, kinematic, and dynamic modalities. Such a system places the user in the robot's control loop by feeding back the patient's biomechanical, physiological, and cognitive states. The proposed vision-based upper-limb monitoring system and adaptive haptic guidance control mechanism incorporate human intentions to generate adaptive perception and behavior for the Barrett WAM robotic arm. To this end, an integration of computer vision, artificial intelligence, and human-robot interaction research is employed on the multi-sensing robotic platform.
This dissertation proposes computational methods for a multimodal upper-limb robot-aided system: first, a virtual reality environment that assesses the user's physiological and psychological states; second, an interface that estimates a patient's performance using motion analysis and pattern recognition methods; third, an unobtrusive method, presented and experimentally validated, for reconstructing upper-limb kinematics during robot-aided tasks with end-effector machines using Microsoft Kinect skeletal tracking; and fourth, an adaptive haptic guidance robotic controller that modulates the complexity of the assigned motor tasks and improves the user's hand coordination abilities. Finally, we show seven applications of robots in assistive environments and present the human-robot interaction usability case studies that are critical evaluation components of this dissertation.
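The abstract's third contribution, reconstructing upper-limb kinematics from skeletal tracking, rests on computing joint angles from tracked 3-D joint positions. A minimal sketch of that idea is shown below; it is not the dissertation's actual method, and the function name and coordinates are illustrative assumptions:

```python
import math

def elbow_flexion_deg(shoulder, elbow, wrist):
    """Elbow flexion angle (degrees) from three 3-D joint positions,
    as a skeletal tracker such as the Kinect might report them.
    Coordinates are hypothetical, in meters."""
    # Vectors from the elbow toward the shoulder and the wrist
    u = [s - e for s, e in zip(shoulder, elbow)]
    v = [w - e for w, e in zip(wrist, elbow)]
    # Angle between the two vectors via the normalized dot product
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    cos_t = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp rounding error
    return math.degrees(math.acos(cos_t))

# Fully extended arm along the x-axis: flexion angle is 180 degrees
print(round(elbow_flexion_deg((0, 0, 0), (0.3, 0, 0), (0.6, 0, 0)), 1))  # → 180.0
```

A monitoring system would evaluate such angles on every tracked frame, yielding the kinematic time series that the assessment and adaptive-control components consume.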
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.subject: Robotics
dc.subject: Haptics
dc.subject: Computer vision
dc.subject: Artificial intelligence
dc.subject: Human-robot interaction
dc.title: An Intelligent Multimodal Upper-Limb Rehabilitation Robotic System
dc.type: Thesis
dc.degree.department: Computer Science and Engineering
dc.degree.name: Doctor of Philosophy in Computer Science
dc.date.updated: 2017-10-02T13:45:07Z
thesis.degree.department: Computer Science and Engineering
thesis.degree.grantor: The University of Texas at Arlington
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy in Computer Engineering
dc.type.material: text
dc.creator.orcid: 0000-0001-7278-4922

