dc.contributor.advisor | Makedon, Fillia | |
dc.creator | Lioulemes, Alexandros | |
dc.date.accessioned | 2017-10-02T13:42:59Z | |
dc.date.available | 2017-10-02T13:42:59Z | |
dc.date.created | 2017-08 | |
dc.date.issued | 2017-08-08 | |
dc.date.submitted | August 2017 | |
dc.identifier.uri | http://hdl.handle.net/10106/26945 | |
dc.description.abstract | A traffic accident, a battlefield injury, or a stroke can lead to brain or musculoskeletal injuries that impair motor and cognitive functions and drastically change a person's life. In such situations, rehabilitation plays a critical role in the patient's ability to partially or totally regain motor function, but the optimal training approach remains unclear. Robotic technologies are recognized as powerful tools to promote neuroplasticity and stimulate motor re-learning: they deliver high-intensity, repetitive, active, task-oriented training, and they provide objective measurements for patient evaluation.
The primary focus of this research is the development of a safe human-robot interaction assessment and training system that utilizes physiological, kinematic, and dynamic modalities. Such a system places the user in the robot's control loop by feeding back the patient's biomechanical, physiological, and cognitive states. A proposed vision-based upper-limb monitoring system and an adaptive haptic guidance control mechanism incorporate human intent to generate adaptive perception and behavior for the Barrett WAM robotic arm. To facilitate this, techniques from computer vision, artificial intelligence, and human-robot interaction research are combined on the multi-sensing robotic platform.
This dissertation proposes computational methods for a multimodal upper-limb robot-aided system: first, a virtual reality environment that assesses the user's physiological and psychological states; second, an interface that estimates a patient's performance using motion analysis and pattern recognition methods; third, an unobtrusive, experimentally validated method for reconstructing upper-limb kinematics during robot-aided tasks with end-effector machines using Microsoft Kinect's skeletal tracking; fourth, an adaptive haptic guidance robotic controller that modulates the complexity of the assigned motor tasks to increase the user's hand coordination abilities. Finally, we show seven applications of robots in assistive environments and present the human-robot interaction usability case studies that form critical evaluation components of this thesis. | |
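The adaptive haptic guidance controller described in the abstract can be illustrated with a minimal "assist-as-needed" sketch. This is a hypothetical simplification, not the controller from the dissertation: the function names, the single scalar gain, and the spring-like guidance law are all assumptions made for illustration. The idea is that the robot's guidance gain shrinks as the user's tracking error improves, so a skilled user receives less assistance.

```python
# Hypothetical assist-as-needed sketch (not the dissertation's controller):
# the guidance gain is raised when tracking error exceeds a target level
# and lowered when the user outperforms it, clamped to [g_min, g_max].

def adapt_guidance_gain(gain, tracking_error, target_error=0.02,
                        rate=0.5, g_min=0.0, g_max=1.0):
    """Return an updated guidance gain.

    gain           -- current guidance gain in [g_min, g_max]
    tracking_error -- mean deviation from the reference trajectory (m)
    target_error   -- error level at which the gain is held constant
    rate           -- adaptation speed (gain change per meter of error)
    """
    # Positive when the user performs worse than the target -> more assistance
    delta = rate * (tracking_error - target_error)
    return min(g_max, max(g_min, gain + delta))


def guidance_force(gain, stiffness, reference, position):
    """Spring-like guidance force pulling the hand toward the reference path."""
    return gain * stiffness * (reference - position)
```

Run per trial: after each repetition, the measured tracking error updates the gain, and the scaled spring force is commanded to the robot's end effector on the next repetition. A real controller on a Barrett WAM would operate on 3-D positions at the servo rate and include damping and safety limits.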
dc.format.mimetype | application/pdf | |
dc.language.iso | en_US | |
dc.subject | Robotics | |
dc.subject | Haptics | |
dc.subject | Computer vision | |
dc.subject | Artificial intelligence | |
dc.subject | Human-robot interaction | |
dc.title | An Intelligent Multimodal Upper-Limb Rehabilitation Robotic System | |
dc.type | Thesis | |
dc.degree.department | Computer Science and Engineering | |
dc.degree.name | Doctor of Philosophy in Computer Science | |
dc.date.updated | 2017-10-02T13:45:07Z | |
thesis.degree.department | Computer Science and Engineering | |
thesis.degree.grantor | The University of Texas at Arlington | |
thesis.degree.level | Doctoral | |
thesis.degree.name | Doctor of Philosophy in Computer Engineering | |
dc.type.material | text | |
dc.creator.orcid | 0000-0001-7278-4922 | |
Files in this item
- Name:
- LIOULEMES-DISSERTATION-2017.pdf
- Size:
- 59.99 MB
- Format:
- PDF