Show simple item record

dc.contributor.advisor      Huber, Manfred
dc.creator                  Villa, Francisco
dc.date.accessioned         2022-01-25T18:29:31Z
dc.date.available           2022-01-25T18:29:31Z
dc.date.created             2021-12
dc.date.issued              2022-01-06
dc.date.submitted           December 2021
dc.identifier.uri           http://hdl.handle.net/10106/30244
dc.description.abstract     Programming robot systems to perform certain tasks is a significant challenge, especially if such programming is to be performed by persons who are not experts in robotics. For example, when programming a robot to serve as an exercise trainer, the person defining the motions might more naturally be an expert in the exercise domain rather than in robotics. To address this, this thesis investigates programming by demonstration or teleoperation using full direct body motion. The goal is to reproduce gaits, gestures, and postures on a humanoid robot from observed human demonstrations. Fine motor movements, such as movements of the fingers, are excluded from this thesis's paradigm. Mimicking such movements is straightforward if human and robot dynamics and kinematics are the same: the robot can move exactly as its human demonstrator moves. This, however, is never the case, leading to a multidimensional correspondence problem. In this thesis, an approach is presented that attempts to resolve this by addressing four components. First, it links the degrees of freedom of two similar but different bodies, the human and a humanoid robot. Second, it links the image frame of the demonstrator observation to the demonstrator's structure, using 3D computer vision with a human skeleton model, so that changing joint locations can be tracked. As such skeleton observations are usually noisy, a third process aims at filtering out sensor noise and recognition errors, resulting in an observed motion trajectory. Lastly, to account for the differences between the human and robotic bodies, both in terms of kinematics and dynamic stability, a modeling and learning framework, PILCO, is adapted to map the observed trajectory into an executable imitation that obeys the stability requirements and limitations of the humanoid robot system.
dc.format.mimetype          application/pdf
dc.language.iso             en_US
dc.subject                  Robotics
dc.subject                  Machine learning
dc.title                    ADAPTIVE HUMAN-ROBOT MOTION TRANSFER FOR COMPLETE BODY IMITATION
dc.type                     Thesis
dc.degree.department        Computer Science and Engineering
dc.degree.name              Master of Science in Computer Engineering
dc.date.updated             2022-01-25T18:29:31Z
thesis.degree.department    Computer Science and Engineering
thesis.degree.grantor       The University of Texas at Arlington
thesis.degree.level         Masters
thesis.degree.name          Master of Science in Computer Engineering
dc.type.material            text
dc.creator.orcid            0000-0003-2309-847X
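
The record above contains no code, but the third component named in the abstract, filtering sensor noise and recognition errors out of skeleton observations to obtain a motion trajectory, can be illustrated concretely. The sketch below is a minimal stand-in under stated assumptions, not the method from the thesis: it uses a simple exponential moving average as the filter, and the function name smooth_skeleton_trajectory, the (frames, joints, 3) array layout, and the alpha parameter are all hypothetical choices for illustration.

    import numpy as np

    def smooth_skeleton_trajectory(frames: np.ndarray, alpha: float = 0.3) -> np.ndarray:
        """Causally smooth a noisy skeleton trajectory.

        frames: array of shape (T, J, 3) -- T time steps, J joints, and a
                3D position per joint, as a skeleton tracker might produce.
        alpha:  smoothing factor in (0, 1]; smaller values smooth more.

        NOTE: an exponential moving average is only an illustrative
        stand-in for the noise-filtering step described in the abstract.
        """
        smoothed = np.empty_like(frames)
        smoothed[0] = frames[0]
        for t in range(1, len(frames)):
            # Blend each new observation with the running estimate to
            # suppress per-frame sensor noise and recognition jitter.
            smoothed[t] = alpha * frames[t] + (1.0 - alpha) * smoothed[t - 1]
        return smoothed

    # Usage on synthetic data: 100 frames of a 15-joint skeleton with noise.
    rng = np.random.default_rng(0)
    clean = np.linspace(0, 1, 100)[:, None, None] * np.ones((100, 15, 3))
    noisy = clean + rng.normal(scale=0.05, size=clean.shape)
    trajectory = smooth_skeleton_trajectory(noisy)

A causal filter like this introduces some lag but needs no future frames, which matters if the imitation is driven by live teleoperation rather than a prerecorded demonstration.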

