
dc.contributor.advisor: Huber, Manfred
dc.creator: Pham, Tien
dc.date.accessioned: 2023-06-14T17:07:13Z
dc.date.available: 2023-06-14T17:07:13Z
dc.date.created: 2023-05
dc.date.issued: 2023-05-22
dc.date.submitted: May 2023
dc.identifier.uri: http://hdl.handle.net/10106/31269
dc.description.abstract: Human-Computer Interaction (HCI) is an essential aspect of modern technology that has revolutionized the way we interact with machines. With the proliferation of computers and smart devices and the advent of autonomous vehicles and other machines, advances in this area have made it possible for users to interact with technology intuitively and efficiently. However, the importance of HCI goes beyond the convenience of everyday technology: it has become crucial in the development of assistive technologies that empower people with disabilities to live more independently. Persons with disabilities, who lack control of one or more parts of their body or who have mental limitations, must interact with machines in special, often highly customized ways that match their individual capabilities. One machine that many people with severe physical disabilities interact with every day is the wheelchair, which has been used for decades to facilitate their lives. While the simple interfaces commonly available on wheelchairs, such as joysticks or sip-and-puff devices, are often sufficient, they are difficult or inconvenient to use for persons with severe disabilities who lack adequate control of their hands. This need, together with the quest for more intuitive, lower-overhead control, motivates research into other ways to interact with a wheelchair in the context of partially autonomous navigation. In this thesis, a context-aware gaze-based interface is developed that allows users to control the wheelchair naturally, without having to translate their eye gaze input into specific commands. The system estimates eye gaze directions and analyzes the locations users are looking at to obtain the context for inferring user navigation intention. A navigation detection model is also embedded in the system that can distinguish between a user's navigation intention, navigation-related attention, and non-navigation attention, and thereby serve as a driver for semi-autonomous smart wheelchair systems.
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.subject: HCI
dc.subject: Smart wheelchair
dc.subject: Gaze estimation
dc.subject: Machine learning
dc.title: Context-Aware Gaze-Based Interface for Smart Wheelchair
dc.type: Thesis
dc.date.updated: 2023-06-14T17:07:13Z
thesis.degree.department: Computer Science and Engineering
thesis.degree.grantor: The University of Texas at Arlington
thesis.degree.level: Masters
thesis.degree.name: Master of Science in Computer Science
dc.type.material: text
dc.creator.orcid: 0000-0001-5632-9209
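
The abstract above describes a pipeline that turns an estimated gaze direction plus scene context into one of three classes: navigation intention, navigation-related attention, or non-navigation attention. The following is a minimal Python sketch of that three-way classification step only. All names (GazeSample, classify), the dwell-time threshold, and the traversability flag are illustrative assumptions; the thesis itself uses a learned navigation detection model, not hand-written rules.

from dataclasses import dataclass
from enum import Enum, auto

class Intention(Enum):
    NAVIGATION = auto()          # user wants the wheelchair to go where they look
    NAVIGATION_RELATED = auto()  # user is attending to the route, no command implied
    NON_NAVIGATION = auto()      # gaze is unrelated to driving

@dataclass
class GazeSample:
    yaw: float             # estimated gaze direction (radians), hypothetical input
    pitch: float
    dwell_s: float         # how long the gaze has rested on the current target
    on_traversable: bool   # scene context: is the gazed-at location drivable floor?

def classify(sample: GazeSample) -> Intention:
    """Toy stand-in for the thesis's learned navigation detection model."""
    if not sample.on_traversable:
        # Gaze on a non-drivable target (person, wall, object) is not navigation.
        return Intention.NON_NAVIGATION
    if sample.dwell_s >= 1.0:
        # Long, steady dwell on drivable ground is read as a navigation request;
        # the 1.0 s threshold is an arbitrary placeholder.
        return Intention.NAVIGATION
    # Brief glances along the path count as navigation-related attention.
    return Intention.NAVIGATION_RELATED

if __name__ == "__main__":
    sample = GazeSample(yaw=0.1, pitch=-0.4, dwell_s=1.5, on_traversable=True)
    print(classify(sample))  # Intention.NAVIGATION

In a semi-autonomous setting, only the NAVIGATION class would trigger motion planning toward the gazed-at location; the other two classes let the user look around freely without issuing commands, which is the point of a context-aware rather than command-based interface.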

