Show simple item record
dc.contributor.author | Hazra, Sudip | |
dc.contributor.author | Whitaker, Shane | |
dc.contributor.author | Shiakolas, Panos S. | |
dc.date.accessioned | 2023-11-06T17:01:35Z | |
dc.date.available | 2023-11-06T17:01:35Z | |
dc.date.copyright | ASME ©; CC-BY distribution | |
dc.date.issued | 2023-05-23 | |
dc.identifier.uri | http://hdl.handle.net/10106/31839 | |
dc.description.abstract | In assistive robotics, Brain-Computer Interface research aims to understand human intent in order to enhance Human-Robot Interaction and augment human performance. This research introduces a framework that enables a person with an upper limb disability to use an assistive system toward maintaining self-reliance, and discusses its implementation and evaluation. The framework interlinks functional components and establishes a behavioral sequence to operate the assistive system in three stages: action classification, verification, and execution. An action is classified based on identified human intent and verified through haptic and/or visual feedback before execution. Human intent is conveyed through facial expressions and verified through head movements. The interlinked functional components are an EEG sensing device, a head movement recorder, a dual-purpose glove, a visual feedback environment, and a robotic arm. Five volunteers evaluated the ability of the system to recognize facial expressions, the time required to respond using head movements, the ability to convey information through vibrotactile feedback effects, and the ability to follow the established behavioral sequence. Based on the evaluation, a personalized training data set should be used to calibrate facial expression recognition and to define the time required to respond during verification. Custom vibrotactile effects were effective in conveying system information to the user. The volunteers were able to follow the behavioral sequence and control the system with a success rate of 80.00%, providing confidence to recruit more volunteers to identify and address improvements and to expand the operational capability of the framework. | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | American Society of Mechanical Engineers | en_US |
dc.subject | Human Robot Interaction, Brain Computer Interface, Vibrotactile Haptic Feedback, Virtual Environment, Assistive Robotics, Interaction Framework | en_US |
dc.subject | Human Robot Interaction, Brain Computer Interface, Vibrotactile Feedback, Process Verification, Virtual Environment, Assistive Robotics, Interaction Framework | |
dc.title | Design and implementation of a behavioral sequence framework for human-robot interaction utilizing brain-computer interface and haptic feedback | en_US |
dc.type | Article | en_US |
Files in this item
- Name:
- Design and Implementation.pdf
- Size:
- 3.942 MB
- Format:
- PDF
- Description:
- PDF
This item appears in the following Collection(s)