dc.contributor.author     Hazra, Sudip
dc.contributor.author     Whitaker, Shane
dc.contributor.author     Shiakolas, Panos S.
dc.date.accessioned       2023-11-06T17:01:35Z
dc.date.available         2023-11-06T17:01:35Z
dc.date.copyright         ASME ©; CC-BY distribution
dc.date.issued            2023-05-23
dc.identifier.uri         http://hdl.handle.net/10106/31839
dc.description.abstract   In assistive robotics, research in Brain Computer Interface aims to understand human intent to enhance Human-Robot Interaction and augment human performance. In this research, a framework to enable a person with an upper limb disability to use an assistive system towards maintaining self-reliance is introduced, and its implementation and evaluation are discussed. The framework interlinks functional components and establishes a behavioral sequence to operate the assistive system in three stages: action classification, verification, and execution. An action is classified based on identified human intent and verified through haptic and/or visual feedback before execution. The human intent is conveyed through facial expressions and verified through head movements. The interlinked functional components are an EEG sensing device, a head movement recorder, a dual-purpose glove, a visual feedback environment, and a robotic arm. Five volunteers evaluated the ability of the system to recognize facial expressions, the time required to respond using head movements, the effectiveness of conveying information through vibrotactile feedback effects, and their ability to follow the established behavioral sequence. Based on the evaluation, a personalized training data set should be used to calibrate facial expression recognition and to define the time required to respond during verification. Custom vibrotactile effects were effective in conveying system information to the user. The volunteers were able to follow the behavioral sequence and control the system with a success rate of 80.00%, thus providing confidence to recruit more volunteers to identify and address improvements and expand the operational capability of the framework.   en_US
dc.language.iso           en_US   en_US
dc.publisher              American Society of Mechanical Engineers   en_US
dc.subject                Human Robot Interaction, Brain Computer Interface, Vibrotactile Haptic Feedback, Virtual Environment, Assistive Robotics, Interaction Framework   en_US
dc.subject                Human Robot Interaction, Brain Computer Interface, Vibrotactile Feedback, Process Verification, Virtual Environment, Assistive Robotics, Interaction Framework
dc.title                  Design and implementation of a behavioral sequence framework for human-robot interaction utilizing brain-computer interface and haptic feedback   en_US
dc.type                   Article   en_US
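
The three-stage behavioral sequence described in the abstract (action classification, verification, execution) can be illustrated with a minimal sketch. The Python code below is not the authors' implementation: the expression-to-action mapping, the confirmation gestures, and all function names are hypothetical stand-ins, with the EEG-based facial expression recognizer, head-movement recorder, and robotic arm replaced by simple stubs.

from typing import Optional

# Hypothetical mapping from recognized facial expressions to robot actions.
EXPRESSION_TO_ACTION = {
    "smile": "grasp_object",
    "clench": "release_object",
    "raise_brow": "move_home",
}


def classify(expression: str) -> Optional[str]:
    """Stage 1 - action classification: map a recognized facial
    expression (the conveyed intent) to a candidate robot action."""
    return EXPRESSION_TO_ACTION.get(expression)


def verify(head_movement: str) -> bool:
    """Stage 2 - verification: the candidate action is presented to the
    user (via haptic and/or visual feedback in the described framework)
    and confirmed or rejected through a head movement; here a nod confirms."""
    return head_movement == "nod"


def execute(action: str) -> None:
    """Stage 3 - execution: send the verified action to the robotic arm
    (stubbed here as a print statement)."""
    print(f"executing: {action}")


def run_sequence(expression: str, head_movement: str) -> bool:
    """Run one pass of the behavioral sequence; return True only if an
    action was classified, verified, and executed."""
    action = classify(expression)
    if action is None:
        return False  # unrecognized expression: nothing to verify
    if not verify(head_movement):
        return False  # user rejected the classified action
    execute(action)
    return True


if __name__ == "__main__":
    # Simulated trial: the user smiles (intent) and nods (confirmation).
    assert run_sequence("smile", "nod") is True
    # Rejected trial: a head shake stops the sequence before execution.
    assert run_sequence("clench", "shake") is False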

