dc.contributor.advisor	Huber, Manfred
dc.creator	Datta, Debayan
dc.date.accessioned	2022-06-28T15:12:47Z
dc.date.available	2022-06-28T15:12:47Z
dc.date.created	2022-05
dc.date.issued	2022-05-17
dc.date.submitted	May 2022
dc.identifier.uri	http://hdl.handle.net/10106/30419
dc.description.abstract	Gesture control as a replacement for more conventional remote-control operation has been pursued for a long time with varying degrees of success. The use of gestures to control human interfaces is today seen predominantly in the multimedia sector: people perform simple, intuitive gestures to control their televisions, interact with multimedia, and play games. Much research has also explored gesture interfaces for augmented- and virtual-reality devices and tasks, and the encouraging results have led researchers to extend gesture control to physical, real-world objects. One such platform is the control of an Unmanned Aerial Vehicle (UAV) in flight using hand movements and other deliberate gestures.

While gestures are a promising and intuitive way to control such a platform, they also bring a number of challenges and risks that need to be addressed. The main challenges arise from the versatility and universal character of gestures, which makes them susceptible to misinterpretation of user intent, particularly in the presence of real-world distractions that can lead to unintentional gestures outside the context of the control task. Because hand gestures are so varied, a recognition framework can misclassify a non-gesture movement, produced under natural or artificial distraction, as a control signature and cause a dramatic failure of the UAV. For instance, if the operator is caught up in an unanticipated situation and performs a similar-looking but unintended movement that the framework misinterprets as a known gesture, executing the resulting command might destroy the drone.

The work presented here focuses on scenarios in which the human subject is exposed to different types of distraction while controlling the UAV with hand gestures. Whenever a distraction is identified from brain signals or hand movements, UAV control is suspended to avoid self-inflicted or collateral damage; once the distraction abates, the UAV returns to a ready mode to receive new commands. Experiments are performed with a wearable Electromyography (EMG) armband and a non-invasive Electroencephalography (EEG) sensor worn on the head. The armband combines EMG and Inertial Measurement Unit (IMU) sensors that capture the muscle-activity and motion data, while the EEG sensor provides the brain data used to identify distracted brain sequences during operation. An intuitive gesture set for drone operation is designed, and a neural-network pipeline is set up to identify both the gesture performed and the current state of the brain. The pipeline consists of a Long Short-Term Memory (LSTM) network that classifies the gestures and an anomaly detector that identifies periods of operation corresponding to distractions, during which the human is likely no longer focused on UAV operation. The LSTM correctly classified 99.52% of the gesture set, and the anomaly detector achieved an accuracy of 90.0006%. The precision of the distraction classifier was 91.86% and its recall 93.89% on a dataset in which nearly 30% of the samples were distracted. Signals were recorded from a single human subject in 3-minute intervals over 20 repetitions, with rest periods of 5-10 minutes to avoid brain and muscle fatigue.
dc.format.mimetype	application/pdf
dc.language.iso	en_US
dc.subject	Robotics
dc.subject	Bio-sensing
dc.subject	Human-activity data
dc.subject	Artificial intelligence
dc.subject	Control system
dc.subject	Gesture recognition
dc.subject	Distraction detection
dc.subject	Intention recognition
dc.subject	Neural network
dc.subject	LSTM
dc.subject	Anomaly detector
dc.title	DISTRACTION DETECTION AND INTENTION RECOGNITION FOR GESTURE-CONTROLLED UNMANNED AERIAL VEHICLE OPERATION
dc.type	Thesis
dc.degree.department	Computer Science and Engineering
dc.degree.name	Master of Science in Computer Science
dc.date.updated	2022-06-28T15:12:47Z
thesis.degree.department	Computer Science and Engineering
thesis.degree.grantor	The University of Texas at Arlington
thesis.degree.level	Masters
thesis.degree.name	Master of Science in Computer Science
dc.type.material	text
dc.creator.orcid	0000-0002-1188-0838
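
The pipeline described in the abstract, an LSTM gesture classifier whose commands are gated by a distraction (anomaly) detector, can be sketched in code. The following is a minimal illustration only, not the thesis implementation: the network sizes, channel counts, window length, the autoencoder-based detector, and the suspension threshold are all assumptions introduced here.

# Minimal sketch of the described pipeline: an LSTM classifies windowed
# EMG/IMU data into gestures, and an autoencoder-style anomaly detector
# on EEG features suspends UAV commands while the operator is distracted.
# All dimensions, names, and the threshold below are illustrative assumptions.
import torch
import torch.nn as nn

N_CHANNELS = 12      # assumed EMG + IMU channels per time step
WINDOW = 150         # assumed time steps per gesture window
N_GESTURES = 6       # assumed size of the gesture set
EEG_DIM = 64         # assumed flattened EEG feature size

class GestureLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(N_CHANNELS, 64, batch_first=True)
        self.head = nn.Linear(64, N_GESTURES)

    def forward(self, x):                 # x: (batch, WINDOW, N_CHANNELS)
        _, (h, _) = self.lstm(x)          # final hidden state summarizes the window
        return self.head(h[-1])           # per-gesture logits

class EEGAnomalyDetector(nn.Module):
    """Autoencoder: high reconstruction error is treated as distraction."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(EEG_DIM, 16), nn.ReLU())
        self.dec = nn.Linear(16, EEG_DIM)

    def error(self, x):                   # x: (batch, EEG_DIM)
        return ((self.dec(self.enc(x)) - x) ** 2).mean(dim=1)

def control_step(gesture_net, anomaly_net, emg_imu, eeg, threshold=0.5):
    """Return a gesture command, or None to keep the UAV in suspended mode."""
    if anomaly_net.error(eeg).item() > threshold:   # distraction detected
        return None                                 # suspend control
    return gesture_net(emg_imu).argmax(dim=1).item()

# Usage with random stand-in data:
g, a = GestureLSTM(), EEGAnomalyDetector()
cmd = control_step(g, a, torch.randn(1, WINDOW, N_CHANNELS), torch.randn(1, EEG_DIM))
print("command:", cmd if cmd is not None else "suspended")

The gating order matters: the distraction check runs before any gesture is decoded, matching the abstract's requirement that control be suspended first and only resume, in ready mode, once distractions abate.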

