IMPROVED SIMILARITY MEASURES FOR AMERICAN SIGN LANGUAGE RECOGNITION USING MANUAL HANDSHAPES AND HAND APPEARANCES
Date: 2016-05-10
Author: Swaminathan, Siddhartha Goutham (ORCID: 0000-0003-2706-5827)
Abstract
American Sign Language (ASL) is a primary language for approximately 0.5 to 2 million people in the United States who are deaf or hard of hearing [22][23]. When a user encounters an unfamiliar English word, he or she can look up its meaning in a dictionary. When an ASL user encounters an unknown sign, however, looking up the meaning of that sign is not an easy task. Many systems for looking up ASL signs require the user to specify articulatory properties such as handshapes, but these systems fail if what the user specifies differs even slightly from the corresponding entry in the ASL dictionary. An existing system provides a baseline similarity measure based on dynamic time warping (DTW) [8]: feature vectors are extracted from the trajectory of the hand, and DTW is applied to this time series of feature vectors to obtain a similarity measure between the query sign and each video in the database. Handshape, along with trajectory information, is another essential component of an ASL sign; however, recognizing different handshapes is not easy. The system implemented here improves on the methods proposed in [3], [2], and [5] by incorporating handshape information from the signer. The goal is to evaluate methods on two fronts: one based on manual handshape input from the user, considered a best case in which the user spends additional time manually specifying the handshapes that constitute a sign in order to attain better accuracy, and another based on hand appearances. We also investigate how well the method based on hand appearances fares against the near-perfect accuracy of the manual-handshape method.
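The DTW step described above can be sketched as follows. This is a minimal illustration of dynamic time warping over a time series of per-frame feature vectors, not the thesis's actual feature extraction or similarity pipeline; the function name, the Euclidean local cost, and the toy trajectories are all assumptions for demonstration.

```python
import numpy as np

def dtw_distance(query, reference):
    """DTW distance between two sequences of feature vectors
    (each row is one frame's feature vector). Lower = more similar.
    A minimal sketch; the thesis's trajectory/handshape features
    are not reproduced here."""
    n, m = len(query), len(reference)
    # cost[i, j] = minimum cumulative cost of aligning the first i
    # frames of the query with the first j frames of the reference
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(query[i - 1] - reference[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # query frame repeated
                                 cost[i, j - 1],      # reference frame repeated
                                 cost[i - 1, j - 1])  # frames matched one-to-one
    return cost[n, m]

# Toy example: two short 2-D hand trajectories, where b is a
# time-warped copy of a (one frame held longer).
a = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
b = np.array([[0.0, 0.0], [1.0, 1.0], [1.0, 1.0], [2.0, 2.0]])
print(dtw_distance(a, b))  # 0.0 — warping absorbs the timing difference
```

Because DTW aligns frames non-linearly in time, two signers producing the same sign at different speeds still yield a small distance, which is what makes it a natural baseline for comparing a query sign against a database of videos.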