Glove Based American Sign Language Interpretation Using Convolutional Neural Network And Data Glass

2020 IEEE Region 10 Symposium (TENSYMP) - Technology for Impactful Sustainable Development (2020)

Abstract
Communication is essential to living in a society, but it poses a serious challenge for people with hearing disabilities. Because they converse using sign language, it is very difficult for others who do not know sign language to understand them. The purpose of this paper is therefore to create an interpreter that converts American Sign Language into English. Using a convolutional neural network, we built such an interpreter. The sign language interpretation system combines image processing, a convolutional neural network, and a data glass. First, we created a database of 16,500 samples (500 samples per letter/gesture) covering the basic sign language letters and gestures. Using a convolutional neural network, we then trained a model that distinguishes between 33 hand gestures (the 26 English alphabet letters and 7 gestures). Images are captured by a Raspberry Pi camera, processed and matched against the model on a Raspberry Pi, and the interpreted result is displayed on a data glass.
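The paper itself includes no code, but the pipeline it describes (capture → image processing → CNN classification over 33 classes → display) can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' implementation: the input size (64×64), the gesture names, and the stub linear "model" standing in for the trained CNN are all assumptions.

```python
import numpy as np

# 33 classes: 26 letters + 7 gestures (the paper does not name the
# gestures; these names are hypothetical placeholders)
LETTERS = [chr(ord("A") + i) for i in range(26)]
GESTURES = ["space", "delete", "yes", "no", "hello", "thanks", "sorry"]
CLASSES = LETTERS + GESTURES  # len == 33

def preprocess(frame, size=64):
    """Stand-in for the paper's image processing step:
    grayscale, square-crop, nearest-neighbour resize, normalize to [0, 1]."""
    gray = frame.mean(axis=2)          # RGB -> grayscale
    s = min(gray.shape)
    crop = gray[:s, :s]                # square crop from top-left
    idx = np.linspace(0, s - 1, size).astype(int)
    return crop[np.ix_(idx, idx)] / 255.0

def classify(img, logits_fn):
    """Run a model (any callable producing 33 logits) and return the label."""
    logits = logits_fn(img)
    probs = np.exp(logits - logits.max())  # stable softmax
    probs /= probs.sum()
    return CLASSES[int(np.argmax(probs))]

# Stub "model": a random linear map standing in for the trained CNN
rng = np.random.default_rng(0)
W = rng.normal(size=(64 * 64, 33))
model = lambda x: x.ravel() @ W

# Fake camera frame in place of the Raspberry Pi camera capture
frame = rng.integers(0, 256, size=(120, 160, 3)).astype(float)
label = classify(preprocess(frame), model)
print(label)  # the predicted letter/gesture would be sent to the data glass
```

In the actual system the `logits_fn` would be the trained CNN and `frame` would come from the Raspberry Pi camera; the predicted label would then be rendered on the data glass display.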
Keywords
American sign language, image processing, convolutional neural network, data glass