SignSense is a tool meant to help people learn sign language independently. Unlike many spoken languages, sign language has no program like Duolingo that can give learners real-time feedback on grammar and pronunciation. SignSense confronts this problem by using motion sensors to track the user's movements and send them to a learning program on the user's computer. More specifically, I used a combination of flex sensors and an absolute orientation sensor attached to a glove to track my hand movements. Those sensor values are read by a microcontroller and sent to a program on my computer that drives a 3D model mimicking my hand. The 3D model helps learners see what they should be doing with their hands. My hope for this project is that it makes learning sign language less intimidating and therefore increases the number of people who take it up.
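To give a rough sense of the glove-to-computer pipeline, here is a minimal Arduino-style C++ sketch. It is only an illustration under assumptions: it assumes five flex sensors wired to analog pins A0–A4 and an Adafruit BNO055 as the absolute orientation sensor (a common part for this role, not necessarily the one I used), and it streams one comma-separated line of readings over serial for the computer-side program to parse.

```cpp
// Sketch of the sensor-reading loop (assumed wiring): five flex sensors on
// A0-A4 and a BNO055 absolute orientation sensor over I2C. Each loop reads
// the sensors and prints one comma-separated line over serial, which the
// program on the computer can parse to pose the 3D hand model.
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);      // sensor ID is an assumption
const int flexPins[5] = {A0, A1, A2, A3, A4};   // one pin per finger (assumed)

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("BNO055 not detected");
    while (1) {}                                // halt if the IMU is missing
  }
}

void loop() {
  // Read the five flex sensors (0-1023 on a 10-bit ADC).
  for (int i = 0; i < 5; i++) {
    Serial.print(analogRead(flexPins[i]));
    Serial.print(",");
  }

  // Read absolute orientation (Euler angles in degrees) from the BNO055.
  sensors_event_t event;
  bno.getEvent(&event);
  Serial.print(event.orientation.x);
  Serial.print(",");
  Serial.print(event.orientation.y);
  Serial.print(",");
  Serial.println(event.orientation.z);

  delay(50);                                    // roughly 20 samples per second
}
```

The computer-side program would then read these lines from the serial port, map the flex values to finger bend angles and the Euler angles to wrist orientation, and update the 3D hand model accordingly.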
More information about my code, circuit, and CAD design can be found HERE on my project summary page.