Interactive-ASL-Recognition

Using Google's MediaPipe framework, the OpenCV library, and a good deal of self-teaching, I built a program that takes in the hand gesture displayed by the user and translates it into the corresponding ASL letter. The backend relies on two main MediaPipe modules: palm detection, which finds the hand and produces a cropped image of it, and hand landmark detection, which locates 21 landmarks on the hand and reports each one's position. Using condition statements that compare the values of specific landmarks, I was able to tell the computer which hand gesture signifies which letter of the ASL alphabet. For example, for A, I had to tell the computer that if the landmark on the tip of the thumb is greater than the landmarks of the other fingers, then it should output A.
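As a rough illustration of this landmark-comparison approach (a hypothetical sketch, not the program's exact rules), the check for A might look like the following. MediaPipe indexes the 21 hand landmarks 0-20 (0 is the wrist, 4 the thumb tip, 8/12/16/20 the four fingertips, 6/10/14/18 the joints below them), and image y-coordinates grow downward, so a curled finger has its tip at a greater y than the joint beneath it:

```python
# Hypothetical landmark-based rule for the letter "A" (a closed fist
# with the thumb resting alongside). The function name and thresholds
# are illustrative assumptions, not the author's exact code.

def classify_letter(landmarks):
    """landmarks: list of 21 (x, y) tuples in normalized image coords,
    as reported by MediaPipe's hand landmark module."""
    tips = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
    pips = [6, 10, 14, 18]   # the joint below each of those tips

    # All four fingers curled: each tip sits below (greater y than)
    # the joint beneath it.
    fingers_curled = all(landmarks[t][1] > landmarks[p][1]
                         for t, p in zip(tips, pips))

    # Thumb tip higher in the image (smaller y) than every curled
    # fingertip -- the comparison the text describes for "A".
    thumb_above_tips = all(landmarks[4][1] < landmarks[t][1]
                           for t in tips)

    if fingers_curled and thumb_above_tips:
        return "A"
    return None  # no rule matched
```

In the real program, the `landmarks` list would come from MediaPipe's per-frame detection results, and additional branches like this one would cover the other letters.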

For the program to work, make sure the mediapipe and opencv packages are installed.
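Assuming a Python environment, both dependencies can be installed with pip (note that the OpenCV package on PyPI is named `opencv-python`):

```shell
pip install mediapipe opencv-python
```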
