Smart Glove for Interpreting the ASL (American Sign Language) Alphabet

Author: Andrei-Alexandru Georgescu
Group: 332-CD

Introduction

The intent is to make a hand-worn device that uses several sensors to internally store the position of each finger, as well as the hand's orientation measured with a gyroscope, and then uses this data to interpret, via a set of logistic regressions, the corresponding letter of the standard ASL alphabet.
The ML-based interpretation of the gestures makes the device highly personalizable, as it can be trained for a specific individual's range of motion and manner of making the gestures.
Its primary purpose is to interpret gestures, but once trained it can also be used in an educational fashion, as an LED signals whether a gesture is recognized.
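Concretely, with one regression score per letter, the individual regressions combine into a single multinomial (softmax) model. This is the standard formulation; the exact variant used in the project is an assumption here:

$$P(y = k \mid x) = \frac{e^{w_k \cdot x + b_k}}{\sum_{j=1}^{K} e^{w_j \cdot x + b_j}}$$

where $x$ is the feature vector built from the finger and wrist orientations, $w_k$ and $b_k$ are the learned weights and bias for letter $k$, and $K$ is the number of gesture classes.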

General description


An important part of the project's logic is based on the 6 IMUs used. One is the “Reference Sensor”, mounted on the wrist so that it experiences little of the hand's rotational movement. The other 5 are each mounted on a fingertip and funnel their outputs to the controller through an I²C multiplexer. The microcontroller in turn requests data from each sensor over the I²C line and uses the values to determine the position of each fingertip relative to the wrist.
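As an illustration, a finger's flexion can be estimated by comparing the accelerometer-derived pitch of its fingertip IMU against the pitch of the wrist reference. This is a minimal sketch, assuming readings are taken while gravity dominates the measured acceleration:

```cpp
#include <math.h>

// Pitch angle (radians) from a static accelerometer reading;
// valid while gravity dominates the measured acceleration.
float pitchFromAccel(float ax, float ay, float az) {
  return atan2f(-ax, sqrtf(ay * ay + az * az));
}

// Flexion of one finger, approximated as the fingertip's pitch
// relative to the wrist reference IMU.
float fingerFlexion(float tipPitch, float wristPitch) {
  return tipPitch - wristPitch;
}
```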

The project has two possible running modes, which determine how the positional data is used:

- a learning mode, in which labeled samples of a gesture are recorded and used to fit the regression model, and
- an interpretation mode, in which the trained model classifies the current hand position as a letter.

Hardware Design

Components:

- ESP32 development board (microcontroller)
- 6× MPU6500 IMUs (1 wrist reference + 5 fingertip sensors)
- TCA9548 I²C multiplexer
- LCD1602 display
- status LED

Connections: In my project I used the following color coding:

The pins used are as follows:

Physical Wiring:

Software Design

Development Medium: Arduino IDE

Libraries:

Algorithms and Data Structures:
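A plausible sketch of how one training sample could be represented, assuming the features are per-finger orientation angles relative to the wrist reference (the feature and class counts are assumptions, not taken from the project):

```cpp
#include <stdint.h>

const int NUM_FEATURES = 10;  // e.g. pitch and roll for each of the 5 fingertips (assumption)
const int NUM_CLASSES  = 26;  // one class per ASL letter (assumption)

// One labeled entry recorded in learning mode.
struct GestureSample {
  float features[NUM_FEATURES];
  uint8_t label;  // index of the letter this sample was recorded for
};
```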

Implemented Functions:
TCA9548 Functions:
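A minimal sketch of channel selection on the TCA9548, assuming the default I²C address 0x70 (A0..A2 tied low):

```cpp
#include <Wire.h>

const uint8_t TCA_ADDR = 0x70;  // default TCA9548 address; an assumption

// Enable exactly one of the multiplexer's 8 downstream I2C channels.
void tcaSelect(uint8_t channel) {
  if (channel > 7) return;
  Wire.beginTransmission(TCA_ADDR);
  Wire.write(1 << channel);  // one-hot mask: bit n enables channel n
  Wire.endTransmission();
}
```

Calling tcaSelect(n) before talking to a fingertip sensor routes all subsequent I²C traffic to that sensor, which is what lets six identical IMUs share one bus.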


MPU6500 Functions:
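A minimal sketch of waking the sensor and reading its raw accelerometer axes over I²C (default address 0x68 with AD0 low; register addresses are from the MPU6500 register map):

```cpp
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;  // default MPU6500 address; an assumption

// Clear the sleep bit so the sensor starts sampling.
void mpuWake() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);  // PWR_MGMT_1
  Wire.write(0x00);
  Wire.endTransmission();
}

// Read the three raw 16-bit accelerometer axes.
void mpuReadAccel(int16_t &ax, int16_t &ay, int16_t &az) {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);             // ACCEL_XOUT_H, first of 6 consecutive data registers
  Wire.endTransmission(false);  // repeated start, keep the bus
  Wire.requestFrom(MPU_ADDR, (uint8_t)6);
  uint8_t raw[6];
  for (int i = 0; i < 6; i++) raw[i] = Wire.read();
  ax = (int16_t)((raw[0] << 8) | raw[1]);
  ay = (int16_t)((raw[2] << 8) | raw[3]);
  az = (int16_t)((raw[4] << 8) | raw[5]);
}
```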


LCD1602 Functions:
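A minimal sketch of driving the display, assuming the common I²C backpack and the LiquidCrystal_I2C library (address 0x27 is the usual default, but an assumption here):

```cpp
#include <LiquidCrystal_I2C.h>

LiquidCrystal_I2C lcd(0x27, 16, 2);  // backpack address, 16 columns, 2 rows

void lcdSetup() {
  lcd.init();
  lcd.backlight();
}

// Show the most recently recognized letter.
void lcdShowLetter(char letter) {
  lcd.clear();
  lcd.setCursor(0, 0);
  lcd.print("Detected:");
  lcd.setCursor(0, 1);
  lcd.print(letter);
}
```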


Interpreter Functions:
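A minimal inference sketch for the multinomial regression: every class gets a linear score, and because exp() is monotonic the softmax can be skipped when only the argmax is needed. The parameter layout and counts are assumptions:

```cpp
#include <math.h>

const int NUM_FEATURES = 10;  // assumption, as above
const int NUM_CLASSES  = 26;  // assumption, as above

// Learned parameters, produced by the learning mode.
float weights[NUM_CLASSES][NUM_FEATURES];
float biases[NUM_CLASSES];

// Return the index of the most likely gesture class.
int classify(const float features[NUM_FEATURES]) {
  int best = 0;
  float bestScore = -INFINITY;
  for (int k = 0; k < NUM_CLASSES; k++) {
    float score = biases[k];
    for (int i = 0; i < NUM_FEATURES; i++) {
      score += weights[k][i] * features[i];
    }
    if (score > bestScore) {
      bestScore = score;
      best = k;
    }
  }
  return best;  // e.g. 0 -> 'A', 1 -> 'B' under an assumed mapping
}
```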


Main Functions:
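A sketch of how setup() and loop() might tie the pieces together; the structure and the sampling rate are assumptions, and the commented steps stand in for the helpers sketched above:

```cpp
#include <Wire.h>

// The two running modes described in the general description.
enum Mode { LEARN, INTERPRET };
Mode mode = INTERPRET;

void setup() {
  Serial.begin(115200);
  Wire.begin();  // ESP32 default SDA/SCL pins; an assumption
  // ...wake each MPU6500 through the multiplexer, initialize the LCD...
}

void loop() {
  // 1. read the wrist reference IMU, then each fingertip IMU via the mux
  // 2. build the relative-orientation feature vector
  // 3. LEARN: record the labeled sample;
  //    INTERPRET: classify it, light the LED and print the letter
  delay(100);  // sample at roughly 10 Hz; an assumption
}
```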

Results

The overall result is a system that can be configured for either gesture interpretation or gesture learning. Functionally, the ESP32 could hold even larger or more complex weight matrices and keep using multinomial regression with more classes, learning new gestures that don't strictly have to be mapped onto a character. Since the ESP32 also has Bluetooth/Wi-Fi functionality, it is entirely plausible to modify a system like this to issue commands to smart devices. The choice to interpret sign language is arbitrary. For my current training set (30 entries per gesture), inference analysis yields great results:

Conclusions and Lessons

Code Repository

Available here!