Project Summary
499 Design Project - University of Victoria

Communication barriers between deaf or hard-of-hearing individuals and the hearing exist in countless situations. Solutions such as hearing aids and sign language are increasingly prevalent in today's world. The popularity of American Sign Language in particular rivals that of many spoken languages in North America; however, as with spoken languages, all parties need a mutual understanding of the language in order to communicate fluently. Our 499 Design Project creates a solution for overcoming this barrier.

Our project uses the Microsoft Kinect to translate American Sign Language (ASL) into text on a personal computer. We accomplish this in three stages: data acquisition, data analysis, and display of the results in our graphical user interface on the computer screen.

The first stage, data acquisition, uses the Kinect's infrared sensor to triangulate positional data for an entire 3-D environment. Using the OpenNI libraries, we track a user's hand movements in real time. We record movements in data sets of 100 frames, each containing the 3-D coordinates of the left and right hands at an instant in time.
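As an illustration, a capture loop built on the OpenNI 1.x C++ wrapper might look like the sketch below. This is not our project source: the HandFrame structure, the single-hand tracking, and the fixed 100-frame buffer are simplifying assumptions.

```cpp
// Minimal OpenNI 1.x hand-tracking sketch: record a 100-frame data set of
// 3-D hand positions. Illustrative only; error checking is omitted.
#include <XnCppWrapper.h>
#include <vector>

struct HandFrame {
    XnPoint3D position;  // hand coordinates (in mm) at one instant in time
};

static std::vector<HandFrame> g_frames;  // one captured data set

// Fired when tracking of a hand begins: start a fresh data set.
void XN_CALLBACK_TYPE OnHandCreate(xn::HandsGenerator&, XnUserID,
                                   const XnPoint3D*, XnFloat, void*) {
    g_frames.clear();
}

// Fired once per frame while the hand is tracked: append one sample.
void XN_CALLBACK_TYPE OnHandUpdate(xn::HandsGenerator&, XnUserID,
                                   const XnPoint3D* pos, XnFloat, void*) {
    if (g_frames.size() < 100) {
        HandFrame f;
        f.position = *pos;
        g_frames.push_back(f);
    }
}

// Fired when the hand is lost.
void XN_CALLBACK_TYPE OnHandDestroy(xn::HandsGenerator&, XnUserID,
                                    XnFloat, void*) {}

int main() {
    xn::Context context;
    context.Init();

    xn::HandsGenerator hands;
    hands.Create(context);

    XnCallbackHandle handle;
    hands.RegisterHandCallbacks(OnHandCreate, OnHandUpdate, OnHandDestroy,
                                NULL, handle);
    context.StartGeneratingAll();

    // In practice tracking begins after a focus gesture (via
    // xn::GestureGenerator) or an explicit hands.StartTracking() call.
    while (g_frames.size() < 100)
        context.WaitAndUpdateAll();

    context.Shutdown();
    return 0;
}
```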

Once a data set is captured, we analyze it in three stages. The first stage computes the prominent axes of the motion in 3-D, calculates the range of motion over the capture interval, and compares the result against our prerecorded library. Second, we perform a point-to-point comparison against each entry in the library, producing a percentage error for each point. Third, we apply a 3-D pattern-matching optimization algorithm to compare our data set against the library. We weight the three methods independently, and the combined error determines whether the user has produced a legitimate sign.
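To make the weighting concrete, the sketch below shows one way the point-to-point comparison and the final weighted decision could be implemented. The weights and acceptance threshold here are placeholders, not the values our analysis actually uses.

```cpp
// Illustrative sketch of the point-to-point comparison and weighted error
// combination described above (names, weights, and threshold are assumed).
#include <cmath>
#include <cstddef>
#include <vector>

struct Point3D { double x, y, z; };

// Mean per-point distance between a captured data set and one library entry,
// expressed as a fraction of the library gesture's spatial extent.
double pointToPointError(const std::vector<Point3D>& sample,
                         const std::vector<Point3D>& library,
                         double gestureExtent) {
    std::size_t n = sample.size() < library.size() ? sample.size()
                                                   : library.size();
    double total = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        double dx = sample[i].x - library[i].x;
        double dy = sample[i].y - library[i].y;
        double dz = sample[i].z - library[i].z;
        total += std::sqrt(dx * dx + dy * dy + dz * dz);
    }
    return (total / n) / gestureExtent;  // percentage error as a fraction
}

// Combine the three analysis stages with independent weights; a sign is
// accepted when the combined error falls below a tuned threshold.
bool isLegitimateSign(double axisError, double pointError,
                      double patternError) {
    const double wAxis = 0.3, wPoint = 0.4, wPattern = 0.3;  // assumed weights
    double combined = wAxis * axisError + wPoint * pointError
                    + wPattern * patternError;
    return combined < 0.25;  // assumed acceptance threshold
}
```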

Upon recognizing a sign, we render the corresponding word as text in an OpenGL environment on the computer screen. Our program also provides tracking and video feedback in separate windows.
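A minimal example of this kind of text output, using GLUT bitmap fonts on top of OpenGL, is sketched below; our actual rendering code may differ.

```cpp
// Minimal sketch: display a recognized word as text in an OpenGL window
// using GLUT bitmap fonts. Illustrative only.
#include <GL/glut.h>
#include <string>

static std::string g_word = "HELLO";  // word produced by the recognizer

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1.0f, 1.0f, 1.0f);
    glRasterPos2f(-0.9f, 0.0f);  // position text in normalized coordinates
    for (std::string::size_type i = 0; i < g_word.size(); ++i)
        glutBitmapCharacter(GLUT_BITMAP_HELVETICA_18, g_word[i]);
    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(640, 480);
    glutCreateWindow("ASL Translator Output");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```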

Details of our design, implementation, and source code can be found below in Progress Reports #1 and #2 and in our Final Report.


Progress Report #1

Progress Report #1 illustrates our proposed design of the ASL Translator using 3-D video processing.

Download PDF

Progress Report #2

Progress Report #2 illustrates our progress four weeks into the project. We discuss various problems we encountered and the solutions we intended to implement.

Download PDF

Final Report

Our Final Report outlines our complete project design, including our initial designs and details of the procedures we used.

Download PDF


Image Gallery

Image captions: Long Hours, Testing Kinect, Hardware Breakdown, Testing Setup, 3-D Plot, Call The Ambulance!

The OpenNI organization is an industry-led, not-for-profit organization formed to certify and promote the compatibility and interoperability of Natural Interaction (NI) devices, applications and middleware.

Learn more at OpenNI.org...

PrimeSense's NITE Middleware focuses on enabling Natural Interaction™ in the living room using two key applications: Control by Gesture and Games for All.

Learn more at Primesense.com...