People frequently use gestures to communicate, for everything from pointing at a person to get their attention to conveying substantive information. Evidence indicates that gesturing does not simply embellish spoken language, but is part of the language generation process.
A gesture is defined as a movement of the body, or part of the body, to express or emphasize ideas, emotions, intentions, etc. Gesture recognition is the act of interpreting those motions to determine the underlying intent. Cybernet has developed software for automatically analyzing motion to determine intent and recognize gestures. The recognized gestures can then be used to command or control other software or hardware. Cybernet has also developed hardware and software for capturing the movement of objects (including humans) through optical methods. GestureStorm leverages both sets of technology.
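The pipeline described above — capture motion, interpret it as an intentional gesture, then map the gesture to a command — can be illustrated with a minimal sketch. The function names, gesture labels, and threshold below are illustrative assumptions, not Cybernet's actual API; a deployed recognizer would use far richer trajectory models.

```python
# Minimal sketch: classify a captured 2D motion trajectory as a
# gesture, then map the gesture to a command. All names and the
# threshold are hypothetical, for illustration only.

def classify_gesture(trajectory, threshold=50.0):
    """Classify a trajectory [(x, y), ...] by its net displacement."""
    if len(trajectory) < 2:
        return "none"
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return "none"  # movement too small to count as intentional
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

# The recognized gesture is then translated into a control command.
COMMANDS = {"swipe_left": "previous_page", "swipe_right": "next_page"}

if __name__ == "__main__":
    path = [(0, 0), (30, 2), (80, 5), (120, 4)]
    gesture = classify_gesture(path)
    print(gesture, COMMANDS.get(gesture))  # → swipe_right next_page
```

The key design point is the intent threshold: incidental motion must be filtered out so that only deliberate movements generate commands.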
Body tracking coupled with gesture recognition provides a highly innovative and powerful interface for controlling devices. It allows users to interact with systems in a non-contact and intuitive manner. Cybernet has previously designed and built gesture recognition applications for NASA, the U.S. Army, the U.S. Air Force, DARPA, and commercial entities.
For NASA, Cybernet created a system that can recognize two-dimensional gestures and generate commands based on the recognized gesture. The commands were then used to control a multi-media kiosk. This system was showcased at NASA’s exhibit at the 2000 World Stamp Exposition.
For the U.S. Army, Cybernet created a sophisticated gesture recognition system that is capable of tracking untagged human features (such as hands) and detecting intentional gestures (including 3D gestures). This system is capable of recognizing the gestures used frequently by Army soldiers (as defined in Army Field Manual 21-60: Visual Signals). It has been integrated with a commercial training simulator used by the Army at RDECOM in Orlando, Florida, to enable soldiers to perform realistic gestures in an immersive environment.
For the U.S. Air Force, Cybernet created a system for tracking untagged human features under controlled conditions. The system is capable of analyzing the motion of these features to determine intentional gestures. The system was designed to control a data wall of information. Furthermore, we have worked with the Air Force to show that our gesture recognition system can be extended to perform a more sophisticated analysis called behavior recognition. To recognize behaviors, we analyze the gestures created by multiple features on a person to determine the overall activity. For example, walking will produce certain gestures for the feet, knees, and other features; crawling will produce a different set.
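The behavior-recognition idea above — combining the gestures observed on multiple body features into an overall activity — can be sketched as template matching. The feature names, gesture labels, and templates below are hypothetical stand-ins, not the actual system's vocabulary.

```python
# Hypothetical sketch of behavior recognition: each behavior is a
# template of expected per-feature gestures; the observed gestures
# are matched against every template. Labels are illustrative.

BEHAVIOR_TEMPLATES = {
    "walking":  {"feet": "alternating_step", "knees": "cyclic_bend",
                 "torso": "upright"},
    "crawling": {"feet": "drag", "knees": "ground_contact",
                 "torso": "prone"},
}

def recognize_behavior(feature_gestures):
    """Return the behavior whose template best matches the gestures
    currently detected on each tracked feature, or None."""
    best, best_score = None, 0
    for behavior, template in BEHAVIOR_TEMPLATES.items():
        score = sum(1 for feature, gesture in template.items()
                    if feature_gestures.get(feature) == gesture)
        if score > best_score:
            best, best_score = behavior, score
    return best

if __name__ == "__main__":
    observed = {"feet": "alternating_step", "knees": "cyclic_bend",
                "torso": "upright"}
    print(recognize_behavior(observed))  # → walking
```

This captures the essential structure: the per-feature gesture recognizers run first, and the behavior layer reasons over their combined output rather than over raw motion.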
Cybernet recently installed a gesture recognition system in a research and development laboratory for a futuristic mission command and control aircraft. The commander uses this system to control the information displayed on two situational understanding viewports, wearing a tagged glove while performing identifiable gestures.
In November 2000, Cybernet launched UseYourHead™, the gaming industry's first software product to add a new dimension to PC game play by allowing players to use head movements as input commands. Based on Cybernet's patented gesture recognition and tracking technology, UseYourHead works in conjunction with a USB PC camera to detect players' head movements and translate them into game commands, augmenting mouse clicks and keystrokes.
In 2001, Cybernet created NaviGaze, which tracks a person's head and detects eye blinks. The head movements are used to control the mouse cursor, and the eye blinks are used to produce mouse clicks. The system was designed for use by seriously disabled individuals with little or no motor control of their hands.
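The control mapping described above can be sketched as follows. The class, the gain value, and the blink-to-click convention are illustrative assumptions about how such a mapping might work, not NaviGaze's actual implementation.

```python
# Hypothetical sketch of a NaviGaze-style mapping: head motion
# drives the cursor, detected blinks produce clicks. The tracker
# interface and gain are assumptions for illustration.

class CursorController:
    def __init__(self, gain=2.0, screen=(1920, 1080)):
        self.x, self.y = screen[0] // 2, screen[1] // 2  # start centered
        self.gain = gain
        self.screen = screen
        self.clicks = []

    def on_head_move(self, dx, dy):
        """Scale head displacement into cursor motion, clamped to screen."""
        self.x = min(max(self.x + int(dx * self.gain), 0), self.screen[0] - 1)
        self.y = min(max(self.y + int(dy * self.gain), 0), self.screen[1] - 1)

    def on_blink(self, eye):
        """Map a blink to a click at the current cursor position
        (assumed convention: left eye = left click)."""
        button = "left" if eye == "left" else "right"
        self.clicks.append((button, self.x, self.y))

if __name__ == "__main__":
    ctrl = CursorController()
    ctrl.on_head_move(10, -5)  # head moves right and up
    ctrl.on_blink("left")      # blink clicks at the cursor position
    print(ctrl.x, ctrl.y, ctrl.clicks)  # → 980 530 [('left', 980, 530)]
```

Separating the tracking events (head motion, blinks) from the pointer state keeps the mapping hardware-independent, which matters for an assistive interface that must work across machines.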