A gesture recognition interface for use in controlling self-service machines
and
other devices is disclosed. A gesture is defined as a motion or kinematic pose
generated by a human, animal, or machine. Specific body features are tracked,
and static and motion gestures are interpreted. Motion gestures are defined as
a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters
dynamic system with added geometric constraints to allow for real-time recognition
using a small amount of memory and processing time. A linear least squares method
is preferably used to determine the parameters that represent each gesture. Feature
position measurements are used in conjunction with a bank of predictor bins seeded with
the gesture parameters, and the system determines which bin best fits the observed
motion. Recognizing static pose gestures is preferably performed by localizing
the body/object from the rest of the image, describing that object, and identifying
that description. The disclosure details methods for gesture recognition, as well
as an overall architecture for using gesture recognition to control devices,
including self-service machines.
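The core recognition scheme described above can be sketched in code. The following is a minimal, illustrative sketch only: the specific oscillator model (acceleration as a linear combination of position and velocity), the function names, and the numeric parameter values are all assumptions for demonstration, not details taken from the disclosure. It shows a linear-in-parameters motion model, a linear least-squares fit of its parameters, and a bank of parameter "bins" from which the best-fitting one is selected.

```python
import numpy as np

# Hypothetical linear-in-parameters gesture model (an assumption, not the
# disclosure's exact model): acceleration is linear in position and velocity,
#   x'' = theta[0] * x + theta[1] * x'

def simulate(theta, x0=1.0, v0=0.0, dt=0.01, steps=500):
    """Generate (x, x', x'') samples of the oscillator with simple Euler steps."""
    samples, x, v = [], x0, v0
    for _ in range(steps):
        a = theta[0] * x + theta[1] * v   # acceleration from current state
        samples.append((x, v, a))
        x, v = x + dt * v, v + dt * a      # Euler integration step
    return np.array(samples)

def fit_parameters(traj):
    """Least-squares estimate of (theta1, theta2) from observed samples."""
    A = traj[:, :2]   # regressors: position and velocity
    b = traj[:, 2]    # observed acceleration
    theta, *_ = np.linalg.lstsq(A, b, rcond=None)
    return theta

def best_bin(traj, bins):
    """Pick the predictor bin whose seeded parameters best explain the motion."""
    residuals = [np.sum((traj[:, :2] @ th - traj[:, 2]) ** 2) for th in bins]
    return int(np.argmin(residuals))

# Two gesture bins seeded with assumed parameters: a fast and a slow,
# lightly damped oscillation.
bins = [np.array([-25.0, -0.1]), np.array([-4.0, -0.1])]
observed = simulate(bins[0])        # motion generated by the "fast" gesture
print(best_bin(observed, bins))     # → 0 (fast-oscillation bin fits best)
```

The fit runs in closed form over a fixed-size regressor matrix, which is consistent with the abstract's claim of real-time recognition using a small amount of memory and processing time.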