With the Facial Control for Electric Guitar project, David Merrill mapped the output of a real-time face-tracker onto the parameters of an audio effects processor.
A modified real-time head-tracker communicates over a TCP/IP socket with a custom server program. The server maps sensed gestures onto control messages, which it sends to a guitar effects processor via MIDI.
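The gesture-to-MIDI mapping step can be sketched as follows. This is an illustrative reconstruction, not Merrill's actual code: the gesture names, the CC numbers in CC_MAP, and the line-based "name value" wire format are all assumptions; only the MIDI Control Change byte layout (status 0xB0 | channel, controller number, 7-bit value) is standard.

```python
# Hypothetical mapping from tracked gesture names to MIDI CC numbers;
# the real server's parameter assignments are not documented here.
CC_MAP = {"mouth_open": 74, "head_tilt": 71, "brow_raise": 1}

def gesture_to_cc(value, lo=0.0, hi=1.0, cc_number=74, channel=0):
    """Scale a gesture value in [lo, hi] to a 7-bit MIDI Control Change."""
    clamped = max(lo, min(hi, value))
    cc_value = int(round((clamped - lo) / (hi - lo) * 127))
    status = 0xB0 | (channel & 0x0F)  # Control Change status byte
    return bytes([status, cc_number, cc_value])

def handle_message(line):
    """Parse one assumed 'name value' line from the tracker socket
    and return the MIDI bytes to forward to the effects processor."""
    name, raw = line.split()
    return gesture_to_cc(float(raw), cc_number=CC_MAP[name])
```

In a full server, `handle_message` would be called for each line read from the tracker's TCP connection, and the returned bytes written to a MIDI output port.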
Pupil positions and sizes are tracked using difference images, and the eyes and eyebrows are tracked using templates. Detection of head nods, head shakes, and eye blinks is also implemented.
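The difference-image idea behind the eye tracking can be illustrated with a minimal sketch: threshold the frame-to-frame pixel difference in an eye region, and treat a brief burst of changed pixels as a blink. The threshold values and the spike criterion here are assumptions for illustration, not the project's actual parameters.

```python
import numpy as np

def difference_image(prev_frame, frame, threshold=30):
    """Binary mask of pixels whose grayscale intensity changed
    by more than `threshold` between consecutive frames."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def motion_fraction(prev_frame, frame, threshold=30):
    """Fraction of pixels in the region that changed between frames."""
    return float(difference_image(prev_frame, frame, threshold).mean())

def blink_frames(fractions, spike=0.2):
    """A blink appears as a short burst of motion in the eye region:
    flag frames whose changed-pixel fraction exceeds `spike`."""
    return [f > spike for f in fractions]
```

The same per-frame motion signal, tracked over a window of head positions instead of an eye region, is one plausible basis for nod and shake detection.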