We present a camera pointing system driven by real-time estimates of sound source locations from a microphone array. Traditional audio localization techniques require explicit estimates of the spatial coordinates of each microphone in the array, and driving a camera pointing system with such techniques additionally requires the camera's position. This positioning can sometimes be done by hand, but for large-aperture microphone arrays with many elements it is impractical. We show that in this setting, where elements are placed in an ad-hoc manner, explicitly learning the microphone positions is an unnecessary step. We give a calibration method that directly learns the mapping from time delays between pairs of microphones to the pan and tilt commands that point a PTZ camera at the source, eliminating the need to explicitly learn the microphone and camera positions. We use this method to calibrate a real-time camera pointing system used by the UCSD interactive display.
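The core idea can be illustrated with a small sketch. The abstract does not specify the form of the learned mapping, so the k-nearest-neighbor interpolation below, the array geometry, and all names are illustrative assumptions: calibration samples pair observed time-delay vectors with known pan/tilt targets, and new time-delay vectors are mapped to pan/tilt by interpolating among the closest calibration samples, with no microphone or camera coordinates ever estimated.

```python
import numpy as np

# Hypothetical sketch of delay-to-pan/tilt calibration. The k-NN
# interpolation scheme is an assumption for illustration, not the
# paper's actual method.

rng = np.random.default_rng(0)
SPEED_OF_SOUND = 343.0  # m/s

# Ad-hoc microphone placement (unknown to the calibration procedure).
mics = rng.uniform(-3.0, 3.0, size=(8, 3))
cam = np.zeros(3)  # camera at the origin, used only for ground truth


def tdoa_vector(src):
    """Time delays of each mic relative to mic 0 for a source at `src`."""
    d = np.linalg.norm(mics - src, axis=1)
    return (d - d[0]) / SPEED_OF_SOUND


def pan_tilt(src):
    """Ground-truth pan/tilt (radians) the camera needs to face `src`."""
    v = src - cam
    pan = np.arctan2(v[1], v[0])
    tilt = np.arctan2(v[2], np.hypot(v[0], v[1]))
    return np.array([pan, tilt])


# Calibration phase: sound a source at known pan/tilt targets and
# record the corresponding time-delay vectors.
calib_srcs = rng.uniform(-5.0, 5.0, size=(2000, 3))
calib_tdoas = np.array([tdoa_vector(s) for s in calib_srcs])
calib_angles = np.array([pan_tilt(s) for s in calib_srcs])


def predict_pan_tilt(tdoa, k=5):
    """Interpolate pan/tilt from the k nearest calibration delay vectors."""
    dist = np.linalg.norm(calib_tdoas - tdoa, axis=1)
    idx = np.argsort(dist)[:k]
    w = 1.0 / (dist[idx] + 1e-9)  # inverse-distance weights
    return (w[:, None] * calib_angles[idx]).sum(axis=0) / w.sum()


# Point the camera at a new source using only its time-delay vector.
test_src = np.array([2.0, 1.0, 0.5])
pred = predict_pan_tilt(tdoa_vector(test_src))
true = pan_tilt(test_src)
```

Note that the calibration never inverts the array geometry: microphone and camera positions appear only in the synthetic ground truth, while the learned mapping sees delay vectors and angles alone.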