Robots that interact with humans in everyday situations need to be able to interpret the nonverbal social cues of their human interaction partners. We show that humans use body posture and head pose as social signals to initiate and terminate interactions when ordering drinks at a bar. To this end, we recorded and analyzed 108 interactions between human customers and a human bartender. Based on these findings, we trained a Hidden Markov Model (HMM) on automatically estimated body posture and head pose. With this model, the bartender robot of the JAMES project can recognize typical social behaviors of human customers. The evaluation shows a recognition rate of 82.9% across all implemented social behaviors, and in particular a recognition rate of 91.2% for bartender attention requests, which will allow the robot to interact with multiple humans in a robust and socially appropriate way.
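To illustrate the kind of HMM-based recognition the abstract describes, the sketch below scores a discretized posture/head-pose sequence against competing behavior models and picks the most likely one. All model parameters, the two-symbol observation alphabet, and the behavior labels here are illustrative assumptions, not the parameters learned in the paper; the paper's actual models are trained on the recorded bar interactions.

```python
def sequence_likelihood(obs, start, trans, emit):
    """Forward algorithm: P(obs | HMM) for a discrete-observation HMM.

    start[s]    -- initial probability of hidden state s
    trans[s][t] -- transition probability from state s to state t
    emit[s][o]  -- probability that state s emits observation symbol o
    """
    # Initialize with the first observation.
    alpha = [start[s] * emit[s][obs[0]] for s in range(len(start))]
    # Propagate forward through the rest of the sequence.
    for o in obs[1:]:
        alpha = [
            sum(alpha[sp] * trans[sp][s] for sp in range(len(alpha))) * emit[s][o]
            for s in range(len(alpha))
        ]
    return sum(alpha)


# Illustrative two-state models with a two-symbol observation alphabet:
# 0 = head toward the bartender / body close to the bar, 1 = turned away.
# These numbers are made up for the sketch, not taken from the paper.
models = {
    "attention_request": dict(
        start=[0.8, 0.2],
        trans=[[0.9, 0.1], [0.3, 0.7]],
        emit=[[0.9, 0.1], [0.4, 0.6]],
    ),
    "not_seeking": dict(
        start=[0.2, 0.8],
        trans=[[0.7, 0.3], [0.1, 0.9]],
        emit=[[0.5, 0.5], [0.1, 0.9]],
    ),
}

obs = [0, 0, 1, 0, 0]  # customer mostly facing the bartender
scores = {name: sequence_likelihood(obs, **m) for name, m in models.items()}
best = max(scores, key=scores.get)
print(best)  # → attention_request
```

In a classifier of this style, one HMM is trained per social behavior and an incoming observation sequence is assigned to the model with the highest likelihood; the recognition rates reported above would then be measured over held-out labeled sequences.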