We offer a novel representation scheme for view-based motion analysis that uses only the change in the relational statistics among detected image features, without the need for object models, perfect segmentation, or part-level tracking. We model the relational statistics using the probability that a random group of features in an image would exhibit a particular relation. To reduce the representational combinatorics of these relational distributions, we represent them in a Space of Probability Functions (SoPF), in which Euclidean distance is related to the Bhattacharyya distance between probability functions. Different motion types sweep out different traces in this space. We demonstrate and evaluate the effectiveness of this representation in the context of recognizing persons from gait. In particular, on outdoor sequences: (1) we demonstrate that persons can be recognized not only from walking gait, but from running and jogging gaits as well; (2) we study the robustness of recognition with respect to viewpoint variation; and (3) we benchmark recognition performance on a database of 71 subjects walking on a soft grass surface, where we achieve around 90 percent recognition rates in the presence of viewpoint variation.
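The link between Euclidean and Bhattacharyya distance mentioned above can be sketched numerically: if a discrete probability function p is embedded as the point sqrt(p), then the squared Euclidean distance between two embedded points equals 2 - 2*BC(p, q), where BC is the Bhattacharyya coefficient. The snippet below is a minimal illustration of this identity, not the paper's SoPF construction; the function names and the toy 8-bin distributions are illustrative assumptions.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    # Bhattacharyya coefficient BC(p, q) = sum_i sqrt(p_i * q_i)
    return np.sum(np.sqrt(p * q))

def sqrt_embed(p):
    # Embed a probability function as a point on the unit sphere
    return np.sqrt(p)

# Two illustrative (hypothetical) discrete relational distributions
rng = np.random.default_rng(0)
p = rng.random(8); p /= p.sum()
q = rng.random(8); q /= q.sum()

# Squared Euclidean distance between embedded points:
# ||sqrt(p) - sqrt(q)||^2 = sum(p) + sum(q) - 2*BC = 2 - 2*BC(p, q)
d2 = np.sum((sqrt_embed(p) - sqrt_embed(q)) ** 2)
print(np.isclose(d2, 2.0 - 2.0 * bhattacharyya_coefficient(p, q)))  # True
```

Because of this identity, comparing motion traces with ordinary Euclidean geometry in such an embedded space is equivalent, up to a monotone transform, to comparing the underlying probability functions with a Bhattacharyya-type measure.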