Manifold learning is an effective technique for nonlinear dimensionality reduction in machine learning. In this paper, we present a manifold-based framework for human activity recognition using wearable motion sensors. Our framework uses locally linear embedding (LLE) to capture the intrinsic structure of the sensor data and to build a nonlinear manifold for each activity. A nearest-neighbor interpolation technique is then applied to learn the mapping function from the input space to the manifold space. Finally, activity recognition is performed by comparing trajectories across the activity manifolds in the manifold space. Experimental results validate the effectiveness of our framework and demonstrate that manifold learning is promising for human activity recognition with wearable motion sensors.
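The pipeline sketched above — LLE embedding followed by nearest-neighbor interpolation into the manifold space — can be illustrated with scikit-learn. This is a minimal sketch, not the paper's implementation: the synthetic sensor features, the dimensions, and the neighbor-averaging out-of-sample mapping are all illustrative assumptions.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
# Hypothetical stand-in for windowed wearable-sensor features:
# 200 samples, each a 30-dimensional feature vector.
X = rng.normal(size=(200, 30))

# Step 1: LLE embeds the high-dimensional samples into a low-dimensional
# manifold space (3-D here), preserving local linear structure.
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=3)
Y = lle.fit_transform(X)

# Step 2: nearest-neighbor interpolation maps a new sample into the
# manifold space by averaging the embeddings of its nearest training
# samples (one common out-of-sample heuristic).
nn = NearestNeighbors(n_neighbors=5).fit(X)
x_new = rng.normal(size=(1, 30))
_, idx = nn.kneighbors(x_new)
y_new = Y[idx[0]].mean(axis=0)

print(Y.shape, y_new.shape)  # (200, 3) (3,)
```

In a full system, step 2 would be applied to each frame of a test sequence to produce a trajectory in the manifold space, which is then compared against the per-activity manifolds for recognition.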