Unstructured human environments pose a substantial challenge to effective robotic operation. Mobile manipulation in such environments requires dealing with novel, unknown objects, cluttered workspaces, and noisy sensor data. We present an approach to mobile pick-and-place that combines two-dimensional (2-D) and three-dimensional (3-D) visual processing, tactile and proprioceptive sensing, fast motion planning, reactive control and monitoring, and reactive grasping. We demonstrate our approach by using a two-arm mobile manipulation system to pick and place objects. Reactive components allow the system to account for uncertainty arising from noisy sensors, inaccurate perception (e.g., object detection or registration), or dynamic changes in the environment. We also present a set of tools that allows our approach to be quickly configured for a new robotic system.