Current applications of mobile robots in urban search and rescue (USAR) environments require a human operator in the loop to guide the robot remotely. Although human operation can be effective, the unknown, cluttered nature of these environments makes robot navigation and victim identification highly challenging. Operators can become stressed and fatigued very quickly due to a loss of situational awareness, leading to robots getting stuck and victims going unfound during this time-sensitive operation. In addition, current autonomous robots are not capable of traversing these complex, unpredictable environments on their own. To address this challenge, a balance must be struck between the robot's level of autonomy and the amount of human control over the robot. In this paper, we present a unique control architecture for semi-autonomous navigation of a robotic platform that utilizes sensory information provided by a novel real-time 3D mapping sensor. The control system provides the robot with the ability to learn and to decide which rescue tasks should be carried out at a given time, and whether an autonomous or a human-controlled robot can perform these tasks more efficiently, without compromising the safety of the victims, the rescue workers, and the rescue robot itself. Preliminary experiments were conducted to evaluate the performance of the proposed collaborative control approach for a USAR robot in an unknown cluttered environment.