A dynamic quasi-Newton method for uncalibrated, vision-guided robotic tracking control with fixed imaging is developed and demonstrated. The method requires neither a calibrated kinematic model nor a calibrated camera model. Control is achieved by minimizing a nonlinear objective function at each step: the controller takes quasi-Newton steps using an estimate of the composite robot-camera Jacobian, which is updated online by a dynamic recursive least-squares algorithm. Experimental results demonstrate the validity of this approach.
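To illustrate the idea behind the abstract, the following is a minimal toy sketch (not the paper's implementation) of uncalibrated quasi-Newton visual servoing: the controller never sees the `camera` model, only observed image features, and it estimates the composite Jacobian online with a recursive least-squares (Broyden-style) update with forgetting factor. The two-link `camera` map, the gains, the step clip, and the static target are all illustrative assumptions; the paper's dynamic formulation additionally handles a moving target.

```python
import numpy as np

def camera(q):
    # Stand-in for the unknown robot + fixed-camera mapping from joint
    # angles to image-plane feature coordinates (a planar two-link arm).
    # The controller below never uses this model analytically.
    return np.array([np.cos(q[0]) + 0.5 * np.cos(q[0] + q[1]),
                     np.sin(q[0]) + 0.5 * np.sin(q[0] + q[1])])

def servo(target, q0, steps=100, lam=0.95, clip=0.2):
    """Uncalibrated quasi-Newton tracking toward a (here static) target."""
    q = q0.astype(float).copy()
    J = np.eye(2)            # initial guess for the composite Jacobian
    P = 1e3 * np.eye(2)      # RLS covariance (large = uninformative prior)
    y = camera(q)
    for _ in range(steps):
        err = y - target
        # Quasi-Newton step: reduce ||camera(q) - target|| using estimated J.
        dq = np.linalg.lstsq(J, -err, rcond=None)[0]
        dq = np.clip(dq, -clip, clip)        # damp the step for stability
        q = q + dq
        y_new = camera(q)
        dy = y_new - y
        # Recursive least-squares Jacobian update with forgetting factor lam:
        # fit dy ~= J dq, discounting old motion data exponentially.
        denom = lam + dq @ P @ dq
        K = (P @ dq) / denom
        J = J + np.outer(dy - J @ dq, K)
        P = (P - np.outer(K, dq @ P)) / lam
        y = y_new
    return q, y

# Usage: drive the features to those of an (unknown to the controller) pose.
target = camera(np.array([0.8, -0.5]))
q_final, y_final = servo(target, np.array([0.2, 0.3]))
```

The forgetting factor `lam` is what makes the estimator "dynamic": recent motion data dominate the Jacobian fit, so the estimate can track changes along the trajectory rather than averaging over all past data equally.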