Relative position estimation for AUVs by fusing bearing and inertial rate sensor measurements

Author(s):

Huster, A. (Aerosp. Robotics Lab., Stanford Univ., CA, USA); Frew, E.W.; Rock, S.M.

Abstract:

This paper describes a relative position sensing strategy that fuses monocular vision (a bearing measurement) with accelerometer and rate gyro measurements to generate an estimate of relative position between a free-floating underwater vehicle and a stationary object of interest. This type of position estimate is a core requirement for intervention-capable autonomous underwater vehicles. These vehicles can perform autonomous manipulation tasks, during which the vehicle needs to control its position relative to objects in its environment. For free-floating underwater vehicles, camera motion is generally unknown and must be estimated together with relative position. Various vision-only systems have been used to estimate relative position and camera motion, but these are difficult to implement in real underwater environments. The system we propose relies on vision to generate relative position information, but also fuses inertial rate sensors to reduce the amount of information that needs to be extracted from the vision system. The result is a system that is potentially simpler and more robust than a vision-only solution. However, the use of inertial rate sensors introduces several issues. The rate measurements are subject to biases, which need to be estimated to prevent the accumulation of unbounded drift when the measurements are integrated. This problem is non-linear, which presents several challenges in the estimator design. Finally, sufficient camera motion is required for the estimator to converge, which necessitates the design of a suitable trajectory. This paper discusses some of the implementation challenges, outlines an estimation algorithm that is uniquely adapted to this sensor fusion problem, develops a method to generate useful vehicle trajectories, and presents some results from laboratory experiments with a testbed manipulator system. For these experiments, the estimator was implemented as part of a closed-loop control system that can perform an object pick-up task.
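The abstract names the key ingredients (bearing-only vision, biased inertial measurements, a nonlinear estimator, and a deliberately exciting trajectory) without giving equations. As a rough illustration only, here is a minimal planar extended Kalman filter sketch of that kind of fusion. It is not the authors' algorithm: the paper treats the full 3-D problem with rate-gyro biases, while this sketch drops attitude entirely, assumes the accelerometer output is already resolved in a fixed frame, and estimates only an accelerometer bias. All symbols, noise levels, trajectory parameters, and the EKF structure itself are assumptions chosen for illustration.

```python
import numpy as np

dt, steps = 0.02, 1500
rng = np.random.default_rng(0)

# Truth: stationary target at the origin; the vehicle moves.
p_veh = np.array([-5.0, -3.0])       # vehicle position (m)
v_veh = np.zeros(2)                  # vehicle velocity (m/s)
b_true = np.array([0.05, -0.03])     # constant accelerometer bias (m/s^2)
sig_acc, sig_brg = 0.02, 0.01        # accel / bearing noise std devs

# EKF state x = [r, w, b]: r = p_target - p_vehicle (relative position),
# w = dr/dt = -v_vehicle, and b = accelerometer bias (estimated so that
# integrating the measured acceleration does not drift without bound).
x = np.array([3.0, 1.0, 0.0, 0.0, 0.0, 0.0])   # rough initial guess
P = np.diag([25.0, 25.0, 1.0, 1.0, 0.1, 0.1])
Q = np.diag([1e-6, 1e-6, (sig_acc * dt) ** 2, (sig_acc * dt) ** 2, 1e-8, 1e-8])
R = sig_brg ** 2

F = np.eye(6)
F[0:2, 2:4] = dt * np.eye(2)   # r <- r + dt * w
F[2:4, 4:6] = dt * np.eye(2)   # w <- w - dt * (a_meas - b)

for k in range(steps):
    t = k * dt
    # Sinusoidal acceleration: without this excitation, range is
    # unobservable from bearing measurements alone.
    a_true = np.array([0.5 * np.cos(0.8 * t), 0.4 * np.cos(0.6 * t)])
    p_veh = p_veh + dt * v_veh
    v_veh = v_veh + dt * a_true
    a_meas = a_true + b_true + sig_acc * rng.standard_normal(2)

    # Predict: integrate the bias-corrected acceleration.
    r, w, b = x[0:2], x[2:4], x[4:6]
    x = np.concatenate([r + dt * w, w - dt * (a_meas - b), b])
    P = F @ P @ F.T + Q

    # Update with the bearing (monocular-vision) measurement.
    z = np.arctan2(-p_veh[1], -p_veh[0]) + sig_brg * rng.standard_normal()
    rx, ry = x[0], x[1]
    rho2 = rx ** 2 + ry ** 2
    H = np.array([[-ry / rho2, rx / rho2, 0.0, 0.0, 0.0, 0.0]])
    innov = (z - np.arctan2(ry, rx) + np.pi) % (2 * np.pi) - np.pi
    S = (H @ P @ H.T).item() + R
    K = (P @ H.T) / S
    x = x + (K * innov).ravel()
    P = (np.eye(6) - K @ H) @ P

print("true relative position:", -p_veh)
print("estimated             :", x[0:2])
print("estimated accel bias  :", x[4:6])
```

Note how the sketch mirrors two points from the abstract: the bias states keep the integrated accelerations from accumulating unbounded drift, and the sinusoidal trajectory supplies the camera motion the estimator needs for range to become observable from bearings.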

Published in:

OCEANS '02 MTS/IEEE (Volume 3)

Date of Conference:

29-31 Oct. 2002