Robust Cooperative Exploration With a Switching Strategy

2 Author(s)
Wencen Wu; Fumin Zhang — School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA

Biological inspirations have led us to develop a switching strategy for a group of robotic sensing agents searching for a local minimum of an unknown noisy scalar field. Starting with individual exploration, the agents switch to cooperative exploration only when they are unable to converge to a local minimum at a satisfactory rate. We derive a cooperative H∞ filter that provides estimates of field values and field gradients during cooperative exploration, and we give sufficient conditions for the convergence and feasibility of the filter. The switch from individual to cooperative exploration results in faster convergence to a local minimum, which is rigorously justified by the Razumikhin theorem. We propose that the switch from cooperative back to individual exploration be triggered by a significantly improved signal-to-noise ratio (SNR) during cooperative exploration. In addition to theoretical and simulation studies, we develop a multiagent testbed and implement the switching strategy in a lab environment. We observe consistency between theoretical predictions and experimental results, which are robust to unknown noise and communication delays.
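The switching behavior described in the abstract can be illustrated with a toy simulation. The sketch below is not the authors' method: the field, thresholds, and the plain averaging of finite-difference gradients (standing in for the paper's cooperative H∞ filter) are all illustrative assumptions. Agents descend a noisy field individually, switch to cooperative (pooled) gradient estimation when the group's progress stalls, and switch back when the pooled gradient stands well clear of the noise level — a crude proxy for the improved-SNR trigger.

```python
import numpy as np

def noisy_field(p, rng, sigma):
    # Toy convex field with its minimum at the origin, plus Gaussian sensor noise.
    return float(p @ p) + sigma * rng.standard_normal()

def fd_gradient(p, rng, sigma, eps=0.5):
    # Central finite-difference gradient estimate from noisy field samples.
    g = np.zeros(2)
    for k, e in enumerate(np.eye(2)):
        g[k] = (noisy_field(p + eps * e, rng, sigma)
                - noisy_field(p - eps * e, rng, sigma)) / (2 * eps)
    return g

def switching_search(n_agents=4, steps=300, sigma=0.5, seed=1):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-4.0, 4.0, size=(n_agents, 2))
    mode, step = "individual", 0.05
    dist = []  # distance of the group center from the true minimum (origin)
    for t in range(steps):
        if mode == "individual":
            # Each agent descends its own noisy gradient estimate.
            for i in range(n_agents):
                pos[i] -= step * fd_gradient(pos[i], rng, sigma)
        else:
            # Cooperative: pool all agents' estimates, averaging the noise down
            # (a stand-in for the cooperative filter's fused estimate).
            g = np.mean([fd_gradient(pos[i], rng, sigma)
                         for i in range(n_agents)], axis=0)
            pos -= step * g
            if np.linalg.norm(g) > 3 * sigma:  # hypothetical SNR threshold
                mode = "individual"
        dist.append(np.linalg.norm(pos.mean(axis=0)))
        # Hypothetical stall test: little recent progress -> cooperate.
        if mode == "individual" and t >= 20 and dist[-20] - dist[-1] < 0.02:
            mode = "cooperative"
    return dist

trace = switching_search()
```

Running the sketch, the group center typically makes fast early progress in individual mode, then hands off to cooperative mode once sensor noise dominates the shrinking gradient near the minimum — mirroring the qualitative behavior the abstract describes.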

Published in:

IEEE Transactions on Robotics (Volume: 28, Issue: 4)