Smooth Human-Robot Shared Control for Autonomous Orchard Monitoring with UGVs
IEEE Journals & Magazine (IEEE Xplore)

Abstract:

Precision agriculture offers the opportunity to automate routine or difficult tasks in orchards and vineyards, such as spraying or inspection, with Unmanned Ground Vehicles (UGVs). In this context, human operators should be kept in the closed-loop control of the robot for safety and reliability. This work is motivated by the challenges of effectively deploying human-robot shared control in the field. First, an asymptotically stable controller must keep the robot on the desired trajectory between rows of trees, whose spacing is on the order of the robot's width. Second, the robot must efficiently avoid static and moving obstacles on its path. Third, the control inputs must not exceed the actuator limits, as saturation can degrade trajectory tracking performance, cause instability, or damage critical hardware. Finally, in real-life scenarios, user intervention is sometimes required to manage unpredictable situations. To overcome these challenges, we propose and deploy a shared controller that continuously and smoothly varies the ratio of human and automatic control inputs depending on the human's intent, geometrically rescales trajectory inputs to maintain bounded control, and incorporates obstacle avoidance capabilities, all while preserving asymptotic stability of the closed-loop system. Additionally, we introduce a time re-scaling strategy that modifies trajectory evolution, ensuring target positions remain within a defined vicinity of the robot. The system performance was assessed in simulation and in 26 field trials inside an apple orchard using different obstacle configurations, weather, and terrain conditions, with a success rate of 100% and an average tracking error of 0.1 m.
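As a rough illustration of two ideas summarized in the abstract, smooth blending of human and automatic inputs with geometric rescaling to actuator limits, and slowing the trajectory clock so the target stays near the robot, the following minimal Python sketch may help. It is not the paper's actual controller: all function names, the linear blending law, and the linear slowdown law are assumptions made for illustration only.

```python
import numpy as np

def blend_and_saturate(u_human, u_auto, alpha, u_max):
    """Blend human and automatic commands with weight alpha in [0, 1],
    then rescale the result so no component exceeds the actuator limit
    u_max. Dividing the whole vector by a single factor ("geometric"
    rescaling) keeps the commanded direction unchanged. Illustrative
    only; the blending law here is a simple convex combination."""
    u = alpha * np.asarray(u_human, float) + (1.0 - alpha) * np.asarray(u_auto, float)
    ratio = np.max(np.abs(u) / np.asarray(u_max, float))
    return u / ratio if ratio > 1.0 else u

def advance_trajectory_time(s, dt, error_norm, error_max):
    """Advance the trajectory clock s more slowly as the tracking error
    grows, so the reference target stays within a defined vicinity of
    the robot; at error_max the reference pauses entirely. The linear
    slowdown is an assumption, not the paper's time re-scaling law."""
    rate = max(0.0, 1.0 - error_norm / error_max)
    return s + rate * dt
```

For example, with full human authority (`alpha = 1.0`) and a command of `[2.0, 0.0]` against per-axis limits `[1.0, 1.0]`, the rescaling shrinks the whole vector to `[1.0, 0.0]` rather than clipping each axis independently, which would distort the steering direction for a differential-drive UGV.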
Page(s): 1 - 1
Date of Publication: 24 March 2025
