Abstract:
The control of high degree-of-freedom prosthetic hands requires high cognitive effort and becomes increasingly difficult as prosthetic hands grow more advanced. To reduce control complexity, vision-based shared control methods have been proposed. Such methods rely on image processing and/or machine learning to identify object features and classes, introducing a level of autonomy into the grasping process. However, currently available image datasets offer little coverage of prosthetic grasping. In this paper, we therefore present a new dataset capturing user interactions with a wide variety of everyday objects, collected using a fully actuated, human-like robot hand and an onboard camera. The dataset includes videos of over 100 grasps of 50 objects from 35 classes. The videos were analyzed using established grasp taxonomies to compile a list of grasp types based on the user's interactions with the objects. The resulting dataset can be used to develop more efficient prosthetic hand operation systems based on shared control frameworks.
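To make the vision-based shared control idea concrete, the sketch below shows one way such a loop might be structured: frames from an onboard camera are classified, the predicted object class is mapped to a grasp type from a taxonomy-style lookup table, and a grasp command is issued. This is a minimal illustration only; the classifier stub (classify_object), the grasp-type table, and the send_grasp_command interface are hypothetical placeholders and are not specified by the paper.

# Minimal sketch of a vision-based shared-control loop for prosthetic
# grasping. The classifier, grasp-type table, and hand interface are
# hypothetical placeholders; the paper does not specify these components.
import cv2  # OpenCV, used here only to read the onboard camera feed

# Hypothetical mapping from detected object class to a grasp type drawn
# from an established grasp taxonomy (assumed, for illustration).
GRASP_TABLE = {
    "mug": "medium wrap",
    "pen": "precision pinch",
    "plate": "lateral pinch",
}

def classify_object(frame):
    """Placeholder for an object classifier.

    A real system would run a trained model on the frame and return the
    visible object's class; this stub always reports 'mug'.
    """
    return "mug"

def send_grasp_command(grasp_type):
    """Placeholder for the interface to the robot hand controller."""
    print(f"Commanding grasp: {grasp_type}")

def shared_control_loop(camera_index=0):
    cap = cv2.VideoCapture(camera_index)  # onboard camera
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            obj_class = classify_object(frame)
            # Autonomy layer: the vision system proposes the grasp type,
            # reducing the selection burden on the user.
            grasp = GRASP_TABLE.get(obj_class, "power grasp")  # fallback
            send_grasp_command(grasp)
            cv2.imshow("onboard camera", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    shared_control_loop()

In a full shared control framework, the user would retain authority over grasp timing and could override the proposed grasp type, while the vision system handles grasp selection.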
Published in: 2024 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Date of Conference: 15-19 July 2024
Date Added to IEEE Xplore: 17 December 2024
PubMed ID: 40039393