Abstract:
Multimodal HMI interaction design grounded in emotion regulation has been shown to help drivers receive information, improve their driving state, and gain multidimensional emotional experiences. This study investigates multimodal emotion recognition and its application to intelligent cockpit HMI interaction design. Starting from the discrete emotion model, we examine, analyze, and formulate design and strategy principles for emotion-regulating multimodal interaction models in driving scenarios, drawing on the information-fusion hierarchy used in multimodal emotion recognition. We then propose interaction design solutions supported by further analysis of literature cases and user data, explore concepts and trends in multimodal interaction design for future human-computer interfaces, and aim to provide a research path and theoretical reference for more efficient, humanized, emotional, and immersive experience design.
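As an illustration of the information-fusion hierarchy the abstract refers to, the sketch below shows one common layer of that hierarchy, decision-level ("late") fusion over a discrete emotion model. This is a generic minimal example, not the paper's method: the modality names, weights, and probability values are hypothetical.

```python
# Illustrative sketch (assumptions, not the paper's implementation):
# decision-level ("late") fusion, one layer of the information-fusion
# hierarchy in multimodal emotion recognition. Each modality produces a
# probability distribution over discrete emotions; a reliability-weighted
# average combines them into a single fused distribution.

DISCRETE_EMOTIONS = ["anger", "fear", "joy", "sadness", "surprise", "neutral"]

def late_fusion(modality_probs, weights):
    """Weighted average of per-modality emotion probabilities.

    modality_probs: dict of modality name -> probabilities over
                    DISCRETE_EMOTIONS (each list sums to 1).
    weights:        dict of modality name -> reliability weight.
    """
    total = sum(weights[m] for m in modality_probs)
    fused = [0.0] * len(DISCRETE_EMOTIONS)
    for m, probs in modality_probs.items():
        w = weights[m] / total  # normalize weights over present modalities
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

# Hypothetical readings from facial expression and speech prosody.
probs = {
    "face":  [0.10, 0.05, 0.05, 0.10, 0.05, 0.65],
    "voice": [0.60, 0.10, 0.05, 0.10, 0.05, 0.10],
}
weights = {"face": 0.6, "voice": 0.4}
fused = late_fusion(probs, weights)
dominant = DISCRETE_EMOTIONS[fused.index(max(fused))]  # fused estimate
```

Feature-level (early) fusion would instead concatenate raw modality features before classification; decision-level fusion is shown here because it is simple to express and degrades gracefully when a modality is missing.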
Published in: 2024 International Conference on Electronics and Devices, Computational Science (ICEDCS)
Date of Conference: 23-25 September 2024
Date Added to IEEE Xplore: 16 January 2025