Abstract:
As large language models continue to advance in Artificial Intelligence (AI), text generation systems have been shown to suffer from a problematic phenomenon often termed "hallucination." However, with AI's increasing presence across various domains, including medicine, concerns have arisen regarding the use of the term itself. In this study, we conducted a systematic review across fourteen databases to identify papers defining "AI hallucination." We present and analyze the definitions obtained from all databases, categorize them by application, and extract key points within each category. Our results highlight a lack of consistency in how the term is used, but also identify several alternative terms in the literature. We discuss the implications of these findings and call for a more unified effort to bring consistency to this important contemporary AI issue, which can significantly affect multiple domains.
Published in: 2024 IEEE Conference on Artificial Intelligence (CAI)
Date of Conference: 25-27 June 2024
Date Added to IEEE Xplore: 30 July 2024