
AI Hallucinations: A Misnomer Worth Clarifying


Abstract:

As large language models continue to advance in Artificial Intelligence (AI), text generation systems have been shown to suffer from a problematic phenomenon often termed "hallucination." However, with AI's increasing presence across various domains, including medicine, concerns have arisen regarding the use of the term itself. In this study, we conducted a systematic review to identify papers defining "AI hallucination" across fourteen databases. We present and analyze definitions obtained from all databases, categorize them based on their applications, and extract key points within each category. Our results highlight a lack of consistency in how the term is used, while also identifying several alternative terms in the literature. We discuss the implications of these findings and call for a more unified effort to bring consistency to this important contemporary AI issue, which can significantly affect multiple domains.
Date of Conference: 25-27 June 2024
Date Added to IEEE Xplore: 30 July 2024
Conference Location: Singapore, Singapore
