
Answering Binary Causal Questions Using Role-Oriented Concept Embedding


Impact Statement:
Understanding causality is crucial for automatic question-answering systems, which are useful in extracting and distributing human knowledge. An automatic question-answering system with causal knowledge can be used to check whether there is a causal relationship between two concepts. Existing approaches to answering binary causal questions often achieve accuracy close to 55% due to their limited use of causal and contextual features. The deep learning framework we propose in this paper uses role-oriented concept embedding to address these issues. Our approach improves accuracy by up to 3.6% over the state-of-the-art benchmark approaches. The proposed approach can be used in a variety of fields, including prescriptive analysis, event prediction, and any other area where entity relationships are essential. It could also be used to improve the retrieval of causality-related inquiries in web search engines.

Abstract:

Answering binary causal questions is a challenging task that requires rich background knowledge. Extracting useful causal features from the background knowledge base and applying them effectively in a model is a crucial step toward answering binary causal questions. The state-of-the-art approaches apply deep learning techniques to answer binary causal questions. In these approaches, candidate concepts are often embedded into vectors to model causal relationships among them. However, a concept may play the role of a cause in one question but be an effect in another. This aspect has not been extensively explored in existing approaches. Role-oriented causal concept embeddings are proposed in this article to model causality between concepts. We also propose leveraging semantic concept similarity to extract causal information from concepts. Finally, we develop a deep learning framework to answer binary causal questions. Our approach yields accuracy...
Published in: IEEE Transactions on Artificial Intelligence ( Volume: 4, Issue: 6, December 2023)
Page(s): 1426 - 1436
Date of Publication: 05 September 2022
Electronic ISSN: 2691-4581

I. Introduction

Binary questions are frequently asked to confirm given information and can be answered with a yes or a no. Similarly, binary causal questions (BCQs) ask whether or not a causal relationship exists between two candidate concepts. Fig. 1 depicts an example BCQ. In the example question, “Could COVID-19 cause lung failure?,” COVID-19 and lung failure are the two candidate concepts, and the question asks whether or not there is a causal relationship between them.
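The role-oriented idea behind the paper, that a concept embeds differently when acting as a cause than when acting as an effect, can be illustrated with a minimal sketch. Everything below (the vectors, the threshold, and the function names) is hypothetical and illustrative; it is not the paper's actual model or learned embeddings.

```python
import math

# Hypothetical role-oriented embeddings: each concept gets TWO vectors,
# one for its role as a cause and one for its role as an effect.
# The numeric values are made up purely for illustration.
CAUSE_EMB = {
    "covid-19":     [0.9, 0.1, 0.3],
    "lung failure": [0.2, 0.8, 0.1],
}
EFFECT_EMB = {
    "covid-19":     [0.3, 0.2, 0.7],
    "lung failure": [0.8, 0.2, 0.4],
}

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def answer_bcq(candidate_cause, candidate_effect, threshold=0.6):
    """Answer a binary causal question: does the cause-role vector of
    the first concept align with the effect-role vector of the second?"""
    score = cosine(CAUSE_EMB[candidate_cause], EFFECT_EMB[candidate_effect])
    return "yes" if score >= threshold else "no"

# "Could COVID-19 cause lung failure?"
print(answer_bcq("covid-19", "lung failure"))  # yes
# Reversing the roles gives a different answer, because each concept
# uses a different vector depending on whether it is cause or effect.
print(answer_bcq("lung failure", "covid-19"))  # no
```

Note the asymmetry: a single embedding per concept would score both directions identically, whereas separate cause-role and effect-role vectors let the model answer "Could A cause B?" and "Could B cause A?" differently.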

