Abstract:
This paper studies the task of commonsense inference, in particular natural language inference (NLI) and causal inference (CI), which requires knowledge beyond what is stated in the input sentences. The state of the art has been neural models powered by knowledge or contextual embeddings, for example BERT, as a source of commonsense knowledge. Our research questions are thus: Is BERT all we need for NLI and CI? If not, what information is missing, and where can it be found? While much work has studied what is captured in BERT, its limitations remain under-studied. Our contribution is to observe the limitations of BERT in commonsense inference and then to leverage complementary resources containing the missing information. Specifically, we model BERT and a complementary resource as two heterogeneous modalities and explore the pros and cons of multimodal integration approaches. We demonstrate that our proposed integration models achieve state-of-the-art performance on both NLI and CI tasks.
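As an illustrative sketch only (the paper's actual architecture is not given in this abstract), the following Python shows one common way to treat BERT and an external knowledge resource as two modalities and fuse them by late concatenation before classification. The knowledge vector, its dimension, and all names here are assumptions for illustration, not details from the paper.

```python
# A minimal late-fusion sketch, assuming a BERT sentence embedding is
# concatenated with a hypothetical feature vector from a complementary
# knowledge resource. Dimensions and the zero placeholder are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class LateFusionClassifier(nn.Module):
    def __init__(self, knowledge_dim=50, num_labels=3):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # Fuse the two modalities by concatenating their vectors.
        self.classifier = nn.Linear(hidden + knowledge_dim, num_labels)

    def forward(self, input_ids, attention_mask, knowledge_vec):
        # Modality 1: contextual sentence embedding from BERT (pooled [CLS]).
        pooled = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).pooler_output
        # Modality 2: knowledge_vec, features from a complementary resource.
        fused = torch.cat([pooled, knowledge_vec], dim=-1)
        return self.classifier(fused)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer("A man inspects the uniform. [SEP] The man is sleeping.",
                   return_tensors="pt")
knowledge_vec = torch.zeros(1, 50)  # placeholder for knowledge features
model = LateFusionClassifier()
logits = model(inputs["input_ids"], inputs["attention_mask"], knowledge_vec)
```

Concatenation is only the simplest integration strategy; the abstract's "pros and cons of multimodal integration approaches" suggests the paper compares several such fusion schemes.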
Published in: ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 04-08 May 2020
Date Added to IEEE Xplore: 09 April 2020