
Exploring Explainability: A Definition, a Model, and a Knowledge Catalogue


Abstract:

The growing complexity of software systems and the influence of software-supported decisions in our society have awakened the need for software that is transparent, accountable, and trustworthy. Explainability has been identified as a means to achieve these qualities. It is recognized as an emerging non-functional requirement (NFR) that has a significant impact on system quality. However, in order to incorporate this NFR into systems, we need to understand what explainability means from a software engineering perspective and how it impacts other quality aspects of a system. This allows for an early analysis of the benefits and possible design issues that arise from interrelationships between different quality aspects. Nevertheless, explainability is currently under-researched in the domain of requirements engineering, and there is a lack of conceptual models and knowledge catalogues that support the requirements engineering process and system design. In this work, we bridge this gap by proposing a definition, a model, and a catalogue for explainability. They illustrate how explainability interacts with other quality aspects and how it may impact various quality dimensions of a system. To this end, we conducted an interdisciplinary Systematic Literature Review and validated our findings with experts in workshops.
Date of Conference: 20-24 September 2021
Date Added to IEEE Xplore: 18 November 2021
Conference Location: Notre Dame, IN, USA

I. Introduction

We live in the age of artificial intelligence (AI). Software decision-making has spread from simple daily decisions, such as choosing a navigation route, to more critical ones, such as diagnosing cancer patients [2]. Such systems strongly influence various aspects of our lives through their outputs, yet they can be as opaque as black boxes to us [3]. This ubiquitous influence of black-box systems has sparked discussions about the transparency and ethics of modern systems [4]. Responsible collection and use of data, privacy, and safety are just a few among many concerns. It is crucial to understand how to incorporate these concerns into systems and, thus, how to deal with them during requirements engineering (RE).
