This research investigates techniques that can improve the explanatory power of knowledge-based systems (KBS). Two types of enhancement are considered. First, deep explanations are textual descriptions of the underlying domain principles associated with system recommendations. Second, graphical hierarchies represent the structure of the KBS rule base so that end users can more easily visualize how a KBS reaches its recommendations. An experiment was conducted to investigate the effectiveness of the deep explanations and the hierarchic models, using a 2×2 factorial design: one factor, interface type, is whether a subject interacted with a hierarchic interface or an informationally equivalent flat interface; the second factor, explanation provision, is whether a subject was given deep explanation support. Multiple measurement methods were employed to understand the effects of the different treatments. Problem-solving performance, assessed by a series of tasks, and execution time were measured. Short-answer questions were administered to all subjects to gauge user preferences. Finally, to gain a richer understanding of problem-solving processes, all subject actions were captured in a computer log, from which problem-solving strategies were reconstructed and analyzed. Consistent with the task-technology fit perspective, we found support for the benefits of interfaces with greater explanatory power: problem-solving performance improved on certain tasks, and subjects were more likely to adopt correct problem-solving strategies when given the appropriate support.
IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, Volume 33, Issue 3
Date of Publication: May 2003