Exploring the Impact of Explanation Representation on User Satisfaction in Robot Navigation
The decisions made by autonomous robots strongly influence how humans perceive their behavior. One way to mitigate negative impressions of such decisions and to improve human comprehension of them is through explanation. We introduce visual and textual explanations integrated into robot navigation that take the surrounding environmental context into account. To gauge the effectiveness of our approach, we conducted a comprehensive user study assessing user satisfaction across different forms of explanation representation. Our empirical findings reveal a clear difference in user satisfaction: explanations presented in a multimodal format were rated significantly higher than those relying solely on a single modality.