Explaining Automated Vehicle Behavior at an Appropriate Abstraction Level and Timescale to Maintain Common Ground
Automation is becoming increasingly complex, taking on a larger share of the driving task and expanding its operational design domain to dynamic urban roads. Explainable AI (XAI) research in computer science aims to craft explanations that help people understand the behavior of complex algorithms. However, many XAI approaches rely on fixed-format explanations, which may not effectively support drivers with varying levels of automation knowledge or tasks that unfold over different timescales. Maintaining common ground is a multilevel process in which individuals and automation must adjust the format and abstraction of their communication to knowledge and time constraints. We first drew on existing research to argue that common ground is a shared understanding between drivers and automation that requires constant maintenance. We then applied the abstraction hierarchy (AH) modeling method, which describes complex systems at multiple levels of abstraction to match users' cognitive capacity, and modified it to translate vehicle and traffic data into multilevel explanations of automation behavior. We expanded the model into the abstraction–decomposition space, naming the result the Driver–Automation Teaming model, which generates explanations that account for task timescale. Using this model, we developed three human–machine interface concepts that demonstrate how it can improve XAI's support for driver–automation collaboration.