Time series are often an unfamiliar and hard-to-interpret data domain on their own. Even a time series visualization frequently does not convey all the information needed to understand trends and patterns. Explaining model decisions on such hard-to-read data adds another layer of uncertainty. Current explanations mostly take the form of heatmaps, but do experts (or even lay users) really understand such heatmaps? In this project, you design and implement time series explanations based on concepts from the literature and evaluate them in a user study.
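To make the idea of a heatmap explanation over a time series concrete, here is a minimal sketch of occlusion-based saliency: each timestep's importance is the drop in the model's score when a small window around it is blanked out. The function name, the toy model, and all parameters are illustrative assumptions, not part of this project description.

```python
import numpy as np

def occlusion_saliency(model, series, window=5, baseline=0.0):
    """Per-timestep importance: drop in the model's score when a
    window around each timestep is replaced by a baseline value."""
    base_score = model(series)
    saliency = np.zeros_like(series, dtype=float)
    for t in range(len(series)):
        occluded = series.copy()
        lo, hi = max(0, t - window // 2), min(len(series), t + window // 2 + 1)
        occluded[lo:hi] = baseline
        saliency[t] = base_score - model(occluded)
    return saliency

# Toy "model": scores a series by its maximum value, so the
# saliency heatmap should highlight the region around the peak.
series = np.sin(np.linspace(0, 2 * np.pi, 50))
model = lambda x: float(np.max(x))
sal = occlusion_saliency(model, series)
```

The resulting `sal` vector is exactly what is usually rendered as a color strip under (or over) the time series plot; whether such a strip is actually interpretable for users is one of the questions this project investigates.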
- What do users need in a visualization to obtain a solid explanation from a time series model?
- What do different user groups focus on in explanations?
- How can we best support users with proper explanations?
- Search the XAI literature on time series and collect explanation techniques.
- Implement an online tool for a user study of these explanation techniques.
- Conduct the user study and analyse the data.
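The analysis step above could start as simply as aggregating Likert-scale ratings per explanation technique. The sketch below assumes hypothetical ratings and technique names purely for illustration:

```python
import statistics

# Hypothetical 1-5 Likert ratings per explanation technique
# (invented data; real ratings come from the user study).
ratings = {
    "heatmap": [4, 3, 5, 2, 4],
    "prototype": [5, 4, 4, 5, 3],
    "counterfactual": [3, 3, 4, 2, 3],
}

# Mean and standard deviation per technique, highest-rated first.
summary = {
    name: (statistics.mean(vals), statistics.stdev(vals))
    for name, vals in ratings.items()
}
for name, (mean, sd) in sorted(summary.items(), key=lambda kv: -kv[1][0]):
    print(f"{name}: mean={mean:.2f}, sd={sd:.2f}")
```

A real analysis would add significance testing and per-user-group breakdowns, but descriptive statistics like these are the natural first pass over study data.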
- Scope: Bachelor/Master
- 3-month project, 3/6-month thesis
- Start: immediately
Jeyakumar, J. V., Noor, J., Cheng, Y. H., Garcia, L., & Srivastava, M. (2020). How can I explain this to you? An empirical study of deep neural network explanation methods. Advances in Neural Information Processing Systems.