Explainable Artificial Intelligence improves human decision-making: Results from a mushroom picking experiment at a public art festival

[Teaser image: HOXAI Ars Study]

Abstract

Explainable Artificial Intelligence (XAI) enables Artificial Intelligence (AI) to explain its decisions. This holds the promise of making AI more understandable to users, improving interaction, and establishing an adequate level of trust. We tested this claim in the high-risk task of AI-assisted mushroom hunting, in which people had to decide whether a mushroom was edible or poisonous. In a between-subjects experiment, 328 visitors to an Austrian media art festival played a tablet-based mushroom hunting game while walking through a highly immersive artificial indoor forest. As part of the game, an artificially intelligent app analyzed photos of the mushrooms they found and recommended classifications. One group saw only the AI’s decisions, while a second group additionally received attribution-based and example-based visual explanations of the AI’s recommendations. The results show that participants with visual explanations outperformed participants without explanations in correct edibility assessments and pick-up decisions. This exhibition-based experiment thus replicated the decision-making results of a previous online study. Unlike in the previous study, however, the visual explanations did not significantly affect levels of trust or acceptance. We therefore compare the two studies directly and discuss the findings in terms of generalizability. Beyond the scientific contribution, we discuss the direct impact that conducting XAI experiments in immersive, art- and game-based exhibition environments has on visitors and local communities by triggering reflection on and awareness of psychological issues in human–AI interaction.


Citation

Benedikt Leichtmann, Andreas Hinterreiter, Christina Humer, Marc Streit, Martina Mara
Explainable Artificial Intelligence improves human decision-making: Results from a mushroom picking experiment at a public art festival
International Journal of Human–Computer Interaction, doi:10.1080/10447318.2023.2221605, 2023.

BibTeX

@article{leichtmann2023explainable,
    title = {Explainable Artificial Intelligence improves human decision-making: Results from a mushroom picking experiment at a public art festival},
    author = {Benedikt Leichtmann and Andreas Hinterreiter and Christina Humer and Marc Streit and Martina Mara},
    journal = {International Journal of Human–Computer Interaction},
    publisher = {Taylor \& Francis},
    doi = {10.1080/10447318.2023.2221605},
    month = {June},
    year = {2023}
}

Acknowledgements

We thank Birke van Maartens for the artistic concept and scenery design of the artificial forest, Nives Meloni for coordinating the exhibition area, Leonie Haasler and Gabriel Vitel for constructing the artificial forest (stage building), Moritz Heckmann for helping to implement the game, Kenji Tanaka for the sound design, Stefan Eibelwimmer for the graphic design of the tablet game, and Christopher Lindinger for helping to conceptualize the game. We also thank the Johannes Kepler University press team as well as Roman Peherstorfer and his team for the video documentation of the installation. Furthermore, we thank the student assistants and colleagues from the Robopsychology Lab at Johannes Kepler University Linz who actively supported the installation and data collection and who spontaneously stepped in to help when the number of visitors was high. Finally, we thank Dr. Otto Stoik and the members of the Mycological Working Group (MYAG) at the Biology Center Linz, Austria, who supported the development of items for the mushroom knowledge test and provided mushroom images for this study. We also thank the members of the German Mycological Society (DGfM) for providing additional images.