ABSTRACT


The rise of consumer-focused Artificial Intelligence (AI) applications such as ChatGPT, DALL-E 2, Stable Diffusion, and Jasper AI has made these technologies more accessible to the public. Yet despite using these applications regularly, most people still perceive the underlying technology as complex and opaque and lack a clear understanding of how these systems work. Explainable Artificial Intelligence (XAI) aims to address this issue by developing explainable models. However, current XAI systems primarily cater to technical users with prior knowledge of the field, leaving non-expert users with little support. This study investigates the introduction of Interactive Installation Art (IIA) into the design of XAI systems as a prospective way to enhance initial user engagement and make these systems more accessible to non-expert users by allowing them to interact with the system's properties in a playful and informal manner.

The study presents the IIA for XAI (IIA4XAI) framework, which serves as a foundation for using IIA as a design technique in the development of XAI systems. It offers guidance to designers and engineers who aim to include non-expert users in their target audience. The framework is grounded in an exploration of XAI and IIA theories and in a preliminary study conducted through a survey. To evaluate the framework, a case study was carried out in which an interactive art installation built around AI image-generation technology was developed. Qualitative user tests of the installation prototype demonstrated the effectiveness of the IIA4XAI framework in engaging non-expert users with AI systems. Overall, this study highlights the potential of the IIA4XAI framework as a design technique for making AI systems more understandable and accessible to a broader audience of non-expert users.