Jens Schneider
ABSTRACT
ESRA – towards explicability
Shahana Mohammed Nuhu
College of Science and Engineering, Hamad Bin Khalifa University
Jens Schneider
College of Science and Engineering, Hamad Bin Khalifa University
Mowafa Househ
College of Science and Engineering, Hamad Bin Khalifa University
This work attempts to interlace art, therapy, and technology, an area that remains largely unexplored. Surprisingly, the literature on this topic is sparse, even though it undoubtedly offers many possibilities for research.
In today's world of growing pandemonium, where little time is left to perceive feelings, understanding the psyche of the people around us has become compelling. Especially during this pandemic, and especially with children, the mental trauma they may be dealing with is not always expressed openly. This is where artwork and art therapy come into the picture. By giving the concept of art therapy a 'technology touch', it can be promoted effectively and extensively among people.
Children, like adults, have a full spectrum of feelings. The world around them affects them emotionally, whether triggered by situational difficulties, family problems, traumatic events, over-stimulating media content, or dysfunction in the community. Yet they may find it hard to express their feelings verbally, be it out of fear or an inability to articulate them. In such situations, direct communication becomes difficult, and artwork wins.
ESRA (Emotion Sensing Recognition) automatically assesses a photograph of a child-drawn picture to bridge the gap between a child, their caregiver, and the therapist. The caregiver can use the app to identify whether there is reason for concern in an understandable way: our method produces results in descriptive, plain English that anyone can easily understand. If parents see a red flag, they can initiate a dialogue with the child as well as with an art therapist. From the result, the art therapist also gets an idea of potential issues and can begin their analysis and therapy with additional knowledge.

The key idea behind all of this is explainable AI. The tool developed in this context is just that: a tool to assist, not a diagnostic panacea. Instead of being an inexplicable black box displaying overwhelming amounts of numeric results, the machine explains its findings in layman's terms with the end-user in mind: What are the observations? What do they imply? Why is it a problem? What can be done? End-users can thus make an informed decision on whether or not to accept the results. This gives the power of decision back to the user, addressing a major ethical concern in modern AI-based systems (autonomy).
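To make this idea concrete, the following is a minimal sketch of such an explanation step, assuming a drawing classifier that emits per-indicator confidence scores. All indicator names, thresholds, and report wording below are illustrative assumptions, not ESRA's actual model, vocabulary, or output.

# Hypothetical sketch: turning numeric classifier scores into the kind of
# plain-English report described above. Indicator names and templates are
# illustrative assumptions only.

# Layman's-terms templates for indicators a drawing classifier might report.
EXPLANATIONS = {
    "dark_color_dominance": {
        "observation": "The drawing is dominated by dark colors.",
        "implication": "Heavy use of dark colors can sometimes accompany sadness or anxiety.",
        "suggestion": "Consider gently asking the child how they are feeling.",
    },
    "isolated_figure": {
        "observation": "One figure is drawn far apart from the others.",
        "implication": "This can, in some cases, reflect feelings of loneliness or exclusion.",
        "suggestion": "An art therapist can help explore this in a follow-up session.",
    },
}

def explain(scores: dict[str, float], threshold: float = 0.6) -> str:
    """Turn raw model scores into a short, readable report for the caregiver."""
    lines = []
    for indicator, score in scores.items():
        template = EXPLANATIONS.get(indicator)
        if template is None or score < threshold:
            continue  # stay silent on unknown or low-confidence indicators
        lines.append(
            f"- {template['observation']} (confidence: {score:.0%})\n"
            f"  What it may imply: {template['implication']}\n"
            f"  What you can do: {template['suggestion']}"
        )
    if not lines:
        return "No indicators of concern were detected. This is not a diagnosis."
    return ("The following observations are hints, not a diagnosis; "
            "the decision to act on them remains yours:\n" + "\n".join(lines))

# Example with scores as they might come from an image classifier:
print(explain({"dark_color_dominance": 0.82, "isolated_figure": 0.41}))

The report deliberately states observation, implication, and suggested action in plain English and leaves the final decision with the user, mirroring the autonomy concern raised above.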
BIO
Dr. Jens Schneider is an assistant professor in the Division of Information and Computing Technology, College of Science and Engineering, HBKU. He received his diploma in Computer Science from RWTH Aachen, Germany, in 2004 and his doctorate in Computer Science from Technical University Munich, Germany, in 2009. His main research interests include interactive and GPU-based visualization, GPU-friendly data compression, large-scale visualization, and level-of-detail algorithms. He has co-authored over 50 peer-reviewed papers, patents, and book chapters on topics ranging from visualization and IoT to health informatics.