Development of an Adaptable and Customizable Evaluation Platform for the Study of Emotional Behavior in Situations of Multiple Solicitations

By Régis Mollard, Marion Wolff, Nadine Couture, Alexis Clay

This paper presents a platform for evaluating user experience related to innovative concepts in evolving socio-technical systems. Developing a system involves performing formative evaluations (during the design phase) and summative evaluations (at the end of the design phase). Designing or redesigning a system means creating new human-human and human-system interactions through technological innovation, but also creating new uses. Anticipating these uses and evaluating these interactions requires setting up experiments and tests with methods and tools to capture, analyze, and interpret the activities and behaviors of individuals while they are interacting. Beyond use itself (utility, usability, meaning), users' emotions may be a factor in the performance of a system. The platform combines prototyping methods and tools. In this paper, we present a method for assessing the emotional processes of subjects faced with multiple solicitations during the performance of a monitoring task. Through a usage scenario, the paper also presents the evaluation protocol and the methods chosen for their relevance to the situation. The protocol relies on an attentional test derived from the Multi-Attribute Task (MAT), in which subjects are diverted from their main task by agents/stressors/distractors. With the exception of heart rate, the parameters used to characterize the subject's behavior, both in terms of performance and of emotional reactions, are sensitive to the effect of task complexity. This was also the case for subjective assessments (difficulty of the task, self-assessment of performance, emotions, etc.). Using multivariate analysis (Principal Component Analysis), the joint analysis of performance and verbalizations made it possible to identify individual strategies depending on the type of distractor the subject had to manage. The emotional response is evaluated from different angles along a gradient of difficulty; it can be interpreted according to the usage scenario and related to the level of performance. The platform and the associated method described in this paper highlight the importance of deploying as many measurement means as possible in real time, while making choices that limit the intrusiveness of the tools used and selecting methods appropriate for statistical analysis.
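As an illustration of the multivariate analysis step, the minimal sketch below (Python with scikit-learn) shows how standardized performance measures and subjective ratings could be projected onto principal components to look for groupings that may reflect individual strategies. The variable names, synthetic data, and library choice are assumptions for illustration only and do not reproduce the authors' dataset or analysis pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical per-subject observations combining performance measures
# (e.g., reaction time, error rate on the monitoring task) and subjective
# ratings (e.g., perceived difficulty, self-assessed performance, arousal).
# Names and values are illustrative placeholders, not the study's data.
feature_names = [
    "reaction_time",
    "error_rate",
    "perceived_difficulty",
    "self_assessed_performance",
    "emotional_arousal",
]
rng = np.random.default_rng(0)
X = rng.normal(size=(24, len(feature_names)))  # 24 hypothetical subjects

# Standardize so each variable contributes comparably, then project onto
# the first two principal components.
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)

print("Explained variance ratio:", pca.explained_variance_ratio_)
for name, loading in zip(feature_names, pca.components_[0]):
    print(f"PC1 loading for {name}: {loading:+.2f}")
```

In such a sketch, subject scores on the first components could then be examined per distractor type to see whether performance and self-report variables separate groups of subjects, which is one common way a PCA-based joint analysis is read.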

Keywords

  • Use-Test
  • multiple solicitations
  • evaluation
  • interactive behavior
  • multivariate analysis