Doctoral thesis

Sensor-based recognition of engagement during work and learning activities


188 p.

Doctoral thesis: Università della Svizzera italiana, 2021

Personal computing systems such as laptops, smartphones, and smartwatches are nowadays ubiquitous in people's everyday lives. People use such systems not only to communicate or search for information, but also as digital companions able to track and support daily activities such as sleep, food intake, physical exercise, and even work. Sensors embedded in personal computing systems enable the continuous collection of heterogeneous data about their users: location, heart rate, and more can nowadays be measured reliably. Processed with machine learning and data analytics techniques, sensor data can be used to infer information about users, such as their type of activity, their behaviour, and even their affective states.

In this thesis, we investigate the feasibility of using data derived from personal devices to automatically recognize the affective state of engagement as it occurs during daily activities. We focus on two specific use cases: inferring students' engagement during learning activities and knowledge workers' engagement during work activities. Engagement, generally understood as the emotional and attentional involvement in an activity, is a well-known predictor of learning outcomes and job performance. Consequently, engagement-aware systems able to sense, recognize, and promote engagement have great potential for improving the learning and work experience.

Measuring engagement has for years been a central focus of research in psychology. Traditional methods such as self-reports and observations, which require significant manual effort from researchers and study participants, have long been used to derive knowledge about engagement. Bulky devices measuring physiological parameters, e.g., electrodermal activity and heart rate variability, have also been used to study engagement from a physiological perspective, mostly in laboratory settings or during pre-defined activities.
Today, taking advantage of the availability of personal devices and the sensors they are equipped with, computer science researchers are investigating methods for automatically measuring engagement in everyday activities with little or no effort from users. Despite the knowledge gained from years of research on engagement, its automatic assessment from sensor data remains a challenging goal. Indeed, there is no pre-defined mapping between sensor data and engagement, and it is not clear which transformations and combinations of data can provide a reliable engagement assessment. Furthermore, definitions and expressions of engagement are context-dependent, so a system aiming to infer engagement should be able to retrieve and use information about the user's context. In the work environment, however, context information such as the type of activity is difficult to infer: people use several tools to perform their tasks and work in different locations, alone and with others, which makes activity inference challenging.

In this thesis, we target two main problems: (1) the engagement recognition problem; and (2) the activity recognition problem. To evaluate our approaches, we designed and ran three user studies and collected data both in laboratory settings and in the wild, e.g., during lectures in the classroom and during actual work days. Further, we performed an extensive data analysis. Specifically, we first address the problem of transforming and combining sensor data for engagement recognition. To this end, in the first study presented in this thesis, we leveraged electrodermal activity data and proposed a method for translating findings from educational research into sensor data representations, i.e., features. We then used these features as input to machine learning algorithms with the aim of recognizing students' engagement during lectures.
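The abstract does not specify which features were derived from the electrodermal activity (EDA) signal. As an illustration only, a minimal sketch of the general idea, summarising a raw EDA window into descriptive features before classification, might look as follows; the feature names, the peak-counting heuristic, and the threshold are assumptions for this sketch, not the thesis's actual method.

```python
from statistics import mean, stdev

def eda_features(signal, peak_threshold=0.05):
    """Summarise one window of EDA samples (microsiemens) as a feature dict.

    The peak heuristic counts local maxima whose rise exceeds
    `peak_threshold` -- a rough stand-in for skin-conductance responses
    (SCRs), the phasic component often linked to arousal.
    """
    peaks = 0
    for i in range(1, len(signal) - 1):
        rising = signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]
        if rising and signal[i] - signal[i - 1] > peak_threshold:
            peaks += 1
    return {
        "mean_scl": mean(signal),    # tonic skin-conductance level proxy
        "std_scl": stdev(signal),    # overall variability in the window
        "scr_peaks": peaks,          # count of phasic-response-like peaks
        "slope": (signal[-1] - signal[0]) / len(signal),  # drift over window
    }

window = [0.30, 0.31, 0.30, 0.45, 0.33, 0.32, 0.50, 0.34, 0.33, 0.32]
feats = eda_features(window)
print(feats["scr_peaks"])  # 2 sharp rises in this synthetic window
```

Feature vectors of this kind, computed per time window, would then be fed to a standard machine learning classifier labelled with engagement annotations.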
In the second study, we proposed a novel method to recognize a behavioural expression, i.e., laughter, that can be used for recognizing engagement. We leveraged typical physiological and body movement reactions of laughter and quantified them using sensor data gathered from wristbands. In the third study, we investigated sensor fusion strategies based on traditional machine learning and deep learning, and combined physiological data, i.e., electrodermal activity and cardiac activity, with context information to recognize engagement during work activities.

Second, we address the problem of recognizing activities in the workplace. To this end, we proposed a method that combines behavioural expressions such as physiological activation, physical movement, and laptop and phone usage. We performed a thorough analysis and investigated which types of devices and sensor data bring relevant information, especially for distinguishing between work and break activities. We believe that the insights and technical contributions of this thesis will enable the design and development of engagement-aware systems able to support people during their daily activities.
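The fusion strategies themselves are not detailed in the abstract. One common option in this design space is late (decision-level) fusion, where each modality produces its own engagement score and the scores are then combined. The sketch below illustrates that idea only; the modality names, weights, and 0.5 decision threshold are assumptions for illustration, not the thesis's actual configuration.

```python
def late_fusion(modality_scores, weights=None, threshold=0.5):
    """Combine per-modality engagement probabilities by weighted average.

    modality_scores: dict mapping a modality name to a probability in
    [0, 1], e.g. {"eda": 0.8, "hrv": 0.6, "context": 0.7}.
    weights: optional per-modality weights; defaults to equal weighting.
    Returns the fused score and a boolean engagement decision.
    """
    if weights is None:
        weights = {m: 1.0 for m in modality_scores}
    total = sum(weights[m] for m in modality_scores)
    fused = sum(modality_scores[m] * weights[m] for m in modality_scores) / total
    return fused, fused >= threshold

score, engaged = late_fusion({"eda": 0.8, "hrv": 0.6, "context": 0.7})
# score is the unweighted mean of the three modality scores (~0.7),
# so the decision is "engaged"
```

Alternatives include early fusion (concatenating features from all modalities into one vector before training) and learned fusion layers in a deep network; late fusion is shown here only because it is the simplest to sketch.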
  • English
Computer science and technology
