
Situation identification in smart wearable computing systems based on machine learning and Context Space Theory

D'Aniello G.; Gaeta M.; Rehman Z. U.
2024-01-01

Abstract

Wearable devices and smart sensors are increasingly adopted to monitor the behaviors of human and artificial agents. Many applications rely on the capability of such devices to recognize the daily life activities performed by the monitored users in order to tailor their behavior to the occurring situations. Despite the constant evolution of smart sensing technologies and the numerous research efforts in this field, accurately recognizing in-the-wild situations remains an open research challenge. This work proposes a novel approach to situation identification, capable of recognizing activities and the situations in which they occur across different environments and behavioral contexts by processing data acquired from wearable and environmental sensors. An architecture for a situation-aware wearable computing system is proposed, inspired by Endsley's situation-awareness model and consisting of a two-step approach to situation identification. The approach first identifies daily life activities via a learning-based technique. Simultaneously, the context in which the activities are performed is recognized using Context Space Theory. Finally, the fusion of the context state and the activities makes it possible to identify the complex situations in which the user is acting. The knowledge of these situations forms the basis on which novel and smarter applications can be built. The approach has been evaluated on the public ExtraSensory dataset and compared with state-of-the-art techniques, achieving an accuracy of 96% in situation recognition with significantly low computational time, demonstrating the efficacy of the two-step situation identification approach.
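The two-step pipeline described in the abstract can be sketched in a few lines of code. The following is a minimal illustrative sketch, not the paper's actual implementation: the activity classifier is reduced to a placeholder rule, the Context Space Theory situation spaces are toy attribute regions with made-up names and thresholds, and the fusion step simply checks that the recognized activity is plausible for the best-matching context region.

```python
# Hedged sketch of the two-step situation identification idea.
# All attribute names, regions, and thresholds below are illustrative
# assumptions, not values from the paper.

def classify_activity(features):
    # Step 1: stand-in for the learning-based activity recognizer.
    # A trivial rule on an assumed accelerometer-magnitude feature.
    return "walking" if features["accel_magnitude"] > 1.2 else "sitting"

# Step 2 (Context Space Theory): each situation space is a region
# in a multidimensional context space, one interval per attribute.
SITUATION_SPACES = {
    "commuting": {"hour": (7, 10), "speed_kmh": (3, 60)},
    "working":   {"hour": (9, 18), "speed_kmh": (0, 1)},
}

def membership(context_state, space):
    # Fraction of context attributes whose value falls inside the region.
    hits = [lo <= context_state[attr] <= hi for attr, (lo, hi) in space.items()]
    return sum(hits) / len(hits)

def identify_situation(features, context_state, plausible_activities):
    # Step 3: fuse the activity label with the best-matching context region.
    activity = classify_activity(features)
    best = max(SITUATION_SPACES,
               key=lambda s: membership(context_state, SITUATION_SPACES[s]))
    conf = membership(context_state, SITUATION_SPACES[best])
    # Accept the situation only if the activity is plausible for it
    # (the activity-to-situation mapping here is an assumption).
    if conf > 0.5 and activity in plausible_activities.get(best, ()):
        return best, activity, conf
    return "unknown", activity, conf
```

For example, a context state of 8 a.m. at 20 km/h with high accelerometer magnitude would fall entirely inside the "commuting" region and carry a plausible "walking" activity, so the fused situation would be "commuting" with full membership confidence.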
Files for this item:
No files are associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11386/4853272
Warning: the data displayed have not been validated by the university.

Citations
  • PMC: n/a
  • Scopus: 2
  • Web of Science (ISI): 0