Cross-Modal Federated TinyML for MCU-based Internet of Medical Things
Ibrar K.; Fusco P.; Rimoli G. P.; Palmieri F.; Ficco M.
2026
Abstract
The Internet of Medical Things (IoMT) and Machine Learning (ML) have become increasingly popular in healthcare. Tiny wearable medical devices can collect and transmit personal health-related data. However, applying ML-driven IoT in healthcare presents several challenges, especially in remote patient monitoring: (i) latency and privacy issues can hinder the transmission of sensitive medical data; (ii) it may require correlating multiple, heterogeneous data streams collected by different medical devices; and (iii) the limited resources of ultra-low-power wearable devices prevent the implementation of complex ML models directly on-board. On the other hand, Federated Learning (FL) and TinyML can address these limitations by enabling collaborative model training across distributed, resource-constrained microcontroller-unit (MCU) based IoMT-edge devices, allowing local data processing and improving latency and energy efficiency. However, traditional FL mainly handles unimodal data, limiting its direct applicability in many real-world IoT healthcare scenarios. This work proposes a Cross-Modal Federated TinyML (TinyCFL) implementation for MCU-based medical devices, which employs an intermediate multimodal distributed data-fusion approach. Data from different modalities are processed independently on different tiny devices to extract features, which are then fused for healthcare tasks that require cross-modal reasoning. The proposed approach is evaluated on the “UP-Fall Detection” dataset under balanced, unbalanced, and participant-wise distribution scenarios. It has been designed for deployment in a distributed IoT-edge scenario, and a prototype of the TinyCFL approach based on resource-constrained MCUs has also been implemented.
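
As a purely illustrative companion to the abstract, the sketch below shows what the intermediate multimodal fusion and federated averaging ideas could look like in a small simulation. It is a minimal Python/NumPy sketch, not the authors' implementation: all tensor shapes, the modality choices (an accelerometer window and camera-derived pose features, as in typical fall-detection setups), the client sample counts, and the FedAvg-style aggregation rule are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

def encode(x, w, b):
    # Single dense layer + ReLU, standing in for a tiny on-device encoder.
    return np.maximum(0.0, x @ w + b)

def fedavg(client_params, client_sizes):
    # Sample-weighted average of parameter lists (FedAvg-style aggregation).
    total = float(sum(client_sizes))
    return [sum((n / total) * p[i] for p, n in zip(client_params, client_sizes))
            for i in range(len(client_params[0]))]

# Per-modality encoders on separate simulated MCU devices; all shapes are
# illustrative assumptions (3-axis x 50-sample accelerometer window = 150
# inputs; 17 pose keypoints x 2 coordinates = 34 inputs).
w_acc, b_acc = 0.1 * rng.normal(size=(150, 16)), np.zeros(16)
w_cam, b_cam = 0.1 * rng.normal(size=(34, 16)), np.zeros(16)

# Two clients holding the accelerometer modality train locally (simulated
# here as small weight perturbations) and exchange only model parameters.
client_a = [w_acc + 0.01 * rng.normal(size=w_acc.shape), b_acc]
client_b = [w_acc - 0.01 * rng.normal(size=w_acc.shape), b_acc]
w_acc, b_acc = fedavg([client_a, client_b], client_sizes=[120, 80])

# Raw sensor data never leaves the device that produced it.
x_acc = rng.normal(size=(1, 150))   # local accelerometer window
x_cam = rng.normal(size=(1, 34))    # local camera-derived pose features

f_acc = encode(x_acc, w_acc, b_acc)  # features extracted on device A
f_cam = encode(x_cam, w_cam, b_cam)  # features extracted on device B

# Intermediate fusion: only the compact feature vectors are shared and
# concatenated before the cross-modal decision head.
fused = np.concatenate([f_acc, f_cam], axis=1)            # shape (1, 32)
w_head, b_head = 0.1 * rng.normal(size=(32, 1)), np.zeros(1)
score = 1.0 / (1.0 + np.exp(-(fused @ w_head + b_head)))  # sigmoid fall score
print(f"fall probability: {float(score):.3f}")

The design point this illustrates is that intermediate fusion keeps raw, privacy-sensitive signals on the originating device and transmits only low-dimensional features, which is what makes such a scheme compatible with the latency, privacy, and memory constraints listed in the abstract.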


