ONFIRE Contest 2023: Real-Time Fire Detection on the Edge

Gragnaniello D.; Greco A.
2024-01-01

Abstract

The ONFIRE Contest 2023 is a competition, organized within the ICIAP 2023 conference, among deep learning methods for the real-time recognition of fire from videos on edge devices. This topic is attracting various research groups, both for the underlying safety reasons and for the growing need for systems that protect the territory from the enormous damage that fires can cause. Participants are required to design fire detection methods starting from a training set consisting of videos in which fire (flames and/or smoke) is present (positive samples) and videos that do not contain fire (negative samples). The videos have been collected from existing datasets, selecting as positive samples only those that actually frame a fire, rather than flames or smoke in controlled conditions, and as negative samples those that contain moving objects that can be confused with flames or smoke. Since the videos were collected in different conditions, the dataset is very heterogeneous in terms of image resolution, illumination, pixel size of the flame or smoke, background activity, and scenario (urban or wildfire). The submitted methods are evaluated on a private test set whose videos differ from those available in the training set; this choice makes it possible to test the approaches in realistic conditions, namely in unknown operative scenarios. The proposed experimental protocol measures not only the accuracy but also the computational resources required by the methods, so that the top-ranked approaches will be both effective and suited for real-time processing on the edge.
Year: 2024
ISBN: 978-3-031-51022-9 / 978-3-031-51023-6

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11386/4856300