
A versatile and effective method for counting people on either RGB or depth overhead cameras

Luca Del Pizzo; Pasquale Foggia; Antonio Greco; Gennaro Percannella; Mario Vento
2015-01-01

Abstract

In this paper we present a novel method for counting people using overhead (zenithal-mounted) cameras. The proposed method is computationally efficient and provides accurate counts under a variety of realistic conditions. It can operate with either traditional surveillance cameras or depth imaging sensors. Validation was carried out on a substantial dataset of images, specifically designed and collected to account for the main factors that may affect counting accuracy: the acquisition technology (traditional RGB camera vs. depth sensor), the installation scenario (indoor vs. outdoor), and the density of the people flow (isolated people vs. groups). Results confirm that the method achieves an accuracy between 90% and 98%, depending on the sensor technology and the complexity of the scenario.
ISBN: 978-1-4799-7079-7
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11386/4651948

Citations
  • Scopus: 30