A versatile and effective method for counting people on either RGB or depth overhead cameras
Del Pizzo, Luca; Foggia, Pasquale; Greco, Antonio; Percannella, Gennaro; Vento, Mario
2015-01-01
Abstract
In this paper we present a novel method for counting people using overhead (zenithal) cameras. The proposed method is computationally efficient and provides accurate counts under a variety of realistic conditions. It can operate with either traditional surveillance cameras or depth imaging sensors. Validation has been carried out on a large dataset of images specifically designed and collected to account for the main factors affecting counting accuracy: the acquisition technology (traditional RGB camera or depth sensor), the installation scenario (indoor or outdoor), and the density of the people flow (isolated people or groups). Results confirm that the method achieves an accuracy between 90% and 98%, depending on the adopted sensor technology and on the complexity of the scenario.
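As a rough illustration of how an overhead, depth-based people counter of this general kind might be structured (this is not the authors' algorithm, which is detailed in the full paper), the following Python sketch segments a depth frame against an empty-scene reference and counts the resulting blobs. All names, thresholds, and the minimum blob area are placeholder assumptions for demonstration only.

```python
# Illustrative sketch only: a generic overhead depth-based people counter.
# NOT the method proposed in the paper; thresholds and helper names are
# placeholder assumptions chosen for demonstration purposes.
import numpy as np
import cv2


def count_people(depth_frame: np.ndarray,
                 background: np.ndarray,
                 height_threshold: float = 300.0,  # mm above the floor (assumed)
                 min_blob_area: int = 1500) -> int:  # pixels (assumed)
    """Count people in an overhead depth frame by simple blob analysis."""
    # Pixels closer to the camera than the empty-scene background
    # (i.e. with smaller depth values) are candidate foreground.
    diff = background.astype(np.float32) - depth_frame.astype(np.float32)
    foreground = (diff > height_threshold).astype(np.uint8)

    # Remove small noise with a morphological opening.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    foreground = cv2.morphologyEx(foreground, cv2.MORPH_OPEN, kernel)

    # Each sufficiently large connected component is counted as one person.
    num_labels, _, stats, _ = cv2.connectedComponentsWithStats(foreground)
    return sum(1 for i in range(1, num_labels)
               if stats[i, cv2.CC_STAT_AREA] >= min_blob_area)


if __name__ == "__main__":
    # Synthetic example: flat floor at 3000 mm with one "person" blob at 1300 mm.
    background = np.full((240, 320), 3000, dtype=np.uint16)
    frame = background.copy()
    yy, xx = np.mgrid[0:240, 0:320]
    frame[(yy - 120) ** 2 + (xx - 160) ** 2 < 30 ** 2] = 1300
    print(count_people(frame, background))  # expected output: 1
```

A comparable pipeline for RGB frames would typically replace the depth differencing with a background-subtraction step, since no per-pixel height information is available from a conventional camera.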