Improving the experimental analysis of tampered image detection algorithms for biometric systems

Cattaneo, Giuseppe; Roscigno, Gianluca; Ferraro Petrillo, Umberto
2018-01-01

Abstract

In this paper we deal with the experimental evaluation of tampered image detection algorithms. These algorithms aim at establishing whether any manipulation has been carried out on a digital image. In particular, we focus on the evaluation of the CASIA Tampered Image Detection Evaluation (CASIA TIDE) public dataset of images, the de facto standard for evaluating this class of algorithms. Our analysis has been performed using the algorithm of Lin et al. for JPEG tampered image detection as a benchmark. The results showed that the images of the dataset contain some statistical artifacts that may help the detection process. To confirm this, we first used this dataset to evaluate the performance of the Lin et al. algorithm. According to our results, the considered algorithm performs very well on this dataset. Some variants of the original algorithm, expressly tuned to these artifacts, have then been developed. These variants performed better than their original counterpart. Then a new unbiased dataset has been assembled and a new set of experiments has been executed on these images. The results showed that the performance of the algorithm and its variants decreased radically, proving that the CASIA TIDE statistical artifacts interfere with the detection process. This problem is particularly important in the biometric field, because many image-based biometric systems rely on the assumption that input images have not been manipulated. Indeed, a faithful experimental evaluation must be based on an unbiased input dataset to obtain well-founded results. Therefore, the selection of a reliable image tampering detection algorithm is crucial. A preliminary version of this work has been presented in Cattaneo and Roscigno (2014) [6].
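
As background for the kind of analysis the abstract refers to, the following minimal Python sketch illustrates the double-quantization effect that JPEG tampering detectors in the spirit of Lin et al. rely on. It is not the authors' implementation: the function names (quantize, histogram_periodicity), the quantization steps 5 and 7, and the synthetic Laplacian-distributed DCT coefficients are all illustrative assumptions.

import numpy as np

def quantize(coeffs, step):
    # Simulate JPEG quantization followed by dequantization of DCT coefficients.
    return np.round(coeffs / step) * step

def histogram_periodicity(coeffs, step, max_bin=60):
    # Score the periodicity of the coefficient histogram: a pronounced peak in
    # the Fourier spectrum of the histogram (DC term excluded) is a common
    # symptom of double quantization.
    bins = np.arange(-max_bin, max_bin + 1)
    hist, _ = np.histogram(np.round(coeffs / step), bins=bins)
    spectrum = np.abs(np.fft.rfft(hist - hist.mean()))
    return spectrum[1:].max() / (spectrum[1:].mean() + 1e-9)

rng = np.random.default_rng(0)
dct = rng.laplace(scale=8.0, size=100_000)        # synthetic AC coefficients

single = quantize(dct, step=5)                    # compressed once (step 5)
double = quantize(quantize(dct, step=7), step=5)  # compressed twice (7, then 5)

print("single-compression periodicity score:", histogram_periodicity(single, 5))
print("double-compression periodicity score:", histogram_periodicity(double, 5))
# The doubly quantized histogram typically yields a noticeably higher score.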

Use this identifier to cite or link to this document: https://hdl.handle.net/11386/4702189

Citations
  • PubMed Central: n/a
  • Scopus: 6
  • Web of Science (ISI): 3