
H(X) vs. H(f(X))

Gargano, Luisa; Vaccaro, Ugo
2017-01-01

Abstract

It is well known that the entropy H(X) of a finite random variable is always greater than or equal to the entropy H(f(X)) of a function f of X, with equality if and only if f is one-to-one. In this paper, we give tight bounds on H(f(X)) when the function f is not one-to-one, and we illustrate a few scenarios where this matters. As an intermediate step towards our main result, we prove a lower bound on the entropy of a probability distribution when only a bound on the ratio between the maximum and the minimum probability is known. Our lower bound improves previous results in the literature, and it could find applications outside the present scenario.
ISBN: 9781509040964
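
As a quick illustration of the inequality stated in the abstract, the sketch below computes H(X) and H(f(X)) for a non-injective f. It is not taken from the paper; the distribution p_x and the merging function f are illustrative assumptions. The gap between the two values reflects the identity H(X) = H(f(X)) + H(X | f(X)), whose conditional term vanishes exactly when f is one-to-one on the support of X.

import math

def entropy(probs):
    # Shannon entropy, in bits, of a probability vector.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative distribution for X (an assumption, not from the paper).
p_x = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}

def f(x):
    # A non-injective f: merges the outcomes 2 and 3 into a single value.
    return min(x, 2)

# Distribution of f(X): probabilities of merged outcomes add up.
p_fx = {}
for x, p in p_x.items():
    p_fx[f(x)] = p_fx.get(f(x), 0.0) + p

print(f"H(X)    = {entropy(p_x.values()):.4f} bits")   # 1.8464
print(f"H(f(X)) = {entropy(p_fx.values()):.4f} bits")  # 1.5710, strictly smaller

In this toy example the ratio between the maximum and the minimum probability of p_x is 4, which is the kind of side information under which the paper's lower bound on entropy applies.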
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11386/4702476
Warning: the displayed data have not been validated by the university.

Citations
  • Scopus: 3