Minimum-Entropy Couplings and Their Applications
Gargano, Luisa; Vaccaro, Ugo
2019
Abstract
Given two discrete random variables X and Y, with probability distributions p = (p1, ⋯, pn) and q = (q1, ⋯, qm), respectively, let us denote by C(p, q) the set of all couplings of p and q, that is, the set of all bivariate probability distributions that have p and q as marginals. In this paper, we study the problem of finding a joint probability distribution in C(p, q) of minimum entropy (equivalently, a coupling that maximizes the mutual information between X and Y), and we discuss several situations where the need for this kind of optimization naturally arises. Since the optimization problem is known to be NP-hard, we give an efficient algorithm to find a joint probability distribution in C(p, q) whose entropy exceeds the minimum possible by at most 1 bit, thus providing an approximation algorithm with an additive gap of at most 1 bit. Leveraging this algorithm, we extend our result to the problem of finding a minimum-entropy joint distribution of an arbitrary number k ≥ 2 of discrete random variables X1, ⋯, Xk, consistent with the k known marginal distributions of the individual random variables X1, ⋯, Xk. In this case, our algorithm has an additive gap of at most log k from the optimum. We also discuss several related applications of our findings, as well as extensions of our results to entropies other than the Shannon entropy.
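To make the objects in the abstract concrete, the sketch below builds a valid coupling of two marginals with a simple greedy rule (repeatedly match the largest remaining masses of p and q) and computes its Shannon entropy. This is only an illustrative heuristic, not the paper's approximation algorithm; the names `greedy_coupling` and `entropy` are ad-hoc choices, and the 1-bit guarantee stated above is not claimed for this rule. Recall that, since the marginals fix H(X) and H(Y), minimizing the joint entropy H(X, Y) = H(X) + H(Y) − I(X; Y) is the same as maximizing the mutual information I(X; Y), and any coupling satisfies H(X, Y) ≥ max(H(X), H(Y)).

```python
import heapq
from math import log2

def greedy_coupling(p, q):
    """Build a coupling of p and q by repeatedly matching the largest
    remaining masses and assigning their minimum to a joint cell.
    Returns {(i, j): mass} whose row/column sums are p and q."""
    hp = [(-pi, i) for i, pi in enumerate(p) if pi > 0]  # max-heaps via negation
    hq = [(-qj, j) for j, qj in enumerate(q) if qj > 0]
    heapq.heapify(hp)
    heapq.heapify(hq)
    joint = {}
    while hp and hq:
        pi, i = heapq.heappop(hp)   # largest remaining mass of p (negated)
        qj, j = heapq.heappop(hq)   # largest remaining mass of q (negated)
        m = min(-pi, -qj)
        joint[(i, j)] = joint.get((i, j), 0.0) + m
        if -pi - m > 1e-12:         # push back any leftover mass
            heapq.heappush(hp, (pi + m, i))
        if -qj - m > 1e-12:
            heapq.heappush(hq, (qj + m, j))
    return joint

def entropy(masses):
    """Shannon entropy, in bits, of a collection of probability masses."""
    return -sum(m * log2(m) for m in masses if m > 0)

p = [0.5, 0.3, 0.2]
q = [0.6, 0.4]
M = greedy_coupling(p, q)
print(M)                            # cells of the coupling
print(entropy(M.values()))          # entropy of this coupling
print(max(entropy(p), entropy(q)))  # lower bound for any coupling
```

The last printout gives a quick sanity check: comparing the heuristic coupling's entropy against the max(H(p), H(q)) lower bound shows how far, at most, it can be from optimal.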
| File | Size | Format | |
|---|---|---|---|
| Minimization-revised-18Jan2019 (1).pdf — open access; Type: Post-print document (version following peer review and accepted for publication); License: Creative Commons | 495.4 kB | Adobe PDF | View/Open |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


