Finite Bit Quantization for Decentralized Learning Under Subspace Constraints
Carpentiero, M.; Matta, V.
2022-01-01
Abstract
In this paper, we consider decentralized optimization problems where agents have individual cost functions to minimize, subject to subspace constraints that require the minimizers across the network to lie in low-dimensional subspaces. This constrained formulation includes consensus optimization as a special case and allows for more general task-relatedness models, such as multitask smoothness and coupled optimization. To cope with communication constraints, we propose and study a quantized differential-based approach in which the estimates communicated among agents are quantized. The analysis shows that, under general conditions on the quantization noise and for sufficiently small step-sizes µ, the strategy is stable in the mean-square-error sense. The analysis also reveals the influence of the gradient and quantization noises on performance.
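To illustrate the differential quantization idea the abstract describes, the following is a minimal Python sketch of decentralized learning where only quantized innovations are exchanged, using consensus (agreement across agents) as the special case of the subspace constraint. All names, dimensions, the quadratic local costs, and the uniform quantizer are illustrative assumptions, not the paper's exact construction or conditions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 10, 5           # number of agents, dimension of each local estimate
mu = 0.05              # small step-size (stability requires mu sufficiently small)
delta = 0.01           # granularity of the finite-bit uniform quantizer (assumed)
targets = rng.normal(size=(N, M))  # minimizers of the local quadratic costs

def quantize(x, step=delta):
    """Deterministic uniform quantizer with step `step` (illustrative choice)."""
    return step * np.round(x / step)

# Consensus as the special case of a subspace constraint:
# projector onto the agreement subspace (all agents share one minimizer).
P = np.ones((N, N)) / N  # averaging matrix acting across agents

w = np.zeros((N, M))      # local estimates
w_hat = np.zeros((N, M))  # quantized reconstructions shared over the network

for k in range(2000):
    # Adaptation step on J_i(w) = 0.5 * ||w - targets_i||^2, with additive
    # perturbation standing in for gradient noise.
    grad = (w - targets) + 0.1 * rng.normal(size=(N, M))
    psi = w - mu * grad

    # Differential quantization: transmit only the quantized innovation
    # (difference between the new estimate and the last reconstruction),
    # so each agent's reconstruction stays synchronized with its neighbors'.
    w_hat = w_hat + quantize(psi - w_hat)

    # Combination step: project the reconstructed estimates onto the subspace.
    w = P @ w_hat

print("deviation from projected optimum:",
      np.linalg.norm(w - np.mean(targets, axis=0)))
```

Transmitting the quantized innovation rather than the raw estimate keeps the per-iteration payload small while the reconstruction error remains bounded; in this toy run the estimates settle in a small neighborhood of the projected optimum whose size is driven by the gradient-noise level, the quantizer step, and µ, mirroring the qualitative behavior the abstract attributes to the mean-square-error analysis.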