Rate-Distortion-Perception Tradeoff for Gaussian Vector Sources
This paper studies the rate-distortion-perception (RDP) tradeoff for a Gaussian vector source coding problem in which the goal is to compress a multi-component source subject to both distortion and perception constraints. Specifically, the RDP setting with either the Kullback-Leibler (KL) divergence or the Wasserstein-2 metric as the perception loss function is examined, and it is shown that for Gaussian vector sources, jointly Gaussian reconstructions are optimal.
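As an illustration only (not taken from the paper), the Wasserstein-2 perception metric between two Gaussian distributions admits a closed form that is straightforward to evaluate numerically. The following Python sketch assumes hypothetical mean vectors and covariance matrices; the function name gaussian_w2_squared is an illustrative choice.

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(mu1, cov1, mu2, cov2):
    # Squared Wasserstein-2 distance between N(mu1, cov1) and N(mu2, cov2),
    # using the closed-form expression for Gaussian distributions.
    mean_term = np.sum((mu1 - mu2) ** 2)
    sqrt_cov1 = sqrtm(cov1)
    cross = sqrtm(sqrt_cov1 @ cov2 @ sqrt_cov1)
    cov_term = np.trace(cov1 + cov2 - 2.0 * np.real(cross))
    return mean_term + cov_term

# Hypothetical two-component source and a candidate reconstruction distribution.
mu_x, cov_x = np.zeros(2), np.array([[2.0, 0.5], [0.5, 1.0]])
mu_y, cov_y = np.zeros(2), np.array([[1.5, 0.3], [0.3, 0.8]])
print(gaussian_w2_squared(mu_x, cov_x, mu_y, cov_y))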
ISIT Awards Session Brochures
At each ISIT, the leadership of the IEEE Information Theory Society presents a number of awards, including the Thomas M. Cover Dissertation Award and the James L. Massey Research & Teaching Award for Young Scholars.
Source Coding for Markov Sources With Partial Memoryless Side Information at the Decoder
We consider the one-helper source coding problem posed and investigated by Ahlswede, Körner, and Wyner for a class of information sources with memory. For this class of information sources we give explicit inner and outer bounds on the admissible rate region. We also identify a nontrivial class of information sources for which the inner and outer bounds match.
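As background (not stated in the abstract), in the memoryless special case the admissible rate region of this one-helper problem is known in single-letter form: writing X for the source reproduced at the decoder and Y for the helper's observation, a rate pair (R_1, R_2) is admissible if and only if there exists an auxiliary random variable U, with U - Y - X forming a Markov chain, such that
\[
R_1 \ge H(X \mid U), \qquad R_2 \ge I(Y; U).
\]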