
Network Compression: Memory-Assisted Universal Coding of Sources with Correlated Parameters

2012-10-08
Ahmad Beirami, Faramarz Fekri

Abstract

In this paper, we propose {\em distributed network compression via memory}. We consider two spatially separated sources with correlated unknown source parameters. We wish to study the universal compression of a sequence of length $n$ from one of the sources provided that the decoder has access to (i.e., has memorized) a sequence of length $m$ from the other source. In this setup, the correlation does not arise from symbol-by-symbol dependency of the two outputs of the two sources (as in the Slepian-Wolf setup). Instead, the two sequences are correlated because they originate from two sources with \emph{unknown} correlated parameters. The finite-length nature of the compression problem at hand requires considering a notion of almost lossless source coding, where coding incurs an error probability $p_e(n)$ that vanishes as the sequence length $n$ grows to infinity. We obtain bounds on the redundancy of almost lossless codes when the decoder has access to a random memory of length $m$ as a function of the sequence length $n$ and the permissible error probability $p_e(n)$. Our results demonstrate that distributed network compression via memory has the potential to significantly improve over conventional end-to-end compression when sufficiently large memory from previous communications is available to the decoder.
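The abstract does not reproduce the paper's bounds, but the memory-assisted idea can be illustrated with a toy experiment: two i.i.d. Bernoulli sources whose biases are close (standing in for "correlated unknown parameters"), a standard Krichevsky-Trofimov (KT) sequential coder for the length-$n$ sequence, and a memory-assisted variant that warm-starts its symbol counts from the memorized length-$m$ sequence of the other source. Everything below (the parameter values `theta` and `theta_mem`, the lengths `n` and `m`, and the warm-start heuristic itself) is an illustrative assumption, not the authors' construction; this is a minimal sketch of the idea, not the paper's scheme.

```python
import math
import random

def kt_codelength(seq, a0=0.5, b0=0.5):
    """Krichevsky-Trofimov sequential codelength (in bits) for a binary sequence,
    starting from pseudo-counts a0 (symbol 0) and b0 (symbol 1)."""
    a, b = a0, b0
    bits = 0.0
    for x in seq:
        p1 = b / (a + b)                      # predictive probability of symbol 1
        p = p1 if x == 1 else 1.0 - p1
        bits += -math.log2(p)                 # ideal codelength of this symbol
        if x == 1:
            b += 1.0
        else:
            a += 1.0
    return bits

random.seed(0)
theta = 0.12        # unknown parameter of the source to be compressed (assumption)
theta_mem = 0.13    # correlated parameter of the memorized source (assumption: close to theta)
n, m = 512, 4096

x = [1 if random.random() < theta else 0 for _ in range(n)]      # sequence to compress
y = [1 if random.random() < theta_mem else 0 for _ in range(m)]  # memorized side sequence

# Universal compression without memory: KT coder started from the uniform prior.
bits_no_memory = kt_codelength(x)

# Memory-assisted compression (toy heuristic): warm-start the counts with the
# memorized sequence, exploiting the closeness of the two unknown parameters.
ones = sum(y)
bits_with_memory = kt_codelength(x, a0=0.5 + (m - ones), b0=0.5 + ones)

entropy = n * (-theta * math.log2(theta) - (1 - theta) * math.log2(1 - theta))
print(f"entropy of x^n       : {entropy:8.1f} bits")
print(f"KT, no memory        : {bits_no_memory:8.1f} bits")
print(f"KT, memory-assisted  : {bits_with_memory:8.1f} bits")
```

With parameters this close, the warm-started coder typically spends fewer bits above the sequence entropy than the memoryless KT coder, whose redundancy grows roughly as $\frac{1}{2}\log_2 n$; this loosely mirrors, but does not reproduce, the redundancy reduction the paper quantifies as a function of $n$, $m$, and $p_e(n)$.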

URL

https://arxiv.org/abs/1210.2144

PDF

https://arxiv.org/pdf/1210.2144

