
Slepian–Wolf coding

In information theory and communication, Slepian–Wolf coding, also known as the Slepian–Wolf bound, is a result in distributed source coding discovered by David Slepian and Jack Wolf in 1973. It characterizes the rates at which two correlated sources can be losslessly compressed by separate encoders.[1]

Problem setup

Distributed coding is the coding of two (in this case) or more dependent sources with separate encoders and a joint decoder. Given two statistically dependent, independent and identically distributed, finite-alphabet random sequences X^n and Y^n, the Slepian–Wolf theorem gives a theoretical bound on the lossless coding rates for distributed coding of the two sources.
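Schematically, each source is compressed without seeing the other, and only the decoder combines the two messages. The following Python type sketch (hypothetical names, added here for illustration and not part of the original article) makes that separation explicit:

```python
from typing import Callable, Sequence, Tuple

# All names below are hypothetical, for illustration only.
Bits = Sequence[int]

# Each encoder sees only its own source block and maps it to a shorter
# message; the joint decoder sees both messages and must reproduce
# (x, y) with vanishing error probability as the block length grows.
EncoderX = Callable[[Bits], Bits]        # rate R_X = len(message) / len(x)
EncoderY = Callable[[Bits], Bits]        # rate R_Y = len(message) / len(y)
JointDecoder = Callable[[Bits, Bits], Tuple[Bits, Bits]]
```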

Theorem

The bounds on the lossless coding rates are as follows:[1]

R_X ≥ H(X|Y)
R_Y ≥ H(Y|X)
R_X + R_Y ≥ H(X,Y)

If both the encoders and the decoders of the two sources operate independently, the lowest rates achievable for lossless compression are H(X) and H(Y) for X and Y respectively, where H(X) and H(Y) are the entropies of X and Y. However, with joint decoding, if a vanishing error probability for long sequences is accepted, the Slepian–Wolf theorem shows that a much better compression rate can be achieved. As long as the total rate of X and Y is larger than their joint entropy H(X,Y) and neither source is encoded with a rate smaller than its conditional entropy given the other (that is, R_X ≥ H(X|Y) and R_Y ≥ H(Y|X)), distributed coding can achieve an arbitrarily small error probability for long sequences.[1]
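As a concrete illustration (a hypothetical example added here, not from the original article), let X be a fair bit and let Y be X flipped with probability 0.1. A short Python sketch computes the entropies that define the rate region:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array p."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical joint distribution: X is a fair bit, and Y is X passed
# through a binary symmetric channel with crossover probability 0.1.
eps = 0.1
joint = np.array([[0.5 * (1 - eps), 0.5 * eps],
                  [0.5 * eps,       0.5 * (1 - eps)]])

H_xy = entropy(joint)                # H(X,Y) ~= 1.469 bits
H_x  = entropy(joint.sum(axis=1))    # H(X)   =  1.0 bit
H_y  = entropy(joint.sum(axis=0))    # H(Y)   =  1.0 bit
H_x_given_y = H_xy - H_y             # H(X|Y) ~= 0.469 bits
H_y_given_x = H_xy - H_x             # H(Y|X) ~= 0.469 bits

# Separate coding needs H(X) + H(Y) = 2 bits per symbol pair; with joint
# decoding, any pair (R_X, R_Y) satisfying the three bounds above is
# achievable, down to a total rate of H(X,Y) ~= 1.469 bits.
print(H_x, H_y, H_xy, H_x_given_y, H_y_given_x)
```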

A special case of distributed coding is compression with decoder side information, where source Y is available at the decoder but not accessible at the encoder. This can be treated as the condition that a rate R_Y = H(Y) has already been used to encode Y, while we intend to use H(X|Y) bits to encode X. In other words, two isolated sources can compress data as efficiently as if they were communicating with each other. The system as a whole operates asymmetrically (the compression rates for the two sources are asymmetric).[1]
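A standard constructive route to this asymmetric corner point is syndrome coding with a linear code. The sketch below is illustrative only, not the construction from the original papers: it uses the (7,4) Hamming code and assumes X and Y disagree in at most one position per 7-bit block, so the 3-bit syndrome of X suffices for the decoder to recover X from Y:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column i (1-indexed)
# is the binary representation of i, so the syndrome of a single-bit
# error directly encodes the error's position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def sw_encode(x):
    """Compress a 7-bit block x to its 3-bit syndrome H x (mod 2)."""
    return H @ x % 2

def sw_decode(s, y):
    """Recover x from its syndrome s and side information y, assuming
    x and y differ in at most one position."""
    diff = (s + H @ y) % 2                 # syndrome of the pattern x XOR y
    x_hat = np.array(y)
    pos = int(''.join(map(str, diff)), 2)  # 1-based position of the flip
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 7)
e = np.zeros(7, dtype=int)
e[rng.integers(7)] = 1                     # at most one bit differs
y = (x + e) % 2
assert np.array_equal(sw_decode(sw_encode(x), y), x)
```

The encoder here transmits 3 bits per 7-bit block instead of 7; practical Slepian–Wolf schemes follow the same pattern with much stronger codes, such as LDPC or turbo codes.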

This bound has been extended to the case of more than two correlated sources by Thomas M. Cover in 1975,[2] and similar results were obtained in 1976 by Aaron D. Wyner and Jacob Ziv with regard to lossy coding of jointly Gaussian sources.[3]

See also

  • Data compression
  • Data synchronization
  • Synchronization (computer science)
  • DISCUS
  • Timeline of information theory

References

  1. Slepian & Wolf 1973, pp. 471–480.
  2. Cover 1975, pp. 226–228.
  3. Wyner & Ziv 1976, pp. 1–10.

Sources

  • Cover, Thomas M. (March 1975). "A proof of the data compression theorem of Slepian and Wolf for ergodic sources". IEEE Transactions on Information Theory. 21 (2): 226–228. doi:10.1109/TIT.1975.1055356. ISSN 0018-9448.
  • Slepian, David S.; Wolf, Jack K. (July 1973). "Noiseless coding of correlated information sources". IEEE Transactions on Information Theory. 19 (4): 471–480. doi:10.1109/TIT.1973.1055037. ISSN 0018-9448.
  • Wyner, Aaron D.; Ziv, Jacob (January 1976). "The rate-distortion function for source coding with side information at the decoder". IEEE Transactions on Information Theory. 22 (1): 1–10. CiteSeerX 10.1.1.137.494. doi:10.1109/TIT.1976.1055508. ISSN 0018-9448.

External links

  • Wyner–Ziv Coding of Video: an algorithm for video compression that performs close to the Slepian–Wolf bound (with links to source code).

