Channel capacity

Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.

By the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.[1][2]

Information theory, developed by Claude E. Shannon in 1948, defines the notion of channel capacity and provides a mathematical model by which it may be computed. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.[3]

The notion of channel capacity has been central to the development of modern wireline and wireless communication systems, where novel error-correction coding mechanisms have achieved performance very close to the limits it promises.

Formal definition

The basic mathematical model for a communication system is the following:

$$ \text{Message } W \longrightarrow \fbox{Encoder $f_n$} \xrightarrow{\text{Encoded sequence } X^n} \fbox{Channel $p(y|x)$} \xrightarrow{\text{Received sequence } Y^n} \fbox{Decoder $g_n$} \longrightarrow \text{Estimated message } \hat{W} $$

where:

  •   $W$ is the message to be transmitted;
  •   $X$ is the channel input symbol ($X^n$ is a sequence of $n$ symbols) taken in an alphabet $\mathcal{X}$;
  •   $Y$ is the channel output symbol ($Y^n$ is a sequence of $n$ symbols) taken in an alphabet $\mathcal{Y}$;
  •   $\hat{W}$ is the estimate of the transmitted message;
  •   $f_n$ is the encoding function for a block of length $n$;
  •   $p(y|x) = p_{Y|X}(y|x)$ is the noisy channel, which is modeled by a conditional probability distribution; and,
  •   $g_n$ is the decoding function for a block of length $n$.

Let $X$ and $Y$ be modeled as random variables. Furthermore, let $p_{Y|X}(y|x)$ be the conditional probability distribution function of $Y$ given $X$, which is an inherent fixed property of the communication channel. Then the choice of the marginal distribution $p_X(x)$ completely determines the joint distribution $p_{X,Y}(x,y)$ due to the identity

$$ p_{X,Y}(x,y) = p_{Y|X}(y|x)\, p_X(x), $$

which, in turn, induces a mutual information $I(X;Y)$. The channel capacity is defined as

$$ C = \sup_{p_X(x)} I(X;Y), $$

where the supremum is taken over all possible choices of $p_X(x)$.
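
The supremum over input distributions can be computed numerically. The following is a minimal sketch (assuming Python with NumPy; the function and variable names are illustrative, not from any particular library) of the Blahut–Arimoto iteration, a standard algorithm for this maximization, applied here to a binary symmetric channel whose capacity has the closed form $1 - H_b(p)$:

```python
import numpy as np

def blahut_arimoto(P, iters=2000, tol=1e-12):
    """Approximate C = sup_{p_X} I(X;Y) in bits for a channel matrix P[x, y] = p(y|x)."""
    n_x = P.shape[0]
    r = np.full(n_x, 1.0 / n_x)                 # current guess for the input distribution p_X
    for _ in range(iters):
        q = r[:, None] * P                      # r(x) p(y|x)
        q /= q.sum(axis=0, keepdims=True)       # posterior q(x|y)
        # update: r(x) proportional to exp( sum_y p(y|x) log q(x|y) )
        r_new = np.exp(np.sum(P * np.log(q + 1e-300), axis=1))
        r_new /= r_new.sum()
        if np.max(np.abs(r_new - r)) < tol:
            r = r_new
            break
        r = r_new
    joint = r[:, None] * P                      # joint distribution p(x, y)
    py = joint.sum(axis=0)                      # output marginal p_Y(y)
    return np.sum(joint * np.log2((P + 1e-300) / (py + 1e-300))), r

p = 0.11                                        # crossover probability of a binary symmetric channel
P = np.array([[1 - p, p], [p, 1 - p]])
C, r = blahut_arimoto(P)
closed_form = 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)     # 1 - H_b(p)
print(C, closed_form)                           # both ~ 0.5 bits per channel use
```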

Additivity of channel capacity

Channel capacity is additive over independent channels.[4] This means that using two independent channels in a combined manner provides the same theoretical capacity as using them independently. More formally, let $p_1$ and $p_2$ be two independent channels modelled as above, with $p_1$ having an input alphabet $\mathcal{X}_1$ and an output alphabet $\mathcal{Y}_1$, and likewise for $p_2$. We define the product channel $p_1 \times p_2$ by

$$ \forall (x_1, x_2) \in (\mathcal{X}_1, \mathcal{X}_2),\ (y_1, y_2) \in (\mathcal{Y}_1, \mathcal{Y}_2), \qquad (p_1 \times p_2)\big((y_1, y_2) \mid (x_1, x_2)\big) = p_1(y_1 \mid x_1)\, p_2(y_2 \mid x_2). $$

The additivity theorem states:

$$ C(p_1 \times p_2) = C(p_1) + C(p_2). $$
Proof

We first show that $C(p_1 \times p_2) \geq C(p_1) + C(p_2)$.

Let $X_1$ and $X_2$ be two independent random variables. Let $Y_1$ be a random variable corresponding to the output of $X_1$ through the channel $p_1$, and $Y_2$ for $X_2$ through $p_2$.

By definition, $C(p_1 \times p_2) = \sup_{p_{X_1, X_2}} I(X_1, X_2 ; Y_1, Y_2)$.

Since $X_1$ and $X_2$ are independent, as well as $p_1$ and $p_2$, $(X_1, Y_1)$ is independent of $(X_2, Y_2)$. We can apply the following property of mutual information:

$$ I(X_1, X_2 ; Y_1, Y_2) = I(X_1; Y_1) + I(X_2; Y_2). $$

To establish the inequality, we only need to find a distribution $p_{X_1, X_2}$ such that $I(X_1, X_2 ; Y_1, Y_2) \geq I(X_1; Y_1) + I(X_2; Y_2)$. In fact, $\pi_1$ and $\pi_2$, two probability distributions for $X_1$ and $X_2$ achieving $C(p_1)$ and $C(p_2)$, suffice:

$$ C(p_1 \times p_2) \geq I(X_1, X_2 ; Y_1, Y_2) = I(X_1; Y_1) + I(X_2; Y_2) = C(p_1) + C(p_2), $$

i.e., $C(p_1 \times p_2) \geq C(p_1) + C(p_2)$.


Now let us show that $C(p_1 \times p_2) \leq C(p_1) + C(p_2)$.

Let $\pi_{12}$ be some distribution for the channel $p_1 \times p_2$ defining $(X_1, X_2)$ and the corresponding output $(Y_1, Y_2)$. Let $\mathcal{X}_1$ be the alphabet of $X_1$, $\mathcal{Y}_1$ for $Y_1$, and analogously $\mathcal{X}_2$ and $\mathcal{Y}_2$.

By definition of mutual information, we have

$$ \begin{aligned} I(X_1, X_2 ; Y_1, Y_2) &= H(Y_1, Y_2) - H(Y_1, Y_2 \mid X_1, X_2) \\ &\leq H(Y_1) + H(Y_2) - H(Y_1, Y_2 \mid X_1, X_2). \end{aligned} $$

Let us rewrite the last term, the conditional entropy:

$$ H(Y_1, Y_2 \mid X_1, X_2) = \sum_{(x_1, x_2) \in \mathcal{X}_1 \times \mathcal{X}_2} \mathbb{P}(X_1, X_2 = x_1, x_2)\, H(Y_1, Y_2 \mid X_1, X_2 = x_1, x_2). $$

By definition of the product channel, $\mathbb{P}(Y_1, Y_2 = y_1, y_2 \mid X_1, X_2 = x_1, x_2) = \mathbb{P}(Y_1 = y_1 \mid X_1 = x_1)\, \mathbb{P}(Y_2 = y_2 \mid X_2 = x_2)$. For a given pair $(x_1, x_2)$, we can rewrite $H(Y_1, Y_2 \mid X_1, X_2 = x_1, x_2)$ as:

$$ \begin{aligned} H(Y_1, Y_2 \mid X_1, X_2 = x_1, x_2) &= -\sum_{(y_1, y_2) \in \mathcal{Y}_1 \times \mathcal{Y}_2} \mathbb{P}(Y_1, Y_2 = y_1, y_2 \mid X_1, X_2 = x_1, x_2) \log \mathbb{P}(Y_1, Y_2 = y_1, y_2 \mid X_1, X_2 = x_1, x_2) \\ &= -\sum_{(y_1, y_2) \in \mathcal{Y}_1 \times \mathcal{Y}_2} \mathbb{P}(Y_1, Y_2 = y_1, y_2 \mid X_1, X_2 = x_1, x_2) \big[ \log \mathbb{P}(Y_1 = y_1 \mid X_1 = x_1) + \log \mathbb{P}(Y_2 = y_2 \mid X_2 = x_2) \big] \\ &= H(Y_1 \mid X_1 = x_1) + H(Y_2 \mid X_2 = x_2). \end{aligned} $$

By weighting this equality by $\mathbb{P}(X_1, X_2 = x_1, x_2)$ and summing over all $(x_1, x_2)$, we obtain $H(Y_1, Y_2 \mid X_1, X_2) = H(Y_1 \mid X_1) + H(Y_2 \mid X_2)$.

We can now give an upper bound on the mutual information:

$$ \begin{aligned} I(X_1, X_2 ; Y_1, Y_2) &\leq H(Y_1) + H(Y_2) - H(Y_1 \mid X_1) - H(Y_2 \mid X_2) \\ &= I(X_1; Y_1) + I(X_2; Y_2). \end{aligned} $$

This relation is preserved at the supremum. Therefore

$$ C(p_1 \times p_2) \leq C(p_1) + C(p_2). $$


Combining the two inequalities we proved, we obtain the result of the theorem:

$$ C(p_1 \times p_2) = C(p_1) + C(p_2). $$
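
As a numerical sanity check of the additivity theorem (a sketch only, assuming Python with NumPy; the channel matrices below are illustrative), one can build the product channel's transition matrix as a Kronecker product and verify that its capacity matches the sum of the individual capacities:

```python
import numpy as np

def capacity(P, iters=5000):
    """Blahut-Arimoto estimate (in bits) of the capacity of a channel matrix P[x, y] = p(y|x)."""
    r = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        q = r[:, None] * P
        q /= q.sum(axis=0, keepdims=True)
        r = np.exp(np.sum(P * np.log(q + 1e-300), axis=1))
        r /= r.sum()
    joint = r[:, None] * P
    py = joint.sum(axis=0)
    return np.sum(joint * np.log2((P + 1e-300) / (py + 1e-300)))

P1 = np.array([[0.9, 0.1], [0.1, 0.9]])             # binary symmetric channel, crossover 0.1
P2 = np.array([[0.8, 0.2, 0.0], [0.0, 0.2, 0.8]])   # binary erasure channel, erasure prob. 0.2

# Product channel: inputs (x1, x2), outputs (y1, y2), transition probabilities multiply,
# which is exactly the Kronecker product of the two transition matrices.
P12 = np.kron(P1, P2)

print(capacity(P1) + capacity(P2))   # ~ 0.531 + 0.8 = 1.331 bits
print(capacity(P12))                 # ~ 1.331 bits as well
```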

Shannon capacity of a graph

If G is an undirected graph, it can be used to define a communications channel in which the symbols are the graph vertices, and two codewords may be confused with each other if their symbols in each position are equal or adjacent. The computational complexity of finding the Shannon capacity of such a channel remains open, but it can be upper bounded by another important graph invariant, the Lovász number.[5]
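
To see why block coding over such a channel can outperform single-symbol signalling, consider the 5-cycle: only 2 of its symbols are pairwise non-confusable, yet 5 mutually non-confusable codewords of length 2 exist. The sketch below (assuming Python with NetworkX; the helper name is illustrative) checks this by computing independence numbers of the confusability graph and of its strong product with itself:

```python
import networkx as nx

# Confusability graph of a 5-symbol channel where symbol i can be confused
# with symbols i-1 and i+1 (mod 5): the 5-cycle C5.
G = nx.cycle_graph(5)

def independence_number(H):
    """Largest set of pairwise non-adjacent vertices = largest clique of the complement."""
    return max(len(c) for c in nx.find_cliques(nx.complement(H)))

# Single symbols: at most 2 are pairwise non-confusable.
print(independence_number(G))                  # 2

# Length-2 codewords: confusability corresponds to adjacency in the strong product C5 x C5.
G2 = nx.strong_product(G, G)
print(independence_number(G2))                 # 5, which exceeds 2**2 = 4

# So block length 2 yields 5 distinguishable codewords, i.e. sqrt(5) ~ 2.236 effective
# symbols per use; Lovasz's bound shows sqrt(5) is in fact the Shannon capacity of C5.
```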

Noisy-channel coding theorem

The noisy-channel coding theorem states that for any error probability ε > 0 and for any transmission rate R less than the channel capacity C, there is an encoding and decoding scheme transmitting data at rate R whose error probability is less than ε, for a sufficiently large block length. Conversely, for any rate greater than the channel capacity, arbitrarily small error probability is unattainable: the probability of error at the receiver is bounded away from zero for all sufficiently large block lengths.
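
For instance (a worked example with assumed numbers, not taken from the cited sources): a binary symmetric channel with crossover probability $p = 0.11$ has capacity $C = 1 - H_b(0.11) \approx 0.5$ bits per channel use, where $H_b$ is the binary entropy function. Any rate below this, say $R = 0.4$, can be achieved with arbitrarily small error probability by using sufficiently long codes, whereas no coding scheme can deliver $R = 0.6$ reliably over this channel.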

Example application

An application of the channel capacity concept to an additive white Gaussian noise (AWGN) channel with B Hz bandwidth and signal-to-noise ratio S/N is the Shannon–Hartley theorem:

$$ C = B \log_2 \left( 1 + \frac{S}{N} \right). $$

C is measured in bits per second if the logarithm is taken in base 2, or nats per second if the natural logarithm is used, assuming B is in hertz; the signal and noise powers S and N are expressed in a linear power unit (like watts or volts²). Since S/N figures are often cited in dB, a conversion may be needed. For example, a signal-to-noise ratio of 30 dB corresponds to a linear power ratio of $10^{30/10} = 10^3 = 1000$.
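
As a concrete illustration (a sketch with assumed example figures, not taken from the references), consider a 3 kHz voice-grade channel with a 30 dB signal-to-noise ratio:

```python
import math

def shannon_hartley(bandwidth_hz, snr_db):
    """Shannon-Hartley capacity in bits/s from bandwidth [Hz] and SNR in dB."""
    snr_linear = 10 ** (snr_db / 10)      # dB -> linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed figures: 3 kHz bandwidth, 30 dB SNR (linear ratio 1000).
print(shannon_hartley(3000, 30))          # ~ 29,902 bits/s
```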

Channel capacity in wireless communications

This section[6] focuses on the single-antenna, point-to-point scenario. For channel capacity in systems with multiple antennas, see the article on MIMO.

Bandlimited AWGN channel

 
[Figure] AWGN channel capacity with the power-limited regime and bandwidth-limited regime indicated. Here, $\bar{P}/N_0 = 1$; $B$ and $C$ can be scaled proportionally for other values.

If the average received power is $\bar{P}$ [W], the total bandwidth is $W$ in hertz, and the noise power spectral density is $N_0$ [W/Hz], the AWGN channel capacity is

$$ C_{\text{AWGN}} = W \log_2 \left( 1 + \frac{\bar{P}}{N_0 W} \right) \ \text{[bits/s]}, $$

where $\bar{P}/(N_0 W)$ is the received signal-to-noise ratio (SNR). This result is known as the Shannon–Hartley theorem.[7]

When the SNR is large (SNR ≫ 0 dB), the capacity $C \approx W \log_2 \frac{\bar{P}}{N_0 W}$ is logarithmic in power and approximately linear in bandwidth. This is called the bandwidth-limited regime.

When the SNR is small (SNR ≪ 0 dB), the capacity $C \approx \frac{\bar{P}}{N_0 \ln 2}$ is linear in power but insensitive to bandwidth. This is called the power-limited regime.

The bandwidth-limited regime and power-limited regime are illustrated in the figure.
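
The two regimes can also be checked numerically. The sketch below (assuming Python; all parameter values are illustrative) compares the exact capacity with the two approximations given above:

```python
import math

def awgn_capacity(P, N0, W):
    """Exact AWGN capacity in bits/s: received power P [W], noise PSD N0 [W/Hz], bandwidth W [Hz]."""
    return W * math.log2(1 + P / (N0 * W))

N0 = 1e-9                                  # assumed noise power spectral density [W/Hz]

# Bandwidth-limited regime (SNR = 30 dB): capacity ~ W log2(P / (N0 W)).
P, W = 1e-3, 1e3
print(awgn_capacity(P, N0, W), W * math.log2(P / (N0 * W)))   # ~ 9967 vs ~ 9966 bits/s

# Power-limited regime (SNR = -30 dB): capacity ~ P / (N0 ln 2), nearly independent of W.
P, W = 1e-6, 1e6
print(awgn_capacity(P, N0, W), P / (N0 * math.log(2)))        # ~ 1442 vs ~ 1443 bits/s
```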

Frequency-selective AWGN channel

The capacity of the frequency-selective channel is given by so-called water-filling power allocation,

$$ C = \sum_{n=0}^{N_c - 1} \log_2 \left( 1 + \frac{P_n |\bar{h}_n|^2}{N_0} \right), $$

where $P_n = \max \left\{ \frac{1}{\lambda} - \frac{N_0}{|\bar{h}_n|^2},\, 0 \right\}$ and $|\bar{h}_n|^2$ is the gain of subchannel $n$, with $\lambda$ chosen to meet the power constraint.
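
A minimal water-filling sketch (assuming Python with NumPy, illustrative subchannel gains, and a bisection search for the water level; this is one common way to implement the allocation, not necessarily the procedure in the cited text):

```python
import numpy as np

N0 = 1.0                                          # assumed noise level per subchannel

def water_filling(gains, total_power, tol=1e-12):
    """Allocate P_n = max(1/lambda - N0/|h_n|^2, 0) so that sum(P_n) equals total_power."""
    g = np.asarray(gains, dtype=float)
    lo, hi = 0.0, total_power + np.max(N0 / g)    # bracket for the water level 1/lambda
    while hi - lo > tol:
        mu = (lo + hi) / 2
        if np.maximum(mu - N0 / g, 0.0).sum() > total_power:
            hi = mu                               # water level too high: too much power poured in
        else:
            lo = mu
    return np.maximum((lo + hi) / 2 - N0 / g, 0.0)

gains = np.array([2.0, 1.0, 0.5, 0.1])            # assumed subchannel gains |h_n|^2
P = water_filling(gains, total_power=4.0)
C = np.sum(np.log2(1 + P * gains / N0))           # capacity in bits per multicarrier symbol
print(P, P.sum(), C)                              # strong subchannels get more power; weak ones may get none
```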

Slow-fading channel

In a slow-fading channel, where the coherence time is greater than the latency requirement, there is no definite capacity, as the maximum rate of reliable communication supported by the channel, $\log_2(1 + |h|^2\, \mathrm{SNR})$, depends on the random channel gain $|h|^2$, which is unknown to the transmitter. If the transmitter encodes data at rate $R$ [bits/s/Hz], there is a non-zero probability that the decoding error probability cannot be made arbitrarily small,

$$ p_{\text{out}} = \mathbb{P}\big( \log_2(1 + |h|^2\, \mathrm{SNR}) < R \big), $$

in which case the system is said to be in outage. With a non-zero probability that the channel is in deep fade, the capacity of the slow-fading channel in the strict sense is zero. However, it is possible to determine the largest value of $R$ such that the outage probability $p_{\text{out}}$ is less than $\epsilon$. This value is known as the $\epsilon$-outage capacity.
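
A rough Monte Carlo sketch of the $\epsilon$-outage capacity (assuming Python with NumPy, Rayleigh fading so that $|h|^2$ is exponentially distributed, and illustrative SNR and $\epsilon$ values):

```python
import numpy as np

rng = np.random.default_rng(0)

snr = 100.0                                  # assumed average SNR (20 dB)
eps = 0.01                                   # target outage probability
h2 = rng.exponential(1.0, size=1_000_000)    # |h|^2 samples under Rayleigh fading

rates = np.log2(1 + h2 * snr)                # rate supported by each channel realization

# The eps-outage capacity is (approximately) the eps-quantile of the supported rates:
# the largest R with P(log2(1 + |h|^2 SNR) < R) <= eps.
R_eps = np.quantile(rates, eps)
print(R_eps)                                 # ~ 1 bit/s/Hz, far below log2(1 + snr) ~ 6.66
print(np.mean(rates < R_eps))                # ~ 0.01 by construction
```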

Fast-fading channel

In a fast-fading channel, where the latency requirement is greater than the coherence time and the codeword length spans many coherence periods, one can average over many independent channel fades by coding over a large number of coherence time intervals. Thus, it is possible to achieve a reliable rate of communication of $\mathbb{E}\big[\log_2(1 + |h|^2\, \mathrm{SNR})\big]$ [bits/s/Hz], and it is meaningful to speak of this value as the capacity of the fast-fading channel.
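
Continuing the same Rayleigh-fading sketch (assuming Python with NumPy, $|h|^2$ exponentially distributed, and an illustrative SNR), the fast-fading capacity is simply the average of the per-realization rates:

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 100.0                                          # assumed average SNR (20 dB)
h2 = rng.exponential(1.0, size=1_000_000)            # |h|^2 samples under Rayleigh fading

ergodic_capacity = np.mean(np.log2(1 + h2 * snr))    # E[log2(1 + |h|^2 SNR)]
print(ergodic_capacity)                              # below log2(1 + snr) ~ 6.66, by Jensen's inequality
```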

See also

  •   Bandwidth (computing)
  •   Bandwidth (signal processing)
  •   Bit rate
  •   Code rate
  •   Error exponent
  •   Nyquist rate
  •   Negentropy
  •   Redundancy
  •   Sender
  •   Data compression
  •   Receiver
  •   Shannon–Hartley theorem
  •   Spectral efficiency
  •   Throughput

Advanced Communication Topics

  •   MIMO
  •   Cooperative diversity

External links

  • "Transmission rate of a channel", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
  • AWGN Channel Capacity with various constraints on the channel input (interactive demonstration)

References

  1. ^ Saleem Bhatti. "Channel capacity". Lecture notes for M.Sc. Data Communication Networks and Distributed Systems D51 – Basic Communications and Networks. Archived from the original on 2007-08-21.
  2. ^ Jim Lesurf. "Signals look like noise!". Information and Measurement, 2nd ed.
  3. ^ Thomas M. Cover, Joy A. Thomas (2006). Elements of Information Theory. John Wiley & Sons, New York. ISBN 9781118585771.
  4. ^ Cover, Thomas M.; Thomas, Joy A. (2006). "Chapter 7: Channel Capacity". Elements of Information Theory (Second ed.). Wiley-Interscience. pp. 206–207. ISBN 978-0-471-24195-9.
  5. ^ Lovász, László (1979), "On the Shannon Capacity of a Graph", IEEE Transactions on Information Theory, IT-25 (1): 1–7, doi:10.1109/tit.1979.1055985.
  6. ^ David Tse, Pramod Viswanath (2005), Fundamentals of Wireless Communication, Cambridge University Press, UK, ISBN 9780521845274
  7. ^ The Handbook of Electrical Engineering. Research & Education Association. 1996. p. D-149. ISBN 9780878919819.
