Channel Capacity
- (X, p(y|x), Y): sender transmits x; the receiver gets y with probability p(y|x)
- Channel capacity: C = max_{p(x)} I(X;Y)
- Strategy to calculate C
- I(X;Y)=H(Y)−H(Y∣X)
- convex sets: d_min = min_{a∈A, b∈B} d(a, b)
- Examples:
- Noiseless Binary Channel: C = 1, achieved by p(x) = (1/2, 1/2)
- Noisy Channel with nonoverlapping Outputs: in fact not noisy
- Noisy Typewriter: C = log 13
- Binary Symmetric Channel: C=1−H(p)
- Binary Erasure Channel: C = max_{p(x)} H(Y) − H(α) = 1 − α, where α is the erasure probability
- symmetric channel: the rows of the channel transition matrix are permutations of each other, and so are the columns; C = log|Y| − H(r), where r is any row of the matrix
- weakly symmetric: the rows are permutations of each other and all column sums are equal; the same formula C = log|Y| − H(r) holds, achieved by the uniform input distribution
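The closed-form capacities above are easy to check numerically; a minimal sketch (function and variable names are my own) that maximizes I(X;Y) over a grid of input distributions for the BSC and the BEC:

```python
import numpy as np

def h2(p):
    """Binary entropy H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mutual_information(px, P):
    """I(X;Y) in bits for input distribution px and channel matrix P[x][y]."""
    py = px @ P  # induced output distribution
    return sum(px[x] * P[x, y] * np.log2(P[x, y] / py[y])
               for x in range(P.shape[0]) for y in range(P.shape[1])
               if px[x] > 0 and P[x, y] > 0)

qs = np.linspace(0.01, 0.99, 99)          # candidate inputs p(x) = (q, 1-q)

# BSC with crossover p = 0.1: closed form C = 1 - H(p)
p = 0.1
bsc = np.array([[1 - p, p], [p, 1 - p]])
C_bsc = max(mutual_information(np.array([q, 1 - q]), bsc) for q in qs)

# BEC with erasure probability a = 0.2 (outputs 0, e, 1): closed form C = 1 - a
a = 0.2
bec = np.array([[1 - a, a, 0.0], [0.0, a, 1 - a]])
C_bec = max(mutual_information(np.array([q, 1 - q]), bec) for q in qs)
```

Both maxima occur at the uniform input q = 1/2, matching 1 − H(0.1) ≈ 0.531 and 1 − 0.2 = 0.8.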
Channel Coding Theorem
- W → Encoder → X^n → Channel p(y|x) → Y^n → Decoder → Ŵ
- goal: maximize the rate H(W)/n
- Message: messages can be ordered and referred to by index; an index in {1, 2, ⋯, M} takes log M symbols
- discrete memoryless channel (DMC): p(y_k | x^k, y^{k−1}) = p(y_k | x_k)
- without feedback: p(x_k | x^{k−1}, y^{k−1}) = p(x_k | x^{k−1})
- memoryless + no feedback: H(Y^n | X^n) = Σ_{i=1}^n H(Y_i | X_i)
- Markov chain: W → X^n → Y^n → Ŵ
- DMC: C = max_{p(x)} I(X;Y)
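For a general DMC with no symmetry to exploit, the maximization C = max_{p(x)} I(X;Y) is typically carried out with the Blahut-Arimoto algorithm; a minimal sketch (function and variable names are my own):

```python
import numpy as np

def blahut_arimoto(P, iters=200):
    """Capacity C = max_{p(x)} I(X;Y) in bits for channel matrix P[x][y],
    via Blahut-Arimoto fixed-point iterations."""
    m = P.shape[0]
    p = np.full(m, 1.0 / m)                # start from the uniform input
    for _ in range(iters):
        q = p @ P                          # induced output distribution p(y)
        # per-input KL divergence D(P(.|x) || q), skipping zero entries
        kl = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0) / q), 0.0).sum(axis=1)
        p = p * np.exp(kl)                 # multiplicative update
        p /= p.sum()                       # re-normalize
    q = p @ P
    bits = np.where(P > 0, P * np.log2(np.where(P > 0, P, 1.0) / q), 0.0)
    return (p[:, None] * bits).sum(), p

# Sanity check on BSC(0.1): expect C = 1 - H(0.1) and a uniform optimal input
P = np.array([[0.9, 0.1], [0.1, 0.9]])
C, p_star = blahut_arimoto(P)
```

Each iteration reweights inputs by how informative their output distributions are relative to the current average output, and the iterates converge to the capacity-achieving p(x).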
DMC
- codebook (f(w)=xn) shared between sender and receiver
- (M,n) code
- index set {1,2,⋯,M}
- encoding function f: {1, 2, ⋯, M} → X^n, yielding codewords x^n(1), ⋯, x^n(M)
- decoding function g: Y^n → {1, 2, ⋯, M}
- Conditional probability of error: λ_i = Σ_{y^n} p(y^n | x^n(i)) 1[g(y^n) ≠ i]
- maximal probability of error: λ^{(n)} = max_i λ_i
- average: P_e^{(n)} = (1/M) Σ_i λ_i
- rate R of an (M, n) code: R = (log M)/n bits per transmission
- achievable R: there exists a sequence of (2^{nR}, n) codes such that the maximal probability of error λ^{(n)} → 0 as n → ∞
- capacity: supremum of all achievable rates
- Channel coding theorem: C = max_{p(x)} I(X;Y)
- Achievability: for any R < C, there exists a sequence of (2^{nR}, n) codes with λ^{(n)} → 0
- Converse: for any R > C, the probability of error is bounded away from zero
Intuition for Channel Capacity: 2^{nI(X;Y)} = 2^{n(H(Y)−H(Y|X))} — of the ≈ 2^{nH(Y)} typical output sequences, each input's noise spreads over ≈ 2^{nH(Y|X)} of them (like a flashlight beam), so at most 2^{n(H(Y)−H(Y|X))} inputs can be reliably distinguished
Proof in two parts: converse (any achievable R satisfies R ≤ C) and achievability (every R < C is achievable, via random coding and joint-typicality decoding)
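The random-coding idea behind achievability can be illustrated by simulation, using minimum-Hamming-distance decoding as a simple stand-in for joint-typicality decoding (function names and parameters below are my own choices): with R < C block errors are rare, with R > C they dominate.

```python
import numpy as np

rng = np.random.default_rng(0)

def block_error_rate(n, R, p, trials=300):
    """Random-coding experiment on a BSC(p): draw a fresh random codebook of
    M = 2^{nR} binary codewords per trial, transmit codeword 0, decode by
    minimum Hamming distance, and estimate the block error rate."""
    M = int(2 ** (n * R))
    errors = 0
    for _ in range(trials):
        book = rng.integers(0, 2, size=(M, n))   # random codebook
        noise = rng.random(n) < p                # BSC bit flips
        y = book[0] ^ noise                      # received word
        dist = (book ^ y).sum(axis=1)            # Hamming distances
        if dist.argmin() != 0:
            errors += 1
    return errors / trials

# BSC(0.05): C = 1 - H(0.05) ≈ 0.71; rate R = 0.25 < C
err_below = block_error_rate(n=20, R=0.25, p=0.05)
# BSC(0.3): C = 1 - H(0.3) ≈ 0.12; rate R = 0.5 > C
err_above = block_error_rate(n=20, R=0.5, p=0.3)
```

Even at a small block length like n = 20, the gap between the two regimes is already dramatic; the theorem says the below-capacity error rate can be driven to zero as n grows.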
Feedback Capacity
- (2^{nR}, n) feedback code: sequence of mappings x_i(W, Y^{i−1})
- C_FB = C = max_{p(x)} I(X;Y) (feedback cannot increase the capacity of a DMC)
Source-Channel Separation
- data compression: R>H
- data transmission: R<C
- Source-channel coding theorem
- if V_1, V_2, ⋯, V_n is a finite-alphabet stochastic process that satisfies the AEP and has entropy rate H(V) < C, there exists a source-channel code with probability of error → 0
- if H(V) > C, the probability of error is bounded away from zero
Error Correction Code
- Repetition code
- Parity check code
- Hamming code (n,k,d)
- BCH code
- Convolutional code
- LDPC code
- Polar code
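As a concrete instance of the codes listed above, a minimal sketch of the (7, 4, 3) Hamming code, which corrects any single bit error; the particular systematic generator and parity-check matrices below are one common choice, and the function names are my own:

```python
import numpy as np

# Hamming (7,4): n = 7, k = 4, minimum distance d = 3 -> corrects 1 bit error.
# Codeword layout: [d1 d2 d3 d4 p1 p2 p3] (systematic form).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])    # generator matrix
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])    # parity-check matrix, H G^T = 0 (mod 2)

def encode(msg):
    """Map a length-4 message to a length-7 codeword."""
    return (msg @ G) % 2

def decode(received):
    """Correct up to one bit error, then return the 4 message bits."""
    syndrome = (H @ received) % 2
    if syndrome.any():
        # a single error at position i makes the syndrome equal column i of H
        err = next(i for i in range(7) if (H[:, i] == syndrome).all())
        received = received.copy()
        received[err] ^= 1
    return received[:4]                  # systematic: message is the first 4 bits
```

Because all 7 columns of H are distinct and nonzero, every single-bit error produces a unique syndrome, which is exactly why d = 3 suffices to correct one error.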