Information Theory
A preserved cluster of undergraduate notes grouped by subject area.
7 notes
1. Entropy
Preliminary: A Mathematical Theory of Communication, Shannon 1948. Convexity: $f(\sum_i p_i x_i)\leq \sum_i p_i f(x_i)$, i.e. $f(E_p[x_i])\leq E_p[f(x_i)]$; convex: $f''(x)\geq 0$; $f(x)= x\log x$...
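As a quick numerical companion to the Jensen inequality in this preview, here is a minimal sketch (assuming NumPy; the points and weights are arbitrary illustrations, not taken from the notes) checking $f(E_p[x])\leq E_p[f(x)]$ for the convex $f(x)=x\log x$:

```python
import numpy as np

# Convex f(x) = x log x, defined for x > 0.
def f(x):
    return x * np.log(x)

rng = np.random.default_rng(0)
x = rng.uniform(0.1, 5.0, size=10)   # arbitrary positive points (illustration)
p = rng.dirichlet(np.ones(10))       # an arbitrary probability vector

lhs = f(np.dot(p, x))                # f(E_p[x])
rhs = np.dot(p, f(x))                # E_p[f(x)]
assert lhs <= rhs + 1e-12            # Jensen: f(E[x]) <= E[f(x)] for convex f
print(f"f(E[x]) = {lhs:.4f} <= E[f(x)] = {rhs:.4f}")
```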
2. AEP
Convergence of random variables. In probability (convergence in probability): $P(|X_n - X|\leq\epsilon)\rightarrow 1$. In mean square: $E(X_n - X)^2\rightarrow 0$. With probability 1 (almost surely)...
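The AEP itself is easy to see empirically: for an i.i.d. source, $-\frac{1}{n}\log p(X_1,\cdots,X_n)$ concentrates around $H(X)$ as $n$ grows. A minimal simulation sketch, assuming NumPy and an arbitrary three-symbol source distribution chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])            # assumed source distribution (illustration)
H = -np.sum(p * np.log2(p))              # entropy in bits

for n in [10, 100, 10_000]:
    samples = rng.choice(len(p), size=n, p=p)
    # Per-symbol log-likelihood: -1/n log2 p(x_1, ..., x_n)
    per_symbol = -np.mean(np.log2(p[samples]))
    print(f"n={n:6d}: -1/n log p = {per_symbol:.4f}, H = {H:.4f}")
```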
3. Entropy Rate
Stationary Process. Stationary: $P(X_1=x_1,X_2=x_2,\cdots,X_n=x_n)=P(X_{1+l}=x_1,X_{2+l}=x_2,\cdots,X_{n+l}=x_n)$. Gaussian process. Stationary Markov Chain. Stationary Distribution of...
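For a stationary Markov chain the entropy rate has the closed form $H = -\sum_i \mu_i \sum_j P_{ij}\log P_{ij}$, with $\mu$ the stationary distribution. A minimal sketch, assuming NumPy; the transition matrix is an arbitrary two-state example, not from the notes:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])              # assumed transition matrix (illustration)

# Stationary distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
mu = np.real(v[:, np.isclose(w, 1.0)][:, 0])
mu /= mu.sum()

# Entropy rate: -sum_i mu_i sum_j P_ij log2 P_ij
H_rate = -np.sum(mu[:, None] * P * np.log2(P))
print(f"mu = {mu}, entropy rate = {H_rate:.4f} bits/symbol")
```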
4. Data Compression
Existence of Code. A source code $C$ for a random variable $X$ is a mapping from $\mathcal{X}$ to $\mathcal{D}^*$. $\mathcal{X}$: the range of $X$. $D$-ary alphabet: $\mathcal{D}=\{0,1,\cdots,D-1\}$. $C...
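A central existence result here is the Kraft inequality: the codeword lengths $l_i$ of any uniquely decodable $D$-ary code satisfy $\sum_i D^{-l_i}\leq 1$, and conversely such lengths admit a prefix code. A minimal check in plain Python (the example lengths are illustrations, not from the notes):

```python
# Kraft sum for a proposed set of D-ary codeword lengths.
def kraft_sum(lengths, D=2):
    return sum(D ** (-l) for l in lengths)

print(kraft_sum([1, 2, 3, 3]))   # 1.0  -> e.g. codewords 0, 10, 110, 111
print(kraft_sum([1, 1, 2]))      # 1.25 > 1 -> no uniquely decodable code
```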
5. Channel Capacity
Channel Capacity. $(\mathcal{X},p(y|x),\mathcal{Y})$: send $x$; with probability $p(y|x)$ the receiver gets $y$. Channel capacity: $C=\max_{p(x)} I(X;Y)$. Strategy to calculate $C$: $I(X;Y)...
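The maximization over $p(x)$ can be made concrete on the binary symmetric channel, where $C = 1 - H(\epsilon)$ is attained by the uniform input. A minimal grid-search sketch, assuming NumPy and a crossover probability chosen for illustration:

```python
import numpy as np

# Binary entropy function H(p) in bits.
def H2(p):
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# I(X;Y) for a BSC: H(Y) - H(Y|X), with px = P(X=1).
def mutual_information(px, eps):
    py1 = px * (1 - eps) + (1 - px) * eps   # P(Y=1)
    return H2(py1) - H2(eps)

eps = 0.1                                    # assumed crossover probability
grid = np.linspace(0.01, 0.99, 99)
C_numeric = max(mutual_information(p, eps) for p in grid)
print(f"max over grid: {C_numeric:.4f}, 1 - H(eps) = {1 - H2(eps):.4f}")
```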
6. Differential Entropy
Entropy (continuous). $X$ with cumulative distribution function $F(x)=\Pr(X\leq x)$. Support set of $X$: where $f(x)>0$. Differential entropy $h(X)$: $h(X)= -\int_S f(x)\log f(x)\,dx$. $h(X+c) =...
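The definition can be checked numerically against the standard closed form for a Gaussian, $h(X)=\frac{1}{2}\log(2\pi e\sigma^2)$. A minimal sketch, assuming NumPy; the variance and integration range are arbitrary illustrations:

```python
import numpy as np

sigma = 2.0                                   # assumed standard deviation
x = np.linspace(-40, 40, 200_001)             # wide grid covering the support
f = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# h(X) = -int_S f log f dx, approximated by a Riemann sum (nats).
h_numeric = -np.sum(f * np.log(f)) * (x[1] - x[0])
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(f"numeric h = {h_numeric:.4f}, closed form = {h_closed:.4f}")
```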
7. Gaussian Channel
Gaussian Channel. Continuous alphabet channel: input $X_i$, noise $Z_i$, output $Y_i$. Gaussian channel: $Y_i=X_i+Z_i$, $Z_i\sim\mathcal{N}(0,N)$. Energy constraint: $\frac{1}{n}\sum_{i=1}...
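Under the power constraint $\frac{1}{n}\sum_i x_i^2 \leq P$ the capacity is $C=\frac{1}{2}\log(1+P/N)$, attained by $X\sim\mathcal{N}(0,P)$, so that $I(X;Y)=h(Y)-h(Z)$ with $Y\sim\mathcal{N}(0,P+N)$. A minimal sketch of that identity, assuming NumPy; $P$ and $N$ are arbitrary illustrative values:

```python
import numpy as np

P, N = 10.0, 2.0                      # assumed signal power and noise variance

# Differential entropy of N(0, var) in bits.
def h_gauss(var):
    return 0.5 * np.log2(2 * np.pi * np.e * var)

C = 0.5 * np.log2(1 + P / N)
print(f"h(Y) - h(Z) = {h_gauss(P + N) - h_gauss(N):.4f} bits")
print(f"C = 1/2 log2(1 + P/N) = {C:.4f} bits per channel use")
```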