Course cluster

Information Theory

A preserved cluster of undergraduate notes grouped by subject area.

7 notes

01

1. Entropy

2020-12-22

Preliminary: A Mathematical Theory of Communication, Shannon 1948. Convexity: $f(\sum_i p_i x_i)\leq \sum_i p_i f(x_i)$, i.e. $f(E_p x)\leq E_p f(x)$. Convex: $f''(x)\geq 0$, e.g. $f(x)=x\log x$...
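The convexity (Jensen) inequality in this preview can be checked numerically; the distribution `p` and points `x` below are hypothetical, chosen only to illustrate:

```python
import math

def f(x):
    # f(x) = x log x is convex on (0, inf) since f''(x) = 1/x >= 0
    return x * math.log(x)

# hypothetical distribution and points illustrating Jensen's inequality
p = [0.2, 0.3, 0.5]
x = [1.0, 2.0, 4.0]

lhs = f(sum(pi * xi for pi, xi in zip(p, x)))  # f(E_p[x])
rhs = sum(pi * f(xi) for pi, xi in zip(p, x))  # E_p[f(x)]
print(lhs, rhs)  # Jensen: convex f gives f(E[x]) <= E[f(x)]
```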

02

2. AEP

2020-12-22

Convergence of random variables. In probability (convergence in probability): $P(|X_n-X|\leq\epsilon)\rightarrow 1$. In mean square: $E(X_n-X)^2\rightarrow 0$. With probability 1 (alm...

03

3. Entropy Rate

2020-12-22

Stationary Process. Stationary: $P(X_1=x_1,X_2=x_2,\cdots,X_n=x_n)=P(X_{1+l}=x_1,X_{2+l}=x_2,\cdots,X_{n+l}=x_n)$. Gaussian process. Stationary Markov Chain. Stationary Distribution of...
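The stationary distribution mentioned in this preview can be found by power iteration, $\mu \leftarrow \mu P$; the 2-state transition matrix below is a hypothetical example:

```python
def stationary(P, iters=500):
    # power iteration: repeatedly apply mu <- mu P; for an irreducible
    # aperiodic chain this converges to the stationary distribution
    n = len(P)
    mu = [1.0 / n] * n
    for _ in range(iters):
        mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]
    return mu

# hypothetical 2-state chain
P = [[0.9, 0.1],
     [0.5, 0.5]]
mu = stationary(P)
print(mu)  # stationary distribution, here (5/6, 1/6)
```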

04

4. Data Compression

2020-12-24

Existence of Code. A source code $C$ for a random variable $X$ is a mapping from $\mathcal{X}$ to $\mathcal{D}^*$. $\mathcal{X}$: the range of $X$. $D$-ary alphabet: $\mathcal{D}=\{0,1,\cdots,D-1\}$. $C...
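A standard existence criterion for codes over a $D$-ary alphabet is the Kraft inequality, $\sum_i D^{-l_i}\leq 1$; a minimal sketch (the example length lists are assumptions for illustration):

```python
def kraft_sum(lengths, D=2):
    # Kraft inequality: a D-ary prefix code with codeword lengths l_i
    # exists iff sum_i D^{-l_i} <= 1
    return sum(D ** (-l) for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> a binary prefix code exists
print(kraft_sum([1, 1, 2]))     # 1.25 -> no prefix code possible
```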

05

5. Channel Capacity

2020-12-24

Channel Capacity. $(\mathcal{X},p(y|x),\mathcal{Y})$: send $x$; with probability $p(y|x)$ the receiver gets $y$. Channel capacity: $C=\max_{p(x)}I(X;Y)$. Strategy to calculate $C$: $I(X;Y)...
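As a concrete instance of $C=\max_{p(x)}I(X;Y)$, the binary symmetric channel with crossover probability $p$ has the closed form $C=1-H(p)$, achieved by a uniform input; a minimal sketch:

```python
import math

def H2(p):
    # binary entropy H(p) in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # binary symmetric channel, crossover p:
    # C = max_{p(x)} I(X;Y) = 1 - H(p), achieved by uniform input
    return 1 - H2(p)

print(bsc_capacity(0.11))  # close to 0.5 bits per channel use
print(bsc_capacity(0.5))   # 0.0: the channel carries no information
```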

06

6. Differential Entropy

2020-12-24

Entropy (continuous). $X$ with cumulative distribution function $F(x)=\Pr(X\leq x)$. Support set of $X$: $\{x: f(x)>0\}$. Differential entropy $h(X)$: $h(X)=-\int_S f(x)\log f(x)\,dx$. $h(X+c) =...
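The definition $h(X)=-\int_S f(x)\log f(x)\,dx$ can be checked against the known closed form for a Gaussian, $h(X)=\frac{1}{2}\log(2\pi e\sigma^2)$ nats; the integration range and step count below are assumptions for a rough Riemann-sum sketch:

```python
import math

def h_gaussian(sigma2):
    # closed form for X ~ N(0, sigma^2): h(X) = 1/2 log(2 pi e sigma^2) nats
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

def h_numeric(sigma2, lo=-50.0, hi=50.0, n=100000):
    # midpoint Riemann sum of -f log f over a wide interval
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = math.exp(-x * x / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)
        if f > 0:  # skip underflow in the far tails
            total -= f * math.log(f) * dx
    return total

print(h_gaussian(1.0), h_numeric(1.0))  # both about 1.419 nats
```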

07

7. Gaussian Channel

2020-12-24

Gaussian Channel. Continuous-alphabet channel: input $X_i$, noise $Z_i$, output $Y_i$. Gaussian channel: $Y_i=X_i+Z_i$, $Z_i\sim\mathcal{N}(0,N)$. Energy constraint: $\frac{1}{n}\sum_{i=1}...
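For the Gaussian channel above, the capacity under a power constraint $P$ and noise variance $N$ has the standard closed form $C=\frac{1}{2}\log_2(1+P/N)$ bits per channel use; a minimal sketch:

```python
import math

def gaussian_capacity(P, N):
    # C = 1/2 log2(1 + P/N) bits per channel use, under average power
    # constraint P and noise variance N (achieved by X ~ N(0, P))
    return 0.5 * math.log2(1 + P / N)

print(gaussian_capacity(3.0, 1.0))   # 1.0 bit per channel use
print(gaussian_capacity(15.0, 1.0))  # 2.0 bits per channel use
```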