Department of Computer Science and Information Theory
Information Theory, Fall 2004
- András Antos
- Classes per week: 2
- Schedule, place:
  - Monday 12:15-13:45, R.507
  - Tuesday 10:15-11:45, V2.706
- Exams:
  - December 23, 2004, 8 a.m., V2.628a
  - January 19, 2005, 8 a.m., Ch.max.
- Literature:
  - T.M. Cover and J.A. Thomas, ``Elements of Information Theory'', John Wiley & Sons, Inc., New York, 1991.
  - S. Győri et al., ``Információ- és kódelmélet'' (Information and Coding Theory), 2nd ed., TypoTEX Kiadó, 2002 (textbook in Hungarian).
  - T. Linder et al., ``Bevezetés az információelméletbe'' (Introduction to Information Theory), Technical University of Budapest, 1990 (course notes in Hungarian).
  - D.J.C. MacKay (Cavendish Laboratory, Cambridge, Great Britain), ``A Short Course in Information Theory'', 1995.
- Entropy, Relative Entropy, and Mutual Information:
- entropy, joint entropy, conditional entropy, mutual information, conditional mutual information, relative entropy, conditional relative entropy, chain rules, Jensen's inequality, log-sum inequality, data processing inequality, Fano's inequality.
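The quantities listed above can be computed directly from their definitions; a minimal Python sketch (function names are illustrative, not from the course material):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint distribution (2-D list)."""
    px = [sum(row) for row in joint]               # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]         # marginal of Y (column sums)
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# A fair coin has one bit of entropy.
print(entropy([0.5, 0.5]))                                  # 1.0
# Independent variables carry zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))     # 0.0
```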
- Data Compression:
- source codes, Kraft inequality, McMillan's theorem, connection with entropy - lower bound for the expected code length L, Shannon codes - upper bound for L, Huffman codes, optimality, competitive optimality of the Shannon code.
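A small sketch of the Kraft inequality and Huffman's construction from the topics above, assuming a binary code alphabet (names are illustrative):

```python
import heapq

def kraft_sum(lengths, D=2):
    """Kraft inequality: a D-ary prefix code with these lengths exists iff sum D^-l_i <= 1."""
    return sum(D ** -l for l in lengths)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code (an optimal prefix code)."""
    if len(probs) == 1:
        return [1]
    # Each heap item: (probability, tie-breaker, leaf indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:              # merging deepens every leaf in both subtrees
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

lengths = huffman_lengths([0.5, 0.25, 0.125, 0.125])
print(lengths)              # [1, 2, 3, 3] -- matches the entropy of this dyadic source
print(kraft_sum(lengths))   # 1.0: Huffman codes satisfy the Kraft inequality with equality
```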
- Entropy Rates of a Stochastic Process:
- Markov chains, entropy rate.
- Asymptotic Equipartition Property:
- the AEP, consequences for data compression, high-probability sets and the typical set.
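The typical set can be enumerated explicitly for small block lengths; a brute-force illustration for an i.i.d. Bernoulli source (the parameters p, n, eps are chosen only for illustration):

```python
import math
import itertools

def is_typical(seq, p, eps):
    """eps-typicality for an i.i.d. Bernoulli(p) source:
    |-(1/n) log2 P(seq) - H(p)| <= eps."""
    n = len(seq)
    h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    logp = sum(math.log2(p) if b else math.log2(1 - p) for b in seq)
    return abs(-logp / n - h) <= eps

# Count the typical sequences and their total probability for p = 0.3, n = 10.
p, n, eps = 0.3, 10, 0.2
count, total_prob = 0, 0.0
for seq in itertools.product([0, 1], repeat=n):
    if is_typical(seq, p, eps):
        count += 1
        k = sum(seq)
        total_prob += p**k * (1 - p)**(n - k)
print(count, total_prob)   # far fewer than 2^10 sequences already carry most of the probability
```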
- Channel Capacity:
- noiseless binary channel, noisy channel with no overlap, noisy typewriter, binary symmetric channel, binary erasure channel, symmetric channels, properties of channel capacity, the channel coding model, jointly typical sequences, the channel coding theorem, zero-error codes, Fano's inequality and the converse to the channel coding theorem, feedback capacity.
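The capacities of the binary symmetric and binary erasure channels listed above have simple closed forms; an illustrative sketch (function names are not from the course material):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0, 1):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p: C = 1 - H(p)."""
    return 1 - h2(p)

def bec_capacity(alpha):
    """Capacity of the binary erasure channel with erasure probability alpha: C = 1 - alpha."""
    return 1 - alpha

print(bsc_capacity(0.0))    # 1.0 -- the noiseless binary channel
print(bsc_capacity(0.5))    # 0.0 -- output independent of input
print(bec_capacity(0.25))   # 0.75
```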
- Universal Source Coding:
- adaptive Huffman code, Lempel-Ziv-Welch algorithm.
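A compact sketch of Lempel-Ziv-Welch encoding, one of the algorithms above (initializing the dictionary with the full byte alphabet is one common choice):

```python
def lzw_encode(data):
    """Lempel-Ziv-Welch: grow the phrase dictionary greedily while emitting phrase indices."""
    table = {bytes([i]): i for i in range(256)}   # start from the single-byte alphabet
    w = b""
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                                 # extend the current phrase
        else:
            out.append(table[w])                   # emit the longest known prefix
            table[wc] = len(table)                 # add the new phrase to the dictionary
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

# Repetitive input compresses: 8 symbols are encoded as 5 indices.
print(lzw_encode(b"abababab"))   # [97, 98, 256, 258, 98]
```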
Updated: Jul 24, 2010