Budapest University of Technology and Economics, Budapest
Department of Computer Science and Information Theory


Information Theory, Fall 2004


Lecturers:
András Antos
Classes per week:
4
Credits:
5
Exam:
written
December 23, 2004, 8 a.m. V2.628a and
January 19, 2005, 8 a.m. Ch.max.
Schedule, place:
Monday 12:15-13:45, R.507
Tuesday 10:15-11:45, V2.706
Remarks:

Topics

Entropy:
entropy, joint entropy, conditional entropy, mutual information, conditional mutual information, relative entropy, conditional relative entropy, chain rules, Jensen's inequality, log-sum inequality, data processing inequality, Fano's inequality.
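As a small illustration of the first group of topics (not course material, just a sketch): entropy of a distribution and mutual information computed via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as a list."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution matrix."""
    px = [sum(row) for row in joint]                   # marginal of X
    py = [sum(col) for col in zip(*joint)]             # marginal of Y
    hxy = entropy([v for row in joint for v in row])   # joint entropy
    return entropy(px) + entropy(py) - hxy

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))
# Independent X and Y: mutual information is 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))
```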
Data Compression:
source codes, Kraft inequality, McMillan's theorem, connection with entropy (lower bound on the expected codeword length L), Shannon codes (upper bound on L), Huffman codes, optimality, competitive optimality of the Shannon code.
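A sketch of the data compression topics (illustrative only): building the codeword lengths of a binary Huffman code with a heap, then checking that the Kraft inequality holds with equality and that the expected length L satisfies H <= L < H + 1.

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (probability, tie-breaker, symbol indices in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # each merge adds one bit to these codewords
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))
H = -sum(p * math.log2(p) for p in probs)
kraft = sum(2 ** -l for l in L)
print(L, avg, H, kraft)   # Kraft sum is 1; H <= avg < H + 1
```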
Entropy Rates of a Stochastic Process:
Markov chains, entropy rate.
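For a stationary Markov chain the entropy rate reduces to H(X) = -sum_i pi_i sum_j P_ij log2 P_ij, where pi is the stationary distribution. A minimal sketch (stationary distribution found by naive power iteration, assuming the chain is irreducible and aperiodic):

```python
import math

def entropy_rate(P):
    """Entropy rate in bits of a stationary Markov chain with transition
    matrix P; the stationary distribution is found by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(1000):   # iterate pi <- pi P until it converges
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return -sum(pi[i] * P[i][j] * math.log2(P[i][j])
                for i in range(n) for j in range(n) if P[i][j] > 0)

# Two-state chain that flips state with probability 0.1:
# the entropy rate equals the binary entropy H(0.1) ~ 0.469 bits.
print(entropy_rate([[0.9, 0.1], [0.1, 0.9]]))
```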
Asymptotic Equipartition Property:
The AEP, consequences for data compression, high probability sets and the typical set.
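The AEP can be made concrete for an i.i.d. Bernoulli(p) source: the probability of a sequence depends only on its number of ones, so the exact probability of the eps-typical set is a binomial sum, and it tends to 1 as n grows. A sketch (parameters p = 0.3, eps = 0.1 chosen only for illustration):

```python
import math
from math import comb

def typical_prob(n, p, eps):
    """Probability that an i.i.d. Bernoulli(p) sequence of length n is
    eps-typical, i.e. |-(1/n) log2 p(x^n) - H(p)| <= eps."""
    H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    total = 0.0
    for k in range(n + 1):   # k = number of ones in the sequence
        per_symbol = -(k / n) * math.log2(p) - (1 - k / n) * math.log2(1 - p)
        if abs(per_symbol - H) <= eps:
            total += comb(n, k) * p**k * (1 - p)**(n - k)
    return total

# The typical set captures almost all of the probability as n grows.
for n in (25, 100, 400):
    print(n, typical_prob(n, 0.3, 0.1))
```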
Channel Capacity:
noiseless binary channel, noisy channel with no overlap, noisy typewriter, binary symmetric channel, binary erasure channel, symmetric channels, properties of channel capacity, the channel coding model, jointly typical sequences, channel coding theorem, zero-error codes, Fano's inequality and the converse to the channel coding theorem, feedback capacity.
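Two of the capacities listed above have closed forms: the binary symmetric channel with crossover probability p has C = 1 - H(p), and the binary erasure channel with erasure probability a has C = 1 - a. A quick sketch:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel: C = 1 - H(p)."""
    return 1 - h2(p)

def bec_capacity(a):
    """Capacity of the binary erasure channel: C = 1 - a."""
    return 1 - a

print(bsc_capacity(0.11))   # close to 0.5 bit per channel use
print(bec_capacity(0.25))   # 0.75 bit per channel use
```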
Adaptive Huffman code, Lempel-Ziv-Welch algorithm.
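A compact sketch of the Lempel-Ziv-Welch algorithm (illustrative, not the course's reference implementation): the compressor emits dictionary indices while growing the dictionary, and the decompressor rebuilds the same dictionary from the code stream alone.

```python
def lzw_compress(text):
    """LZW: emit dictionary indices, growing the dictionary as we read."""
    dictionary = {chr(i): i for i in range(256)}
    w, out = "", []
    for c in text:
        if w + c in dictionary:
            w += c
        else:
            out.append(dictionary[w])
            dictionary[w + c] = len(dictionary)   # record the new phrase
            w = c
    if w:
        out.append(dictionary[w])
    return out

def lzw_decompress(codes):
    """Inverse of lzw_compress, rebuilding the same dictionary."""
    dictionary = {i: chr(i) for i in range(256)}
    w = dictionary[codes[0]]
    out = [w]
    for code in codes[1:]:
        # The cScSc special case: the code may be the entry just created.
        entry = dictionary[code] if code in dictionary else w + w[0]
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[0]
        w = entry
    return "".join(out)

codes = lzw_compress("TOBEORNOTTOBEORTOBEORNOT")
print(len(codes), lzw_decompress(codes))   # 16 codes for 24 characters
```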

Related links

David J.C. MacKay: A Short Course in Information Theory (1995), Cavendish Laboratory, Cambridge, Great Britain.

Back to the Home Page


Updated: Jul 24, 2010
aantos NOSPAM(at)NOSPAM gmail (dot) com