Department of Computer Science and Information Theory

**Lecturer:** András Antos
**Classes per week:** 4
**Credits:** 5
**Exam:** written

December 23, 2004, 8 a.m., V2.628a, and
January 19, 2005, 8 a.m., Ch.max.

**Schedule, place:**
- Monday 12:15-13:45, R.507
- Tuesday 10:15-11:45, V2.706

**Remarks:**
- T.M. Cover and J.A. Thomas, "Elements of Information Theory", John Wiley & Sons, Inc., New York, 1991.
- L. Györfi, S. Győri, and I. Vajda, "Információ- és kódelmélet" (Information and Code Theory), 2nd ed., TypoTEX Kiadó, 2002 (textbook in Hungarian).
- T. Linder and G. Lugosi, "Bevezetés az információelméletbe" (Introduction to Information Theory), Technical University of Budapest, 1990 (course notes in Hungarian).

- Entropy:
  - entropy, joint entropy, conditional entropy, mutual information, conditional mutual information, relative entropy, conditional relative entropy, chain rules, Jensen's inequality, log-sum inequality, data processing inequality, Fano's inequality.
- Data Compression:
  - source codes, Kraft inequality, McMillan's theorem, connection with entropy (lower bound on the expected codeword length L), Shannon codes (upper bound on L), Huffman codes, optimality, competitive optimality of the Shannon code.
- Entropy Rates of a Stochastic Process:
  - Markov chains, entropy rate.
- Asymptotic Equipartition Property:
  - the AEP, consequences for data compression, high-probability sets and the typical set.
- Channel Capacity:
  - noiseless binary channel, noisy channel with nonoverlapping outputs, noisy typewriter, binary symmetric channel, binary erasure channel, symmetric channels, properties of channel capacity, the channel coding model, jointly typical sequences, channel coding theorem, zero-error codes, Fano's inequality and the converse to the channel coding theorem, feedback capacity.
- Universal Source Coding:
  - adaptive Huffman code, Lempel-Ziv-Welch algorithm.
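A few of the quantities above can be computed in a couple of lines. The following is a minimal sketch (the function names `h2` and `bsc_capacity` are our own, not from the course materials): the binary entropy function, a Kraft-inequality check for a set of prefix-code lengths, and the capacity of the binary symmetric channel, C = 1 - H(p).

```python
from math import log2

def h2(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Kraft inequality: a binary prefix code with codeword lengths l_i
# exists iff sum over i of 2**(-l_i) <= 1.
lengths = [1, 2, 3, 3]                     # e.g. codewords 0, 10, 110, 111
kraft_sum = sum(2.0 ** -l for l in lengths)
assert kraft_sum <= 1.0                    # here the sum is exactly 1

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover p."""
    return 1.0 - h2(p)

print(h2(0.5))            # 1.0 bit: a fair coin has maximal entropy
print(bsc_capacity(0.1))  # about 0.531 bits per channel use
```

For instance, a fair coin attains the maximum entropy of 1 bit, and a BSC with 10% crossover probability still carries about half a bit per use.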


Updated: Jul 24, 2010