Information Theory

Unit aims

To give a rigorous and modern introduction to Shannon's theory of information, with emphasis on fundamental concepts and mathematical techniques.

Unit description

Shannon's information theory underlies many aspects of modern life, including streaming an MP3 or movie, or taking and storing digital photos. It is one of the great intellectual achievements of the 20th century, which continues to inspire communications engineering and to generate challenging mathematical problems. Recently it has extended dramatically into physics as quantum information theory. The course is about the fundamental ideas of this theory: data compression and reliable communication over noisy channels.

It is a statistical theory, so notions of probability play a central role; in particular, laws of large numbers and the concept of entropy are fundamental, culminating in Shannon's coding theorems. The course aims to demonstrate information-theoretic modelling, and the mathematical techniques required will be rigorously developed.

It is a natural companion to the Quantum Information course offered in Mathematics (MATHM5610), and to a certain degree to Cryptography B (COMSM0007) in Computer Science and Communications (EENG 22000) in Electrical Engineering. It may also be of interest to physicists who have attended Statistical Physics (PHYS30300).

Learning objectives

This unit should enable students to:

  • Understand how information problems are modelled and solved
  • Model and solve problems of information theory: data compression and channel coding
  • Discuss basic concepts such as entropy, mutual information, relative entropy, capacity
  • Use information theoretical methods to tackle problems arising in a number of settings
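To give a flavour of the basic quantities listed above, the following sketch (illustrative only, not part of the unit material) computes entropy, relative entropy and mutual information for a small discrete joint distribution; the distribution values are an arbitrary example.

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def relative_entropy(p, q):
    """Relative entropy (KL divergence) D(p||q) in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# Joint distribution of (X, Y) as a 2x2 table (arbitrary example values)
joint = [[0.4, 0.1],
         [0.1, 0.4]]
px = [sum(row) for row in joint]            # marginal distribution of X
py = [sum(col) for col in zip(*joint)]      # marginal distribution of Y

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y)
hxy = entropy([p for row in joint for p in row])
mi = entropy(px) + entropy(py) - hxy

print(entropy(px))   # H(X) = 1.0 bit: the marginal of X is uniform
print(mi)            # I(X;Y) > 0, since X and Y are dependent here
```

Here mutual information is obtained from the entropy identity rather than directly from its definition; both routes give the same value.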

Transferable skills

Mathematical - Knowledge of basic information theory; probabilistic reasoning.

General skills - Modelling, problem solving and logical analysis. Assimilation and use of complex and novel ideas.

Syllabus

  • Information sources and review of probability
  • Introduction to entropy and related quantities
  • Variable length coding and performance limits
  • (Lossy) block data compression
  • Noisy channels and Shannon's channel coding theorem
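The variable-length coding topic above can be previewed with a small sketch (again illustrative only): a binary Huffman code has expected length L satisfying H ≤ L < H + 1, and for a dyadic source (all probabilities powers of 1/2) it meets the entropy bound exactly.

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap items: (probability, tie-breaking counter, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:        # each merge adds one bit to these codewords
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]        # dyadic example source
H = -sum(p * math.log2(p) for p in probs)
avg = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, avg)    # both equal 1.75 bits for this dyadic source
```

For non-dyadic sources the expected length strictly exceeds the entropy, which is where the H ≤ L < H + 1 bound, and block coding as a way of approaching H, come in.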

Reading and References

There exist many textbooks on the elements of information theory.

  • R B Ash. Information Theory, Dover Publications, 1990
  • D J C MacKay. Information Theory, Inference, and Learning Algorithms, CUP, 2003 - free download from http://www.inference.phy.cam.ac.uk/itprnn/book.html
  • T M Cover & J A Thomas. Elements of Information Theory, Wiley Interscience, 1991

Other useful references are:

  • C E Shannon & W Weaver. The Mathematical Theory of Communication, University of Illinois Press, 1963
  • I Csiszár & J Körner. Information Theory: Coding Theorems for Discrete Memoryless Systems (2nd ed.), Akadémiai Kiadó, Budapest, 1997

The course only requires elementary probability theory, but students who have taken further probability will find some of the course content easier. A very good reference is

  • G R Grimmett & D Welsh. Probability: An Introduction, Oxford University Press, 1986

Unit code: MATH34600
Level of study: H/6
Credit points: 10
Teaching block (weeks): 1 (7-12)
Lecturer: Professor Oliver Johnson

Pre-requisites

MATH11300 Probability 1 OR Level 2 Physics (MATH11400 Statistics 1 is helpful, but not necessary)

Co-requisites

None

Methods of teaching

Lectures, exercises to be completed by students, and problem classes.

Methods of Assessment

The pass mark for this unit is 40.

The final mark is calculated as follows:

  • 100% from a 1 hour 30 minute exam in January

NOTE: Calculators of an approved type (non-programmable, no text facility) are allowed.

For information on resit arrangements, please see the resit page on the intranet.

Further information on relative weighting and marking criteria is available on the intranet.

Further exam information can be found on the Maths Intranet.
