Unit information: Information Theory 3 in 2015/16

Please note: you are viewing unit and programme information for a past academic year. Please see the current academic year for up-to-date information.

Unit name: Information Theory 3
Unit code: MATH34600
Credit points: 10
Level of study: H/6
Teaching block(s): Teaching Block 1B (weeks 7 - 12)
Unit director: Dr. Wiesner
Open unit status: Not open
Pre-requisites: MATH11300 Probability 1 or Level 2 Physics (MATH11400 Statistics 1 is helpful, but not necessary)
Co-requisites: None
School/department: School of Mathematics
Faculty: Faculty of Science

Description including Unit Aims

Unit aims

To give a rigorous and modern introduction to Shannon's theory of information, with emphasis on fundamental concepts and mathematical techniques.

General Description of the Unit

Shannon's information theory is one of the great intellectual achievements of the 20th century; half a century after its conception it continues to inspire communications engineering and to generate challenging mathematical problems. Recently it has expanded dramatically into physics as quantum information theory. The course is about the fundamental ideas of this theory: data compression, communication via channels, error-correcting codes, and simulations.

It is a statistical theory, so notions of probability play a central role; in particular, laws of large numbers and the concept of entropy are fundamental (and will be discussed in the course). The course contains a discussion of two models of data compression, leading to entropy as the crucial quantity; an introduction to noisy channels and error-correcting codes, culminating in Shannon's channel coding theorem; the "reverse Shannon theorem" for noisy channels and rate-distortion coding; and information-theoretic hypothesis testing and the theory of identification.
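
For orientation, here is a minimal sketch in LaTeX of the two central quantities mentioned above, written in standard textbook notation for a discrete source X with distribution p(x) and a discrete memoryless channel with input X and output Y (these are standard definitions, not a quotation from the unit materials):

\[
H(X) = -\sum_{x} p(x)\log_2 p(x),
\qquad
C = \max_{p(x)} I(X;Y),
\quad\text{where } I(X;Y) = H(Y) - H(Y\mid X).
\]

Shannon's source coding theorem states that an i.i.d. source can be compressed losslessly at any rate above H(X) bits per symbol and at no rate below it; the channel coding theorem states that reliable communication is possible at every rate below the capacity C and at no rate above it.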

The course aims to demonstrate information-theoretic modelling, and the mathematical techniques required will be developed rigorously.

It is a natural companion to the Quantum Information course offered in Mathematics (MATHM5610) and, to a certain degree, to Cryptography B (COMSM0007) in Computer Science and Communications (EENG22000) in Electrical Engineering. It may also be of interest to physicists who have attended Statistical Physics (PHYS30300).

Relation to Other Units

The probabilistic nature of the problems considered, and of the mathematical modelling in information theory, relates this unit to the probability and statistics units at Levels 4, 5 and 6. It is well suited as a companion to the Quantum Information unit.

Related courses are Cryptography B in Computer Science, Communications in Electrical Engineering, and Statistical Physics in Physics.

Further information is available on the School of Mathematics website: http://www.maths.bris.ac.uk/study/undergrad/

Intended Learning Outcomes

This unit should enable students to:

  • understand how information problems are modelled and solved;
  • model and solve problems of information theory: data compression and channel coding;
  • discuss basic concepts such as entropy, mutual information, relative entropy and capacity (standard definitions are sketched after this list);
  • use information-theoretic methods to tackle information-theoretic problems, in particular the probabilistic method and information calculus.
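
The following LaTeX snippet sketches the remaining concepts in the list above, relative entropy and mutual information, again in the usual textbook notation for discrete random variables (standard definitions, e.g. as in Cover & Thomas, not taken from the unit materials):

\[
D(p\,\|\,q) = \sum_{x} p(x)\log_2\frac{p(x)}{q(x)},
\qquad
I(X;Y) = \sum_{x,y} p(x,y)\log_2\frac{p(x,y)}{p(x)\,p(y)} = D\bigl(p(x,y)\,\|\,p(x)p(y)\bigr).
\]

In particular, the capacity appearing in the sketch under the unit description is the maximum of I(X;Y) over input distributions p(x).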

Transferable skills:

Mathematical - Knowledge of basic information theory; probabilistic reasoning.

General skills - Modelling, problem solving and logical analysis; assimilation and use of complex and novel ideas.

Teaching Information

Lectures; exercises to be done by students; problem classes.

Assessment Information

100% Examination.

Raw scores on the examinations will be determined according to the marking scheme written on the examination paper. The marking scheme, indicating the maximum score per question, is a guide to the relative weighting of the questions. Raw scores are moderated as described in the Undergraduate Handbook.

Reading and References

There exist many textbooks on the elements of information theory. The course will follow such treatises only in its first half, presenting more modern material in the second. Nevertheless, as rigorous and affordable companions for the student, I can recommend:

  • R B Ash. Information Theory, Dover Publications, 1990.
  • T M Cover & J A Thomas. Elements of Information Theory, Wiley Interscience, 1991.

Other useful references are:

  • C E Shannon & W Weaver. The Mathematical Theory of Communication, University of Illinois Press, 1963.
  • I Csiszar & J Koerner. Information Theory: Coding Theorems for Discrete Memoryless Systems (2nd ed.), Akadémiai Kiadó, Budapest, 1997.

The course only requires elementary probability theory, but students who have taken further probability will find some of the course content easier. A very good reference is

  • G R Grimmett & D Welsh. Probability: An Introduction, Oxford University Press, 1986.

Additional reading for the probabilistic method:

  • N Alon & J H Spencer. The Probabilistic Method (2nd ed.), Wiley Interscience, 2000.

Feedback