To introduce the basic ideas and methods of Probability, developing the concepts of random variables, expectations and variances. To look at some simple applications of these ideas and methods.
Probability is an everyday concept of which most people have only a vague intuitive understanding. The study of games of chance, such as dice tossing and card games, led to early attempts to formalise the theory; but a satisfactory rigorous basis for the subject came only with the axiomatic theory of Kolmogorov in 1933. Today probability is a well-established and actively researched area of mathematics, with lively links to Analysis, Combinatorics, Functional Analysis, Game Theory, Geometry, Mathematical Physics and Statistics. It also serves as an important foundation on which many other disciplines build (Biology, Computer Science, Economics, Engineering, Linguistics, Physics and Sociology, to mention just a few).
The unit starts with the idea of a probability space, which is how we model the outcome of a random experiment. Probability models are then introduced in terms of random variables (which are functions of the outcomes of a random experiment), and the simpler properties of standard discrete and continuous random variables are discussed. Motivation is given for studying the common quantities of interest (probabilities, expected values, variances and covariances). Finally, techniques are developed for evaluating these quantities, including generating functions and conditional expectations.
Relation to other units
This unit provides the foundation for all probability and statistics units in later years.
When you have successfully completed this module you will be able to:
- Define events and sample spaces, describe them in simple examples, and use counting arguments to calculate probabilities when there are equally likely outcomes.
- List the axioms of probability, and use them to prove simple results, including the partition theorem and Bayes’ theorem.
- Define a random variable. Define the probability mass function for discrete random variables, and the probability density function (pdf) and cumulative distribution function (cdf) for continuous random variables. Illustrate links between the pdf and cdf. Calculate the pdf of a function of a random variable.
- Define the following random variables: Bernoulli, Binomial, Geometric, Poisson, Uniform, Exponential, Gamma, Normal/Gaussian. Recall and illustrate features of these distributions.
- Define and calculate the expectation, variance and covariance of simple random variables, including all of the standard types in the previous objective.
- Define jointly distributed random variables and joint probability mass functions.
- Define the moment generating function of a random variable. Use moment generating functions to analyse sums of random variables.
- Define and explain conditional expectation. Prove the double expectation formula. Use conditional expectation and moment generating functions to analyse random sums.
- Formulate formal probability models from informal descriptions.
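As a reminder of two of the results named above, for events B_1, ..., B_n that partition the sample space (with P(B_i) > 0 for each i), the partition theorem and Bayes' theorem can be stated as:

```latex
% Partition theorem (law of total probability)
P(A) \;=\; \sum_{i=1}^{n} P(A \mid B_i)\, P(B_i)

% Bayes' theorem, which follows by combining the definition of
% conditional probability with the partition theorem
P(B_j \mid A) \;=\; \frac{P(A \mid B_j)\, P(B_j)}{\sum_{i=1}^{n} P(A \mid B_i)\, P(B_i)}
```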
Model building, especially the formal mathematical modelling of informal descriptions of events and processes.
- Review of some elementary combinatorics.
- Sample spaces; events; axioms of probability; simple results derived from the axioms.
- Combinatorial probability.
- Conditional probability; multiplication lemma; partition theorem; Bayes' theorem; independent events.
- Discrete random variables (r.v.'s); probability mass function.
- Bernoulli, Binomial, Poisson and Geometric distributions; Poisson approximation to the Binomial.
- Expectations of r.v.'s; expectation of a function of an r.v.; variance of r.v.'s and standard deviation.
- Continuous random variables; distribution function; probability density function.
- Uniform, Exponential and Normal distributions (use of statistical tables); idea of the Central Limit Theorem; transformations.
- Bivariate distributions; joint, conditional and marginal distributions; independent random variables; discrete convolution.
- A continuous convolution example: the Gamma distribution.
- Properties of expectation (linear combinations, products of independent r.v.'s).
- Properties of variance and covariance (linear combinations, sums of independent r.v.'s, degenerate r.v.'s); correlation.
- Conditional expectation; partition theorem; formulae for E(X) in terms of E(X|Y); random sums.
- Moment generating functions; moments; linear combinations; applications to Normal distributions; independent r.v.'s.
- Markov's and Chebyshev's inequalities; the Weak Law of Large Numbers; the Central Limit Theorem.
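One syllabus item above, the Poisson approximation to the Binomial, can be illustrated with a short computation. The sketch below is illustrative only (the function names and parameter values are not part of the unit materials): for large n and small p, Binomial(n, p) probabilities are close to Poisson(np) probabilities.

```python
import math

def binomial_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    """P(Y = k) for Y ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# With n large and p small, Binomial(n, p) is well approximated
# by Poisson(n * p).  Here n*p = 3.
n, p = 1000, 0.003
lam = n * p
for k in range(6):
    b = binomial_pmf(n, p, k)
    q = poisson_pmf(lam, k)
    print(f"k={k}: binomial={b:.5f}  poisson={q:.5f}  diff={abs(b - q):.5f}")
```

Running this shows the two probability mass functions agreeing to within a few parts in ten thousand, which is the content of the Poisson limit theorem covered in the unit.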
Reading and References
The recommended text is:
- Probability 1 (compiled from the first 8 chapters of A First Course in Probability by S. Ross). Pearson Custom Publishing.
The Library has copies of A First Course in Probability by S. Ross.
An A in A-level Mathematics or equivalent.
Analysis 1A (or equivalently MATH 11006 Analysis 1) and Calculus 1, or equivalent.
Methods of teaching
Lectures supplemented (for first-year students) by exercise classes and small-group tutorials. Weekly problem sheets, with outline solutions handed out the following week.
Methods of Assessment
The pass mark for this unit is 40.
The final mark is calculated as follows:
- 90% from a 1 hour 30 minute exam in January*
- 10% from selected homework questions.
*There are two parts to the exam: Part A consists of 5 shorter questions and Part B of 2 longer questions. ALL questions will be used for assessment. Part A contributes 40% and Part B 60% of the overall mark for the paper.
Calculators of an approved type (non-programmable, no text facility) are allowed.
For information on resit arrangements, please see the re-sit page on the intranet.
Further exam information can be found on the Maths Intranet.