Statistical Laboratory                                                                                                      


M.Phil. in Statistical Science

Detailed information on course structure

The Statistical Laboratory, which is part of the Faculty of Mathematics of the University of Cambridge, is a historic and famous institution, which gained the top grade of 5* in each of the 1996 and 2001 Research Assessment Exercises.

Over the years we have trained many excellent statisticians, who now work in a diversity of fields: in finance, as actuaries, in scientific research institutes, for pharmaceutical companies, or as professors and lecturers in the UK and abroad. We continue to train such individuals, who are much in demand because of their particular blend of mathematical and practical skills.

We expect M.Phil applicants to hold a first class degree (or equivalent) in mathematics or mathematics/statistics. Applicants with different backgrounds are considered on an individual basis. The deadline for applications is 31st January. Our target is to continue to attract highly qualified students from the UK and overseas: experience shows that such students are highly employable.

Some of our M.Phil students go on to do a Ph.D. either in Cambridge or elsewhere. The decision as to whether an M.Phil. student is acceptable as a Ph.D. student is made on a case-by-case basis, as it depends on several factors, one of which is the existence of a suitable Ph.D. supervisor.

The M.Phil. course outline and structure are given below and should prove attractive to students interested in various blends of applied probability, statistical theory and practice, and operations research: examples of possible student choices are given later on. Note that the M.Phil. course is 10 months in duration and students submit the Applied Project after the written examinations.


MPhil Core Courses: 

Applied Statistics (24: M16, E8)

Statistical Theory (M16)

Introduction to Probability (M16)

Mathematics of Operational Research (M24)

 

MPhil Optional Courses:

Advanced Financial Models (M24)

Applied Bayesian Statistics (L16)

Biostatistics (L16)

Time Series and Monte Carlo Inference  (L24)

Nonparametric Statistical Theory (L16)

Actuarial Statistics (L16)

 

Other Statistical Laboratory Part III Courses: 

Stochastic Calculus and Applications (L24)

Stochastic Networks (L16)

Advanced Probability (M24)

Schramm-Loewner Evolutions (L16)

Quantum Information Theory (M24)


Note: Biostatistics comprises two components (Survival Data and Statistics in Medical Practice) which together make up the sixteen-hour course, and Time Series and Monte Carlo Inference comprises two components (Time Series and Monte Carlo Inference) which together make up the twenty-four-hour course; in each case the two components must be taken together for the examination.


Examinations

In late April/early May, students are asked to submit their examination entries to the Course Director. This is the point at which you must choose which courses to take for examination. You must include at least three core courses, together with further courses to bring your total to 16, 17 or 18 units. Most students in the past have taken 16 or 17 units. To calculate the units, note that each 8 hours of a lecture course counts as 1 unit.
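As an illustrative sketch (not an official calculator), the unit arithmetic can be written out in a few lines of Python; the course list and hour counts below are taken from the core-course list above.

```python
# Hypothetical helper illustrating the unit rule stated above:
# every 8 lecture hours of a course counts as 1 unit.

def units(lecture_hours):
    """Convert a course's lecture hours into examination units (8 hours = 1 unit)."""
    return lecture_hours // 8

# The four core courses: Applied Statistics (24h), Statistical Theory (16h),
# Introduction to Probability (16h), Mathematics of Operational Research (24h).
core_hours = [24, 16, 16, 24]
total = sum(units(h) for h in core_hours)
print(total)  # the four core courses alone account for 10 units
```

So a student taking all four core courses needs optional courses worth a further 6 to 8 units to reach the 16-18 unit range.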

The Applied Statistics course will be examined in a series of practical assignments, to be done in the students' own time over three and a half days in the week following the other written papers.

The examinations begin around the 29th May and run for approximately two weeks. A draft timetable is made available in the Michaelmas term.

Past examination papers are available to download here; the files are in ps and pdf format.

Projects

Each student must submit an Applied Project. Work on the project is done concurrently with lectures and students will need to spend a substantial part of each of the Christmas and Easter vacations working on this project. Summaries of past projects can be viewed here.

Other Information

The Course runs from 1 October to 31 July and provides excellent vocational training as well as preparation for a research degree. It involves taught courses, practical classes for statistical computing, and a substantial applied project.

The Diploma in Mathematical Statistics ran from 1947 as a 9-month course, primarily for good mathematics graduates. Although the Statistical Laboratory valued this historic title, it had become rather archaic and quite out of line with the degree title usually awarded for a modern graduate course, particularly one with a reputation for being high-level and demanding.


Some possible student `tracks' under the M.Phil:

  1. John S. wants to become a professional statistician. He chooses the core courses in Applied Statistics, Probability and Statistical Theory. For his optional courses, he takes Monte Carlo Inference, Time Series and Biostatistics. His Applied Project is in Medical Statistics arranged with a local research institute.
  2. Francoise B.'s first degree in mathematics gave her a very good grounding in probability. She seeks a career in mathematical finance. Her chosen core courses are the Mathematics of Operations Research, Applied Statistics and Statistical Theory. In addition she takes Advanced Probability, Actuarial Statistics, Advanced Financial Models, and Stochastic Calculus and Applications. Her Applied Project is arranged with either a City bank or a large actuarial company.
  3. Nathan U., an overseas student, is seeking a course which will give him a broad background in statistics, probability and `applicable' mathematics. He takes all four core courses, and he selects two options in areas that have interested him. His Applied Project is provided by a University Department, say Land Economy, Geography or Economics.
  4. Elizabeth J. has a particular interest in the modelling of communication networks, and knows that after her M.Phil. she will seek employment in this field, possibly preceded by a Ph.D. She takes all four core courses, and chooses as her options Stochastic Networks and Advanced Financial Models. Her Applied Project is in Network Modelling, supervised by a member of the Laboratory using his contacts with industry.

Director of Studies
M.Phil in Statistical Science
Statistical Laboratory
Centre for Mathematical Sciences
Wilberforce Road
Cambridge CB3 0WB

Email: mphilenquiries[at]statslab.cam.ac.uk


Schedules of Lecture Courses 2009-2010

It is important to note that the following list is correct for the current Academic Year (2009/2010) but is subject to revision from year to year.


Applied Statistics (M8+8, E4+4)

S.M. Pitts and B.D.M. Tom

This course will count as a 3 unit (24 lectures) course. There will be  8 lectures and 8 classes in the Michaelmas term followed by  4 lectures and 4 classes in the Easter term.

Introduction to Linux, R and S-Plus on the Statistical Laboratory  computing network. Use of LaTeX for report writing.

Exploratory data analysis, graphical summaries. Introduction to  non-parametric tests.

The essentials of Generalized Linear Modelling.

Linear regression, orthogonal polynomials, standard experimental designs:  factorial experiments and interpretation of interactions.

Regression diagnostics: residuals, leverages and other related plots.  Collinearity. Box-Cox transformations.

Discrete data analysis: binomial and Poisson regression.  Multi-way contingency tables.


Some special topics, e.g.  (i) use of the Akaike Information Criterion for model-search (ii) quasi-likelihood and over-dispersion.

 
Selected further topics,  e.g. generalized additive regression; methods for survival data analysis; methods for  time-series; disease progression models (msm); longitudinal data; density  estimation; EM algorithm; cost effectiveness.

The above methods will be put into practice via S-Plus or R.

In the practical classes, emphasis is placed on the importance of clear presentation of the analysis; students are required to submit written solutions to the lecturer.

Pre-requisite Mathematics

It is assumed that you will have done a basic statistics course (including t-tests, χ²-tests, F-tests).


Literature

Venables, W.N. and Ripley, B.D. (2002) Modern Applied Statistics with S. Springer-Verlag. 4th edition.

 

Introduction to Probability (M16)

N. Berestycki

This course is intended to cover the basic probability and Markov chain material that is used in other M.Phil. courses. Much of the material is covered in some form in Cambridge undergraduate courses. The course will consist of the extensive study of some fundamental examples rather than a formal exposition of the theory. This course assumes only a good general background in mathematics, including an introductory course in probability. The following topics will be discussed:

  • Basic tools. Probability spaces, random variables. Distribution, discrete r.v.s, absolutely continuous r.v.s. Expectation, variance, Markov inequality, Chebyshev inequality. Independence. Joint distribution, conditional probability, conditional expectation. Characteristic functions.
  • Fundamental probability results. Laws of large numbers. Convergence of random variables, convergence of distributions. Central limit theorem.
  • Markov chains. Definition, examples. Classification of states, irreducibility. Recurrence and transience. Stationary distributions and long-term behaviour. Continuous time Markov processes. Poisson process, examples of queues.
  • Martingales. Definition, examples. Optional stopping, L2 convergence theorem.
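To give a flavour of the "stationary distributions and long-term behaviour" topic, here is a short illustrative sketch (a hypothetical two-state chain, not part of the syllabus) in Python:

```python
import numpy as np

# A two-state Markov chain with transition matrix P (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# The stationary distribution pi solves pi P = pi; numerically it is the
# left eigenvector of P for eigenvalue 1, normalised to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()          # pi is approximately (5/6, 1/6)

# Long-term behaviour: the rows of P^n converge to pi as n grows.
Pn = np.linalg.matrix_power(P, 50)
print(pi)
print(Pn[0])                # close to pi
```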

Literature:

Williams, D. Probability with Martingales, Cambridge UP (1991)
Norris, J. R., Markov Chains
Grimmett, G. R. and Stirzaker D. R., Probability and Random Processes, Clarendon (1992)
Durrett R., Probability: Theory and Examples, Duxbury (1991)
Ross, S., A First Course in Probability, Pearson.

Mathematics of Operational Research (M24)

F.P. Kelly and N.S. Walton 

This course is accessible to a candidate with mathematical maturity who has no previous experience of operational research; however it is expected that most candidates will already have had exposure to some of the topics listed below.

Convexity, the supporting hyperplane theorem. Lagrangian sufficiency. Strong Lagrangian problems; sufficient conditions for convexity of the optimal value function.  [3]

 Linear programming: the simplex algorithm, duality, shadow prices. [3]

Complexity of algorithms: typical and worst-case behaviour. NP-completeness. Exponential complexity of the simplex algorithm. Polynomial time algorithms for linear programming. [3]

Ford--Fulkerson algorithm; max-flow min-cut theorem. Minimal spanning trees, transportation algorithm, general circulation problems. Shortest and longest paths; critical paths; project cost-time functions.  [5]

Integer programming and tree searching. The branch and bound method. The travelling salesman problem. [3]

Two-person zero-sum games. Cooperative and non-cooperative games. Nash equilibria. The core, nucleolus, Shapley value. Bargaining. Market games and oligopoly. Evolutionary games. Bidding and auctions. [7]
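As a small illustration of the linear programming material (not the simplex algorithm itself, but the fact it exploits: the optimum of an LP lies at a vertex of the feasible region), the following hypothetical Python sketch enumerates constraint-boundary intersections for a two-variable problem:

```python
import itertools
import numpy as np

# Maximise 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
# All constraints are written as rows of A x <= b (including -x <= 0, -y <= 0).
A = np.array([[1.0, 1.0], [1.0, 3.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([4.0, 6.0, 0.0, 0.0])
c = np.array([3.0, 2.0])

best_x, best_val = None, -np.inf
for i, j in itertools.combinations(range(len(A)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                      # parallel boundaries: no vertex
    x = np.linalg.solve(M, b[[i, j]])  # intersection of two boundaries
    if np.all(A @ x <= b + 1e-9) and c @ x > best_val:  # feasible and better
        best_x, best_val = x, c @ x

print(best_x, best_val)   # optimal vertex (4, 0) with value 12
```

The simplex algorithm reaches the same optimal vertex far more efficiently by walking from vertex to adjacent vertex instead of enumerating them all.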

Level:
General.

Literature:
L. C. Thomas, Games, Theory and Application, Wiley, Chichester (1984).
M. S. Bazaraa, J. J. Jarvis and H. D. Sherali, Linear Programming and Network Flows, Wiley (1988).
D. Bertsimas and J. N. Tsitsiklis, Introduction to Linear Optimization, Athena Scientific (1997). Undoubtedly the best book on the market.

Statistical Theory (M16)

R. J. Samworth

This is a course on parametric statistical theory that goes hand in hand with the Lent term course on nonparametric statistical theory. We begin by reviewing briefly some basic theory and models in statistical inference that motivate and facilitate the development, in the second chapter, of general inferential methods based on the likelihood function. Although these methods are usually perfectly adequate for relatively low-dimensional models, they can fail badly in high dimensions; in particular, when the dimension of the parameter space (usually denoted p) is larger than the number of observations, n. These large p, small n problems occur in a very wide range of applications, from microarray experiments in biology to portfolio selection in finance, and are at the forefront of modern statistics. In Chapter 3, we will outline some of the most important recent developments, though this remains a very active research area.

Basic theory and models: Review of linear models. Basic results from measure theory and probability, such as modes of convergence, convergence theorems, differentiation under an integral, stochastic order notation, Cochran’s theorem. [5]

First-order likelihood theory: Likelihood and related quantities. Review of Wald, score, likelihood ratio statistics and signed root versions, distribution theory in the no-nuisance-parameter case. Generalised linear models. [4]

 High dimensional problems: Shrinkage. Ridge regression. Sparsity and traditional variable selection methods (e.g. AIC). Penalised likelihood, the LASSO and LARS algorithm, other penalty functions (e.g. SCAD). Multiple testing: Bonferroni correction. False discovery rate, Benjamini-Hochberg procedure. [7]
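The shrinkage idea can be made concrete in a few lines. The sketch below (illustrative only, with simulated data) computes the ridge estimator in its closed form (X'X + λI)⁻¹X'y and shows that the penalty shrinks the coefficient vector relative to least squares:

```python
import numpy as np

# Simulated regression data with a sparse true coefficient vector.
rng = np.random.default_rng(0)
n, p = 20, 5
X = rng.standard_normal((n, p))
beta = np.array([2.0, 0.0, 0.0, -1.0, 0.0])
y = X @ beta + 0.1 * rng.standard_normal(n)

def ridge(X, y, lam):
    """Penalised least squares: (X'X + lam I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

b_ols = ridge(X, y, 0.0)     # lam = 0 recovers ordinary least squares
b_ridge = ridge(X, y, 10.0)  # heavier penalty, more shrinkage
print(np.linalg.norm(b_ridge) < np.linalg.norm(b_ols))  # True: shrinkage
```

The LASSO replaces the squared penalty with an absolute-value penalty, which (unlike ridge) can set coefficients exactly to zero and hence performs variable selection.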

Pre-requisite Mathematics:

Basic familiarity with statistical inference, including point estimation and hypothesis testing, will be assumed. Part IID Principles of Statistics is recommended as background. A small amount of measure theory will be used in the course, though we will cover what we need as we go along. Thus previous familiarity with measure theory, while being a small bonus, is certainly not necessary.

Literature:

L. Pace and A. Salvan, Principles of Statistical Inference, World Scientific (1997).
T.A. Severini, Likelihood Methods in Statistics, Oxford University Press (2000).
A.C. Davison, Statistical Models, Cambridge University Press (2003).
G.A. Young and R.L. Smith, Essentials of Statistical Inference, Cambridge University, Press (2005).

 

MPhil Options

Actuarial Statistics (L16)

S.M. Pitts

This course provides an introduction to various topics in non-life insurance mathematics. These topics feature in the Institute and Faculty of Actuaries examinations, CT6 and ST3.

Topics covered in lectures include:

  1. Loss distributions
  2. Reinsurance
  3. Aggregate claims
  4. Ruin theory
  5. Credibility theory
  6. No claims discount systems

 
Prerequisite mathematics:

This course assumes: an introductory probability course (including moment generating functions, probability generating functions, conditional expectations and variances); a statistics course (including maximum likelihood estimation and Bayesian methods); that you know what a Poisson process is; and that you have met discrete-time finite state-space Markov chains.

Backup to the lectures:

Lectures will be supplemented with examples sheets and examples classes.

Literature:

S. Asmussen  Ruin Probabilities. World Scientific, 2000.
C.D. Daykin, T. Pentikäinen and E. Pesonen,  Practical Risk Theory for Actuaries and Insurers. Chapman and Hall, 1993.
D.M. Dickson,  Insurance Risk and Ruin. CUP, 2005.
J. Grandell,  Aspects of Risk Theory. Springer, 1991.
T. Rolski, H. Schmidli, V. Schmidt and J. Teugels,  Stochastic Processes for Insurance and Finance. Wiley, 1999.

 

Advanced Financial Models (M24)

M. Tehranchi

This course is an introduction to financial mathematics, with a focus on the pricing and hedging of contingent claims.  It complements the material in Advanced Probability, Stochastic Calculus and Applications, and Optimal Investment.
  • One-period models. Arbitrage and equivalent martingale measures.  Attainable claims and market completeness. 
  • Multi-period discrete time models.  Filtrations and martingales. European and American claims. Optimal stopping.
  • Brownian motion and stochastic calculus.  Brownian motion and its quadratic variation.  Stochastic integration. Girsanov's theorem.  Itô's formula.  Martingale representation theorem.
  • Black--Scholes model and generalizations. Admissible strategies.  Equivalent martingale measures.  Pricing and hedging in Markovian models.  The Black--Scholes model.  Local and stochastic volatility models.
  • Interest rate models.  Short rates, forward rates, and bond prices.  Markovian short rate models. The Heath--Jarrow--Morton drift condition.
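To illustrate where the Black--Scholes material leads, here is a sketch of the standard Black--Scholes European call price, C = S Φ(d₁) − K e^{−rT} Φ(d₂) (the parameter values below are hypothetical):

```python
from math import log, sqrt, exp, erf

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black--Scholes price of a European call on a non-dividend-paying asset."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * Phi(d1) - K * exp(-r * T) * Phi(d2)

# At-the-money call: spot 100, strike 100, one year, 5% rate, 20% volatility.
price = bs_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
print(round(price, 2))  # about 10.45
```

The course derives this formula via equivalent martingale measures and Itô calculus rather than simply quoting it.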

Pre-requisite Mathematics:

A knowledge of probability theory at the level of Part II Probability and Measure will be assumed.  Familiarity with Part II Stochastic Financial Models is helpful.

Lecture notes will be distributed.  Additionally, the following books may be helpful.

Literature:

Baxter, M. & Rennie, A. (1996)   Financial calculus: an introduction to derivative pricing.  Cambridge University Press
Duffie, D. (2001) Dynamic asset pricing theory. 3rd ed.  Princeton University Press
Karatzas, I. (1997) Lectures on the mathematics of finance.  American Mathematical Society
Lamberton, D. & Lapeyre, B. (1996)  Introduction to stochastic calculus applied to finance. Chapman & Hall
Shreve, S. (2005) Stochastic Calculus for Finance: Vol. 1 and 2. Springer-Finance

Biostatistics (L16)

This course consists of two components:  Survival Data and Statistics in Medical Practice.  Together these make up one 2 unit (16 lecture) course.  You must take both components together for the examination.  Survival Data has 10 lectures and 2 classes; Statistics in Medical Practice has 6 lectures and 1 class.

 

Survival Data (L10+2)

P. Treasure

6 Lectures on Fundamentals of Survival Analysis

 
Characteristics of survival data; censoring. Definition and properties of the survival function, hazard and integrated hazard.  Examples.

Review of inference using likelihood.  Estimation of survival function and hazard both parametrically and non-parametrically.

Explanatory variables: accelerated life and proportional hazards models.  Special case of two groups. Model checking using residuals.

4 Lectures on Current Topics in Survival Analysis

In recent years there have been lectures on: frailty, cure, relative survival, empirical likelihood, counting processes and multiple events.

2 Example Classes

Following immediately after the lectures, the example classes apply the lectured material to real survival analysis contexts and datasets.

1 Revision Class (2 hours)

The revision class takes place just before the examination period in the Easter Term.

Level:

General

 
Principal book:

D. R. Cox & D. Oakes,  Analysis of Survival Data, London: Chapman & Hall (1984).

Other books:

P. Armitage, J. N. S. Matthews  & G. Berry,  Statistical Methods in Medical Research (4th ed.),  Oxford: Blackwell (2001) [Chapter on Survival Analysis for preliminary reading].

M. K. B. Parmar & D. Machin, Survival Analysis: A Practical Approach, Chichester: John Wiley (1995).

 

Statistics in Medical Practice (L6)   

S. Bird, V. Farewell & D. Spiegelhalter

Each lecture will be a self-contained study of a topic in biostatistics, which may include study design (including randomization and evaluation of interventions), meta-analysis, clinical trials, case-control studies, and institutional comparisons.  The relationship between the medical issue and the appropriate statistical theory will be illustrated.

Level:

General

Appropriate Books:

There are no appropriate books, but relevant medical papers will be made available beforehand for prior reading.  It would be very useful to have some familiarity with media coverage of medical stories involving statistical issues, e.g. from Behind the Headlines on the NHS Choices website:

http://www.nhs.uk/News/Pages/NewsIndex.aspx

 

Time Series and Monte Carlo Inference (L24)    

The course consists of two components: Time Series and Monte Carlo Inference.  Together these make up one 3 unit (24 lecture) course.  You must take the two components together for the examination.  Time Series has 8 lectures; Monte Carlo Inference has 16 lectures.

Time Series (L8)

R.B. Gramacy        

Time series analysis refers to problems in which observations are (usually) collected at regular time intervals and there are correlations among successive observations. Applications cover virtually all areas of Statistics but some of the most important include economic and financial time series, and many areas of environmental or ecological data. This course will cover some of the most important methods for dealing with these problems. Lectures will be supplemented with  R  examples and demonstrations, an examples sheet, and an examples class. The following topics will be covered [number of lectures indicated].

Basic concepts: the nature of time series, exploratory data analysis, autocorrelation, stationarity   [2]

Time domain: auto-regressive (AR) and moving average (MA) processes, partial autocorrelation, estimation, forecasting,  GARCH models [3]

Frequency domain: spectral methods, periodicity, periodogram [1]

Filtering: The Kalman filter and dynamic linear models, state space models, estimation, smoothing, forecasting  [2]
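As a small illustration of the time-domain material (hypothetical simulated data, not course code), the sketch below simulates an AR(1) process X_t = φX_{t−1} + ε_t and checks that the sample lag-1 autocorrelation is close to φ, as the theory predicts:

```python
import numpy as np

# Simulate an AR(1) process with phi = 0.7 and standard normal innovations.
rng = np.random.default_rng(1)
phi, n = 0.7, 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

def acf(series, lag):
    """Sample autocorrelation at the given lag."""
    s = series - series.mean()
    return (s[:-lag] @ s[lag:]) / (s @ s)

print(round(acf(x, 1), 2))  # close to phi = 0.7
```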

Pre-requisites:

This course assumes that you have attended introductory Probability and Statistics courses.  Familiarity with (or a willingness to learn) a statistical programming package such as R or Matlab would also be desirable.

Literature:

R.H. Shumway and D.S. Stoffer,  Time Series Analysis and Its Applications (with R examples), Springer Texts in Statistics, 2006.
P.J.Brockwell and R.A.Davis,  Introduction to Time Series and Forecasting. Springer, 1996.
M.Harrison and M.West,  Bayesian Forecasting and Dynamic  Models. Springer Series in Statistics, 1997.

 

Monte Carlo Inference (L16)

R.B. Gramacy

 Monte Carlo methods are concerned with the use of stochastic simulation techniques for statistical inference.  These have had an enormous impact on statistical practice over the last 20 years, due to the advent of modern computing architectures and programming languages.  This course covers the theory underlying some of these methods and illustrates how they can be implemented and applied in practice through many examples and code.  The following topics will be covered [number of lectures indicated].

Simulation: Generation of uniform deviates, and more general techniques of random variable generation including inversion, rejection and ratio-of-uniforms methods, squeezing and composition [3]

Monte-Carlo methods: The plug-in principle, variance reduction methods including importance sampling, control variates and antithetic variables, Monte Carlo tests, the jackknife and the bootstrap, and cross validation [4]

Markov chain Monte Carlo (MCMC) methods (for Bayesian inference): Gibbs sampling, data augmentation, Metropolis--Hastings algorithm, reversible jump MCMC, sequential importance sampling (a.k.a. particle filters), particle learning [5]

Methods for classical inference: Simulated annealing (SA), trans--dimensional SA, Expectation--Maximisation (EM) algorithm  and extensions  [4]
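As a taste of the simulation material, here is an illustrative sketch (not course code) of the rejection method: sampling from the density f(x) = 6x(1−x) on [0, 1] (a Beta(2, 2)) using a Uniform(0, 1) proposal. Since f is bounded by M = 1.5, a proposal x is accepted with probability f(x)/M:

```python
import numpy as np

rng = np.random.default_rng(2)

def rejection_sample(n):
    """Draw n samples from f(x) = 6x(1-x) on [0, 1] by rejection."""
    M = 1.5                     # upper bound on f over [0, 1]
    out = []
    while len(out) < n:
        x = rng.uniform()       # proposal from Uniform(0, 1)
        u = rng.uniform()
        if u <= 6 * x * (1 - x) / M:   # accept with probability f(x)/M
            out.append(x)
    return np.array(out)

samples = rejection_sample(20000)
print(round(samples.mean(), 2))   # Beta(2, 2) has mean 0.5
```

The acceptance rate is 1/M = 2/3, a reminder that rejection sampling becomes inefficient when the bound M is large.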

Pre-requisite Mathematics:

This course assumes that you have attended introductory Probability and Statistics courses.  A basic knowledge of Markov chains would be helpful.  Familiarity with a statistical programming package such as R or Matlab would also be desirable---a willingness to learn one essential.

 
Literature:

Gentle, J. E. Random Number Generation and Monte Carlo  Methods, Second Edition, Springer (2003)
Ripley, B. D. Stochastic Simulation, Wiley (1987)
Efron, B. and Tibshirani, R. J. An Introduction to the Bootstrap, Chapman and Hall (1993)
Gamerman, D.and Lopes, H.F. Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference, Second Edition, Chapman and Hall  (2006)
Robert, C. P. and Casella, G. Monte Carlo Statistical Methods, Springer (1999)

Applied Bayesian Statistics (L11+5)

D. Spiegelhalter

This course will count as a 2-unit (16 lecture) course. There will be 11 lectures and 5 practical classes.

  •  Bayes theorem; principles of Bayesian reasoning
  •  Exact conjugate analysis
  •  Assessment of prior distributions
  •  Monte Carlo analysis
  •  Markov chain Monte Carlo methods
  •  Regression analysis (linear, glm, nonlinear)
  •  Model criticism and comparison
  •  Hierarchical models (glmms)

 
The practical classes will use WinBUGS.
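As a minimal illustration of the "exact conjugate analysis" topic (in plain Python rather than WinBUGS, with hypothetical data): with a Beta(a, b) prior on a binomial success probability and s successes in n trials, the posterior is Beta(a + s, b + n − s), so no simulation is needed for this model.

```python
# Beta-binomial conjugate update: prior Beta(a, b), data s successes in n trials.
a, b = 1, 1            # uniform prior
s, n = 7, 10           # hypothetical data: 7 successes in 10 trials

post_a, post_b = a + s, b + n - s          # posterior is Beta(8, 4)
post_mean = post_a / (post_a + post_b)     # posterior mean 8/12 = 2/3
print(post_a, post_b, round(post_mean, 3))
```

MCMC methods such as Gibbs sampling become necessary precisely when no such closed-form posterior exists.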

Pre-requisite Mathematics:

This course assumes that students have attended and reasonably absorbed the Applied Statistics course in the Michaelmas Term. It will be helpful but not essential to attend the Monte Carlo Inference course in the Lent Term.  Full familiarity with properties and manipulations of  probability distributions will be assumed, including marginalisation, change of variable, Fisher information, iterated expectation, conditional independence, and so on.

Literature:

Spiegelhalter, D. J., Best, N. G., Lunn, D., and   Thomas, A. (2009) Bayesian Analysis using BUGS:   A Practical Introduction. Chapman and Hall.  
Gelman, A., Carlin, J. B., Stern, H. S., and Rubin, D. B. (2003) Bayesian Data Analysis. 2nd Edition. Chapman and Hall.

Nonparametric Statistical Theory (L16)

R. Nickl

This course gives an introduction to the theory of statistical inference when the postulated model for the distribution of the observed random variables is infinite-dimensional. Fundamental problems such as the estimation of a distribution function, a density function or a regression function, and of statistically relevant functionals thereof, will be considered. The techniques involved cover a wide range of mathematics from probability theory to analysis and approximation theory. There are many deep and surprising mathematical results in this area, as well as several open problems.

  1. Classical Empirical Processes:  Empirical distribution function (limit theorems of Glivenko-Cantelli, Donsker, Kolmogorov-Smirnov, inequality of Dvoretzky-Kiefer-Wolfowitz), sample-quantiles, hazard rate function estimation, uniform laws of large numbers.
  2. Approximation of Functions: Regularization of a function by convolution with kernels, linear approximation in Hilbert spaces, basic wavelet theory.
  3. Density Estimation: Kernel-, Histogram- and Wavelet Estimators; optimality and minimax criteria; risk bounds; bias-variance-tradeoff, confidence bands.
  4. Nonparametric Regression: Kernel regression, natural cubic splines, wavelet regression, asymptotic bias and variance approximations.
  5. Choice of Bandwidth/Resolution Level: Bandwidth selection, cross-validation, wavelet thresholding procedures, model selection.
  6. Functional Delta Method and Applications: Functional estimation, Plug-in procedures.
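The kernel density estimator at the heart of topic 3 is f̂(x) = (1/nh) Σᵢ K((x − Xᵢ)/h). The following illustrative sketch (simulated data, Gaussian kernel, arbitrarily chosen bandwidth) shows it recovering a known density:

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.standard_normal(1000)   # sample from N(0, 1)

def kde(x, sample, h):
    """Gaussian-kernel density estimate at the point x with bandwidth h."""
    u = (x - sample) / h
    return np.mean(np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)) / h

# The true N(0, 1) density at 0 is 1/sqrt(2*pi), about 0.399;
# the estimate should be close, with accuracy governed by the
# bias-variance trade-off in the bandwidth h.
print(round(kde(0.0, data, h=0.3), 2))
```

Choosing h too small inflates the variance of f̂, too large inflates its bias; topic 5 treats this trade-off systematically.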

Pre-requisite Mathematics:

Basic knowledge of Probability, Statistics and Analysis is necessary. It is useful (but not necessary) to have knowledge of measure-theoretic probability and linear/functional analysis, and the Michaelmas Part III course on Statistical Theory is recommended.

Literature:

Dudley, R. M. (2002). Real analysis and probability, CUP
Tsybakov, A.B. (2008). Introduction to nonparametric estimation, Springer.
van der Vaart, A.W. (1998), Asymptotic statistics, CUP.
Wasserman, L. (2006), All of nonparametric statistics, Springer.

Other Statistical Laboratory Part III Courses

Advanced Probability (M24)

I. Bailleul

 This course aims to cover some advanced topics at the heart of research in probability theory, with an emphasis on the tools needed for the analysis of stochastic processes like Brownian motion.

It will be assumed that students have some familiarity with the measure theoretic formulation of probability theory, at the level of the Part II(B) course Probability and Measure, or part A of D. Williams' book.

1.    `Static' theory of Stochastic Processes

  • Measure theoretic tools for construction of probabilities: construction of measures, product measures, Kolmogorov's criterion for continuity.
  •  Weak convergence in separable Banach spaces:

a) Finite dimensional theory (characteristic functions, Lévy's continuity theorem).

b) Infinite dimensional theory: Prohorov's theorem, couplings.

c) Applications: construction of Wiener measure, Donsker's invariance principle.

 2.    Dynamic theory of Stochastic Processes

  • Conditional expectation (application to sufficient statistics).
  • Dynamics and filtrations, stopping times
  • Martingales: a) Discrete time theory, b) Applications: Radon-Nikodym theorem, changes of measure, a glimpse at the concentration of measure phenomenon, c) Continuous time theory.


3.    Brownian motion and Lévy processes

  • Distributional and sample paths properties of Brownian motion.
  • Martingales and strong Markov property (application to Dirichlet problem).
  • Lévy processes: integral with respect to a random Poisson measure, Lévy-Khinchin structure theorem.


Appropriate Books:

L.C.G. Rogers and D. Williams, Diffusions, Markov processes and Martingales, Vol. 1 (2nd edition). Chapters I and II. Wiley 1994
K.L. Chung, Green, Brown and Probability & Brownian motion on the line. World Scientific 2002.
D. Williams, Probability with Martingales. C.U.P. 1991
O. Kallenberg, Foundations of Modern Probability. Chapters 1-3, 6, 7, 16. Springer 1997
(...) and many others, which I will mention during the course.

 

Stochastic Calculus and Applications (M24) 

N. Berestycki

The main goal of this course is the study of stochastic processes with a continuous time variable, that is, processes whose evolution involves randomness at every instant. The most basic such object is Brownian motion. After a brief review of some of the key properties of a Brownian sample path, we will construct a stochastic integral with respect to continuous martingales (even though these are typically almost surely nowhere differentiable), and discuss Itô's formula. This fundamental result of stochastic calculus explains, among other things, how a Brownian path is transformed under a C² map. We will then apply this powerful theory to the study of stochastic differential equations, and show how these may arise as approximations of Markov chains.

 Brownian motion. Hölder exponents, strong Markov property, law of the iterated logarithm.

Continuous Stochastic Calculus. Martingales and local martingales. Finite and quadratic variation. L2 theory and stochastic integration with respect to a continuous semimartingale. Itô's formula. Applications to Brownian motion: transience, recurrence, conformal invariance, Dirichlet problem, Girsanov's theorem.

 Stochastic Differential Equations. Existence and uniqueness. Relation to second order partial differential equations. Diffusions, martingale problem, Stroock-Varadhan theory on diffusion approximation (including Donsker's theorem).

Pre-requisite Mathematics:

Those attending this course will normally also attend the Part III course Advanced Probability, which covers all the prerequisite material. A prior acquaintance with discrete martingale theory is highly desirable, as given for instance, in Durrett [2], chapters 4 and 5. Knowledge of the definition and of some basic properties of Brownian motion will be helpful.

Level:

Additional.

Literature:

L.C. Rogers and D. Williams. Diffusions, Markov processes and martingales. Vol.1 and 2. Cambridge University Press, second edition, 2000.
R. Durrett, Probability: Theory and Examples. Duxbury Press, 3rd edition.
D. Revuz and M. Yor, Continuous Martingales and Brownian Motion. Grundlehren der mathematischen Wissenschaften, Vol. 293, Springer, 3rd edition, 1999.

 

Quantum Information Theory (M24)

Dr N.Datta

Quantum Information Theory (QIT) is an exciting, young field which lies at the intersection of Mathematics, Physics and Computer Science. It was born out of Classical Information Theory, which is the mathematical theory of acquisition, storage, transmission and processing of information. QIT is the study of how these tasks can be accomplished, using quantum-mechanical systems. The underlying quantum mechanics leads to some distinctively new features which have no classical analogues. These new features can be exploited, not only to improve the performance of certain information-processing tasks, but also to accomplish tasks which are impossible or intractable in the classical realm.

This is an introductory course on QIT, which should serve to pave the way for more advanced topics in this field. The course will start with a short introduction to some of the basic concepts and tools of Classical Information Theory, which will prove useful in the study of QIT. Topics in this part of the course will include a brief discussion of data compression, transmission of data through noisy channels, Shannon's theorems, entropy and channel capacity.
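Two of the classical quantities mentioned above are easy to compute directly. The sketch below (illustrative only, not course material) evaluates the binary entropy function H(p) and the capacity C = 1 - H(p) of a binary symmetric channel with crossover probability p.

```python
import numpy as np

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0  # convention: 0 log 0 = 0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def bsc_capacity(p: float) -> float:
    """Capacity of the binary symmetric channel with flip probability p."""
    return 1.0 - binary_entropy(p)

print(binary_entropy(0.5))  # a fair coin carries one full bit
print(bsc_capacity(0.0))    # a noiseless binary channel has capacity 1
print(bsc_capacity(0.11))   # roughly half a bit per channel use
```

A channel with p = 1/2 has capacity zero: its output is independent of its input, so no information gets through.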

The quantum part of the course will commence with a study of open systems and a discussion of how they necessitate a generalization of the basic postulates of quantum mechanics. Topics will include quantum states, quantum operations, generalized measurements, POVMs and the Kraus Representation Theorem. Entanglement and some applications elucidating its usefulness as a resource in QIT will be discussed. The concepts of decoherence and error correction will be introduced and various examples of quantum error-correcting codes will be discussed. This will be followed by a study of the von Neumann entropy, its properties and its interpretation as the data compression limit of a quantum information source. Schumacher's theorem will be discussed in detail. The definitions of ensemble average fidelity and entanglement fidelity will be introduced in this context. Various examples of quantum channels will be given and the different capacities of a quantum channel will be discussed. The Holevo bound on the accessible information and the Holevo-Schumacher-Westmoreland (HSW) Theorem will also be covered.
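The von Neumann entropy mentioned above, S(rho) = -Tr(rho log2 rho), reduces to the Shannon entropy of the eigenvalues of the density matrix rho. A minimal sketch (the helper below is illustrative, not from the course notes):

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log2 rho), via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)   # rho is Hermitian
    evals = evals[evals > 1e-12]      # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])   # a pure state: entropy 0
mixed = np.eye(2) / 2           # maximally mixed qubit: entropy 1 bit

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0
```

The maximally mixed qubit is also the reduced state of either half of a Bell pair, which is one way entanglement shows up quantitatively.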

Desirable Previous Knowledge:

Knowledge of basic quantum mechanics will be assumed. However, an additional lecture can be arranged for students who do not have the necessary background in quantum mechanics. (This has been done in previous years.) Elementary knowledge of Probability Theory, Vector Spaces and Group Theory will be useful.

Introductory Reading:

The following book and lecture notes provide some interesting and relevant introductory reading material.

M.A.Nielsen and I.L.Chuang, Quantum Computation and Quantum Information; Cambridge University Press, 2000.
J. Preskill, Lecture Notes on Quantum Information Theory, Chapter 5. http://www.theory.caltech.edu/~preskill/ph229/lecture

Reading to complement course material:

Printed lecture notes will be distributed at the end of each lecture. These will, however, not necessarily contain all the important details of the arguments and calculations covered in the lectures. Further references will be given in the course of the lectures.


Stochastic Networks (L16)

N. Walton

As communication networks have grown and the demand for their resources has increased, it has become necessary to design simple decentralised mechanisms to control such networks. With this in mind, we study congestion phenomena in various queueing networks. This study leads us to consider a number of optimisation problems associated with such networks.

In the first part of the course we consider various classical models of queues and queueing networks and we prove reversibility, insensitivity and product form results.

In the second part of this course we study loss networks, a model of a telephone network. We study the precision of the Erlang Fixed Point Approximation by considering the asymptotic behaviour of loss networks, and we also begin to draw analogies with electrical networks.

In the final part of the course we study Internet congestion control. We discuss the Transmission Control Protocol, utility optimisation and the stability of models of Internet congestion control. Finally, drawing on the first and second parts of the course, we discuss the asymptotic behaviour of multi-class queueing networks.
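The building block of the Erlang Fixed Point Approximation mentioned above is the Erlang B formula: the blocking probability of a single link with C circuits offered Poisson traffic of intensity a erlangs. A small sketch (illustrative, not course material), using the standard numerically stable recursion E(0) = 1, E(c) = a E(c-1) / (c + a E(c-1)):

```python
def erlang_b(a: float, C: int) -> float:
    """Erlang B blocking probability for offered load a on C circuits.

    Uses the stable recursion rather than the factorial formula,
    which would overflow for large C.
    """
    E = 1.0
    for c in range(1, C + 1):
        E = a * E / (c + a * E)
    return E

print(erlang_b(10.0, 10))  # heavily loaded link: substantial blocking
print(erlang_b(10.0, 20))  # generously provisioned link: rare blocking
```

In the fixed point approximation, each link's blocking probability is computed by this formula with the offered load thinned by the blocking on the other links of each route, and one iterates to a fixed point.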
 

Pre-requisite Mathematics:

Mathematics that will be assumed to be known before the start of the course: Markov Chains and Optimization at about the level of the Cambridge Part IB courses. Acquaintance with the ideas in the Part II course Applied Probability would be useful, but is not assumed.

Literature:

B. Hajek, Analysis of Communication Networks. http://www.ifp.uiuc.edu/~hajek/
P. Robert, Stochastic Networks and Queues. Springer-Verlag, 2003. Chapter 4.
H. Chen and D. D. Yao, Fundamentals of Queueing Networks. Springer-Verlag, 2001.
S. Asmussen, Applied Probability and Queues - second edition. Springer-Verlag, 2003.
R. Srikant, The Mathematics of Internet Congestion Control. Birkhauser, 2004.

Schramm-Loewner Evolutions (L16) 

J.R. Norris

Stochastic Loewner Evolution (SLE) was discovered by Oded Schramm in 1999. It provides a family of continuum models, depending on a parameter κ, various instances of which are believed, and in some cases known, to arise as limits of certain planar lattice-based models in statistical physics.

The course will focus on the continuum models alone. The basic properties of SLE will be explored for a number of key choices of κ.  SLE is a generalization of Loewner's (non-stochastic) evolution of conformal maps, but now driven by a Brownian motion of diffusivity κ. So the fundamentals of conformal mapping will be needed, though most of this will be developed as required.  A basic familiarity with Brownian motion and Itô calculus will be assumed.
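To give a concrete feel for the definition, the sketch below (a standard discretization, offered as an illustration rather than course material) approximates the tip of an SLE(κ) trace by composing the inverse slit maps of the Loewner equation, with the driving function a Brownian motion of diffusivity κ.

```python
import numpy as np

def sle_trace(kappa: float, n_steps: int = 200, T: float = 1.0, seed: int = 1):
    """Approximate the trace tip gamma(t_n) = g_{t_n}^{-1}(U_{t_n}).

    The driving function is piecewise constant on a grid; over each
    step the inverse incremental map is
        h(w) = xi + i * sqrt(4*dt - (w - xi)^2),
    with the branch of sqrt taking values in the upper half-plane.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    # driving function U_t = sqrt(kappa) * B_t, sampled on the grid
    U = np.concatenate([[0.0],
                        np.cumsum(rng.normal(0.0, np.sqrt(kappa * dt), n_steps))])
    trace = []
    for n in range(1, n_steps + 1):
        w = complex(U[n], 0.0)
        # apply the inverse incremental maps in reverse order
        for k in range(n, 0, -1):
            w = U[k - 1] + 1j * np.sqrt(4 * dt - (w - U[k - 1]) ** 2)
        trace.append(w)
    return np.array(trace)

path = sle_trace(kappa=2.0)
print(path[-1])  # tip of the trace, a point in the closed upper half-plane
```

The cost is quadratic in the number of steps, since recovering each trace point requires composing all the earlier maps; for a quick picture this is perfectly adequate.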

The course material will be based on a selection from the accounts of Lawler and Werner listed below. 

Literature:

W. Werner, Random planar curves and Schramm-Loewner evolutions, math.PR/0303354, 2003.
G. F. Lawler,  Conformally Invariant Processes in the  Plane, AMS, 2005.

Concentration of Measure  (M16)

Non-Examinable (Graduate Level)

N. Berestycki and R. Nickl

The concentration of measure phenomenon was first put forward in the 1970s and 1980s in geometric functional analysis by Milman, Gromov and Schechtman, and has been the subject of fascinating recent developments in probability theory, mostly due to M. Talagrand (1995, 1996ab) and M. Ledoux (2001). Very roughly speaking, this phenomenon can be stated in the following simple way: "a random variable that depends in a smooth way on many independent random variables (but not too much on any of them) is essentially constant." Of course, the precise meaning of such a statement needs to be clarified, but often it will mean that the random variable X concentrates around a constant c in such a way that the probability of the event |X-c|>t is exponentially small in t. Bounds of this type are usually referred to as concentration inequalities.
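The simplest instance of such an exponential bound is Hoeffding's inequality. The simulation below (an illustration, not course material) checks that the empirical mean of n independent coin flips, a smooth function of many independent variables, has a tail sitting below the bound P(|X - 1/2| > t) <= 2 exp(-2 n t^2).

```python
import numpy as np

# Simulate many realizations of the mean of n fair coin flips and
# compare the empirical tail with the Hoeffding bound.
rng = np.random.default_rng(42)
n, reps, t = 1000, 20_000, 0.05
means = rng.binomial(n, 0.5, size=reps) / n

empirical_tail = np.mean(np.abs(means - 0.5) > t)
hoeffding_bound = 2 * np.exp(-2 * n * t ** 2)

print(empirical_tail, hoeffding_bound)  # empirical tail is far smaller
```

The bound is not sharp here (the Gaussian tail of the mean decays faster still), but it already makes the deviation event exponentially unlikely in n, which is the signature of concentration.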

The aim of this course is to investigate the basic mathematical principles behind the concentration of measure phenomenon (e.g., Talagrand's celebrated inequality), and to show its relevance to several different areas of mathematics through the following examples:

  • classical deviation inequalities of probability theory   
  • classical inequalities for Gaussian measures in abstract spaces (e.g., Borell (1975))
  • infinite product measures, leading to sharp concentration inequalities for empirical processes and sums of i.i.d. Banach-space valued random variables, including some recent applications to statistics
  • applications to probability and mathematical physics, such as shape fluctuations in first- and last-passage percolation on graphs, concentration in the Sherrington-Kirkpatrick spin glass model, and random matrix theory.

 

Desirable Previous Knowledge:

We shall assume some basic notions of probability and measure theory. This being a non-examinable course, we plan to make this as informal and accessible as possible, in the style of a reading seminar.

Introductory Reading:

I. Benjamini, G. Kalai, and O. Schramm (2003). First passage percolation has sublinear distance variance. Ann. Probab. 31, 1970--1978.
C. Borell (1975). The Brunn-Minkowski inequality in Gauss space. Invent. Math. 30, 207--21
M. Ledoux (2001). The concentration of measure phenomenon. AMS Monographs, Providence.
M. Talagrand (1995). Concentration of measure and isoperimetric inequalities in product spaces. Inst. Hautes Études Sci. Publ. Math. 81, 73--205.
M. Talagrand (1996). A new look at independence. Ann. Probab. 24, 1--34.
M. Talagrand (1996). New concentration inequalities in product spaces. Invent. Math. 125, 505--563.