You may contact a Proposer directly about a specific project or contact the Postgraduate Admissions Secretary with general enquiries.
Title  Network performance subject to agent-based dynamical processes 

Group(s)  Industrial and Applied Mathematics, Statistics and Probability 
Proposer(s)  Dr Keith Hopcraft, Dr Simon Preston 
Description  Networks – systems of interconnected elements – form structures through which information or matter is conveyed from one part of an entity to another, and between autonomous units. The form, function and evolution of such systems are affected by interactions between their constituent parts, and by perturbations from an external environment. The challenge in all application areas is to model these interactions effectively, since they occur on different spatial and time scales, and to discover how i) the micro-dynamics of the components influence the evolutionary structure of the network, and ii) the network is affected by the external environment(s) in which it is embedded. Activity in non-evolving networks is well characterized as having diffusive properties if the network is isolated from the outside world, or ballistic qualities if influenced by the external environment. However, the robustness of these characteristics in evolving networks is not as well understood. The projects will investigate the circumstances in which memory can affect the structural evolution of a network and its consequent ability to function. Agents in a network will be assigned an adaptive profile of goal- and cost-related criteria that govern their response to ambitions and stimuli. An agent then has a memory of its past behaviour and can thereby form a strategy for future actions and reactions. This presents an ability to generate 'lumpiness' or granularity in a network's spatial structure and 'burstiness' in its time evolution, and these will affect its ability to react effectively to external shocks to the system. The ability of externally introduced activists to change a network's structure and function, or of agonists to test its resilience to attack, will be investigated using the models. The project will use data on real agents' behaviour. 
Relevant Publications 

Other information 
Title  Fluctuation Driven Network Evolution 

Group(s)  Industrial and Applied Mathematics, Statistics and Probability 
Proposer(s)  Dr Keith Hopcraft, Dr Simon Preston 
Description  A network’s growth and reorganisation affects its functioning and is contingent upon the relative timescales of the dynamics that occur on it. Dynamical timescales that are short compared with those characterizing the network’s evolution enable collectives to form since each element remains connected with others in spite of external or internally generated ‘shocks’ or fluctuations. This can lead to manifestations such as synchronicity or epidemics. When the network topology and dynamics evolve on similar timescales, a ‘plastic’ state can emerge where form and function become entwined. The interplay between fluctuation, form and function will be investigated with an aim to disentangle the effects of structural change from other dynamics and identify robust characteristics. 
Relevant Publications 

Other information 
Title  Numerical methods for stochastic partial differential equations 

Group(s)  Scientific Computation, Statistics and Probability 
Proposer(s)  Prof Michael Tretyakov 
Description  Numerics for stochastic partial differential equations (SPDEs) is one of the central topics in modern numerical analysis. It is motivated both by applications and by theoretical questions. SPDEs originated in filtering theory, and they are now also widely used in modelling spatially distributed systems from physics, chemistry, biology and finance that operate in the presence of fluctuations. The primary objectives of this project include construction, analysis and testing of new numerical methods for SPDEs.
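To give a flavour of the simplest such scheme, the sketch below applies an explicit finite-difference Euler-Maruyama discretisation to a one-dimensional stochastic heat equation with additive noise. It is an illustrative toy only (the space-time white noise is discretised naively, and the function name and parameter choices are the author's own), not one of the methods the project would develop.

```python
import math
import random

# Finite-difference Euler-Maruyama scheme for the stochastic heat
# equation du = u_xx dt + sigma dW(t, x) on [0, 1] with zero boundary
# conditions; space-time white noise is discretised naively as
# independent Gaussians scaled by sqrt(dt/dx).
def simulate_she(n_space=20, n_steps=200, T=0.1, sigma=0.5, seed=1):
    rng = random.Random(seed)
    dx = 1.0 / n_space
    dt = T / n_steps        # dt/dx^2 = 0.2, so the explicit scheme is stable
    # initial condition u(0, x) = sin(pi x)
    u = [math.sin(math.pi * i * dx) for i in range(n_space + 1)]
    for _ in range(n_steps):
        new = u[:]
        for i in range(1, n_space):        # interior points only
            lap = (u[i - 1] - 2 * u[i] + u[i + 1]) / dx ** 2
            noise = sigma * math.sqrt(dt / dx) * rng.gauss(0.0, 1.0)
            new[i] = u[i] + dt * lap + noise
        u = new
    return u   # approximate solution values on the grid at time T
```

Proving convergence rates for schemes like this, and constructing better ones, is exactly the kind of question the project addresses.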

Relevant Publications 

Other information 
Title  Property Prediction of Composite Components Prior to Production 

Group(s)  Scientific Computation, Statistics and Probability 
Proposer(s)  Prof Michael Tretyakov, Prof Frank Ball 
Description  Supervisors: Dr Frank Gommer^{1*}, Prof Michael Tretyakov^{2*}, Prof Frank Ball^{2}, Dr Louise P. Brown^{1}. University of Nottingham, University Park, Nottingham NG7 2RD, UK. ^{1} Polymer Composites Group, Faculty of Engineering; ^{2} School of Mathematical Sciences. ^{*} Contact: F.Gommer@nottingham.ac.uk or Michael.Tretyakov@nottingham.ac.uk
This is an exciting opportunity for a postgraduate student to join a vibrant interdisciplinary team and to work in the modern area of Uncertainty Quantification. Fibre-reinforced composites are increasingly used in the transport industry to decrease the structural weight of a vehicle and thus increase its fuel efficiency. The importance of the UK composite sector is reflected in the current growth rate of 17% per annum for high-performance composite components and the expected gross value of £2 billion in 2015 [1]. However, due to the large number of production steps and the necessary saturation of the fibre preform with a resin matrix, a significant amount of waste is produced, which may range between 2% and 20% of the production volume [2]. A major cause of rejecting parts is variability in the reinforcement, such as varying yarn spacing and yarn path waviness, which can significantly influence subsequent properties. For example, these variabilities can affect resin flow and may cause dry spots or reduce mechanical properties. This PhD project will enable the successful candidate to work at the forefront of material science, combining engineering standards, applied mathematics and statistics, with the potential to change how composite parts are manufactured in the future. The proposed doctoral study aims to demonstrate that properties of lightweight fibre-reinforced plastics can be predicted in real time before a part is actually manufactured. Data gained from images taken of each layer of a composite during the stacking process are used to determine local geometries and variabilities, within and between individual layers [3]. For example, based on the measured textile geometries it will be possible to predict the resin flow within a preform during a liquid composite moulding (LCM) process, taking individual variabilities into account before injection. 
These specific flow predictions will allow adjustments of the process parameters during the impregnation process to ensure full saturation of the entire preform with a liquid resin matrix. This will be especially useful when a number of inlet and outlet ports are present, as in the case of complex or large parts. The formation of dry spots will be avoided, which will reduce immediate wastage. For these predictions, faster solutions than currently available are necessary. To find such solutions, appropriate advanced statistical techniques and stochastic modelling for quantifying uncertainties in composites production will be developed in the course of the PhD project. In addition, the developed techniques will allow virtual testing of a finished component with its specific inherent reinforcement variability. This will make it feasible to customise predictions for every fabricated component. In combination with continuous health monitoring of a structure, it may be possible to estimate the influence of loading conditions, load cycles and damage evaluation. This will also make it possible to predict an individual life expectancy of a part in service. These data can then be used to determine customised inspection intervals for each component. We require an enthusiastic graduate with a first-class degree in Mathematics or Engineering, preferably at MMath/MSc level, with good programming skills and willing to work as part of an interdisciplinary team. A candidate with a solid background in statistics will have an advantage. References: [1] CompositesUK. www.compositesuk.co.uk/Information/FAQs/UKMarketValues.aspx. [2] A. C. Long, Design and Manufacture of Textile Composites: Woodhead Publ, 2005. [3] F. Gommer, L. P. Brown, and R. Brooks, "Quantification of mesoscale variability and geometrical reconstruction of a textile", submitted to Compos Part A-Appl S, 2015. 
Relevant Publications 

Other information  This project is supported by the EPSRC DTG Centre in Complex Systems and Processes; see eligibility criteria and how to apply at http://www.nottingham.ac.uk/complexsystems/index.aspx 
Title  Uncertainty quantification for evolutionary PDEs 

Group(s)  Scientific Computation, Statistics and Probability 
Proposer(s)  Dr Kris van der Zee 
Description  The field of uncertainty quantification (UQ), as applied to partial differential equations (PDEs), provides a means for understanding the effect of uncertainty in the parameters on output quantities of interest. This is, for example, extremely important in evolutionary models that describe the degradation of nuclear waste containment systems: a small uncertainty in the parameters may have a large influence on the predicted degradation, and thus may result in unexpectedly damaged systems that leak nuclear waste. Fortunately, observational data can help reduce uncertainty in models. The models can then become predictive and allow for an assessment of their true future behaviour. Challenges for students: depending on the interest of the student, several issues in this area (or others) can be addressed. 
Relevant Publications 

Other information 

Title  Index policies for stochastic optimal control 

Group(s)  Statistics and Probability 
Proposer(s)  Dr David Hodge 
Description  Since the discovery of Gittins indices in the 1970s for solving multi-armed bandit processes, the pursuit of optimal policies for this very wide class of stochastic decision processes has been seen in a new light. Particular interest exists in the study of multi-armed bandits as problems of optimal allocation of resources (e.g. trucks, manpower, money) to be shared between competing projects. Another area of interest would be the theoretical analysis of computational methods (for example, approximate dynamic programming) which are coming to the fore with ever-advancing computer power.
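To give a flavour of index policies, the sketch below runs a Bernoulli multi-armed bandit under UCB1, a simple index rule used here as a tractable stand-in for the Gittins index (which requires solving an optimal-stopping problem per arm). The function name and parameters are illustrative, not part of the project.

```python
import math
import random

# Index policy for a Bernoulli multi-armed bandit: at each step, pull
# the arm with the largest index.  The index here is the UCB1 upper
# confidence bound (empirical mean plus an exploration bonus).
def ucb1_bandit(true_means, horizon, seed=0):
    rng = random.Random(seed)
    k = len(true_means)
    pulls = [0] * k          # times each arm has been pulled
    rewards = [0.0] * k      # cumulative reward from each arm
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= k:           # initialisation: pull every arm once
            arm = t - 1
        else:                # choose the arm maximising the UCB1 index
            arm = max(range(k),
                      key=lambda a: rewards[a] / pulls[a]
                                    + math.sqrt(2 * math.log(t) / pulls[a]))
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        pulls[arm] += 1
        rewards[arm] += reward
        total += reward
    return total, pulls
```

Over a long horizon the policy concentrates its pulls on the arm with the highest success probability, which is the behaviour an optimal index policy must achieve.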

Relevant Publications 

Other information  Keywords: multi-armed bandits, dynamic programming, Markov decision processes 
Title  Uncertainty quantification in palaeoclimate reconstruction 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Richard Wilkinson 
Description  The climate evolves slowly. Even if we stopped emitting greenhouse gases today, we wouldn't see the full effect of the damage already done for at least another 100 years. The instrumental record of past climate and weather goes back at most 300 years, and before then we have to rely on indirect (and inaccurate) data sources. Because of the slow evolution of the climate, this is like having only a very small number of accurate observations, and consequently we have very little information with which to assess the accuracy of climate simulators, the key tool used for predicting how the climate will behave in the future. An important source of information on what the climate was like in the past comes from proxy data sources such as pollen taken from lake deposits, or measurements of the air content (specifically the ratio of oxygen-18 to oxygen-16) stored in glaciers hundreds of thousands of years ago. Reconstructing past climate from these data sources is a difficult task, as the measurements are noisy, correlated, and don't have accurate dates attached to them, yet the task is important if we are to understand how the climate evolves and hence be able to predict the future. In this project, we will look at statistical methods for accurate palaeoclimate reconstruction, and aim to provide believable uncertainty quantifications that accurately represent our degree of confidence/ignorance about what the climate was like in the past. The complex nature of the problem means that state-of-the-art Monte Carlo methods are likely to be needed, as well as potentially developing new methods to do the inference.

Relevant Publications 

Other information 
Title  Semi-Parametric Time Series Modelling Using Latent Branching Trees 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Theodore Kypraios 
Description  A class of semi-parametric discrete-time series models of infinite order, in which we are able to specify the marginal distribution of the observations in advance and then build their dependence structure around it, can be constructed via an artificial process termed a Latent Branching Tree (LBT). Such models can be very useful when data are collected over a long period and it is relatively easy to specify their marginal distribution but much harder to infer their correlation structure. The project is concerned with the development of such models in continuous time, as well as with efficient methods for Bayesian inference for both the latent structure and the model parameters. Moreover, the application of such models to real data would also be of great interest. 
Relevant Publications 

Other information 
Title  Bayesian methods for analysing computer experiments 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Richard Wilkinson 
Description  Computer experiments (i.e. simulators) are used in nearly all areas of science and engineering. The statistical analysis of computer experiments is an exciting and rapidly growing area of statistics which looks at the question of how best to learn from computer models. Examples of the types of challenges faced, and possible areas for a PhD, are given below. (i) Computer models are often process models where the likelihood function is intractable (as is common in genetics, ecology, epidemiology etc.) and so to do inference we have to use likelihood-free techniques. Approximate Bayesian computation (ABC) methods are a new class of Monte Carlo methods that are becoming increasingly popular with practitioners, but which are largely unstudied by statisticians, and many open questions remain about their performance. Application areas which use ABC methods are mostly biological (genetics and ecology in particular), but their use is growing across a wide range of fields. (ii) Expensive simulators which take a considerable amount of time to run (e.g. climate models) present the challenge of how to learn about the model (its parameters, validity, or its predictions etc.) when we only have a limited number of model evaluations available for use. A statistical tool developed in the last decade is the idea of building statistical emulators of the simulator. Emulators are cheap statistical models of the simulator (models of the model) which can be used in place of the simulator to make inferences, and are now regularly used in complex modelling situations such as in climate science. However, there are still many questions to be answered about how best to build and then use emulators. Possible application areas for these methods include climate science and engineering (such as groundwater flow problems for radioactive waste), as well as many others. 
(iii) "All models are wrong, but some are useful"  In order to move from making statements about a model to making statements about the system the model was designed to represent, we must carefully quantify the model error  interest lies in what will actually happen, rather than in what your model says will happen! Failure to account for model errors can mean that different models of the same system can give different predictions (see for example the controversy regarding the differing predictions of the large climate models  none of which account for model error!). Assessing and incorporating model error is a new and rapidly growing idea in statistics, and is done by a combination of subjective judgement and statistical learning from data. The range of potential application areas is very wide, but in particular meterology and mechanical engineering are areas where these methods are needed. 
Relevant Publications 

Other information  See http://www.maths.nottingham.ac.uk/personal/pmzrdw/ for more information. 
Title  Ion channel modelling 

Group(s)  Statistics and Probability 
Proposer(s)  Prof Frank Ball 
Description  The 1991 Nobel Prize for Medicine was awarded to Sakmann and Neher for developing a method of recording the current flowing across a single ion channel. Ion channels are protein molecules that span cell membranes. In certain conformations they form pores allowing current to pass across the membrane. They are a fundamental part of the nervous system. Mathematically, a single channel is usually modelled by a continuous-time Markov chain. The complete process is unobservable; rather, the state space is partitioned into two classes, corresponding to the receptor channel being open or closed, and it is only possible to observe which class of state the process is in. The aim of single-channel analysis is to draw inferences about the underlying process from the observed aggregated process. Further complications include (a) the failure to detect brief events and (b) the presence of (possibly interacting) multiple channels. Possible projects include the development and implementation of Markov chain Monte Carlo methods for inference from ion channel data, Laplace-transform-based inference for ion channel data, and the development and analysis of models for interacting multiple channels. 
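A minimal version of this modelling set-up is a two-state continuous-time Markov chain in which only the open/closed class is recorded. The sketch below simulates such a channel and returns the empirical proportion of time spent open; the rates and function name are illustrative choices, and real channels typically have several states within each class.

```python
import random

# A single ion channel as a two-state continuous-time Markov chain:
# state 0 = closed, state 1 = open, with opening rate q01 and closing
# rate q10.  Sojourn times in each state are exponential, and (as in
# single-channel recordings) only the open/closed class is observable.
def simulate_channel(q01, q10, t_end, seed=0):
    rng = random.Random(seed)
    t, state, open_time = 0.0, 0, 0.0
    while t < t_end:
        rate = q01 if state == 0 else q10
        dwell = min(rng.expovariate(rate), t_end - t)
        if state == 1:
            open_time += dwell
        t += dwell
        state = 1 - state           # alternate closed <-> open
    return open_time / t_end        # empirical proportion of time open
```

Over a long recording the proportion of open time converges to the equilibrium probability q01/(q01 + q10), one of the simplest quantities an inference method must recover from aggregated data.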
Relevant Publications 

Other information 
Title  Optimal control in yield management 

Group(s)  Statistics and Probability 
Proposer(s)  Dr David Hodge 
Description  Serious mathematics studying the maximization of revenue from the control of price and availability of products has been a lucrative area in the airline industry since the 1960s. It is particularly visible nowadays in the seemingly incomprehensible price fluctuations of airline tickets. Many multinational companies selling perishable assets to mass markets now have large in-house Operations Research departments for this very purpose. This project would study possible innovations and existing practices in areas such as customer acceptance control, dynamic pricing control and choice-based revenue management. Applications to social welfare maximization, away from pure monetary objectives, and the resulting game-theoretic problems are also topical in home energy consumption and mass online interactions. 
Relevant Publications 

Other information 
Title  Stochastic Processes on Manifolds 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Huiling Le 
Description  As well as having a wide range of direct applications to physics, economics, etc., diffusion theory is a valuable tool for the study of the existence and characterisation of solutions of partial differential equations, and for some major theoretical results in differential geometry, such as the 'Index Theorem', previously proved by totally different means. The problems which arise in all these subjects require the study of processes not only on flat spaces but also on curved spaces or manifolds. This project will investigate the interaction between the geometric structure of manifolds and the behaviour of stochastic processes, such as diffusions and martingales, upon them. 
Relevant Publications 

Other information 
Title  Statistical Theory of Shape 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Huiling Le 
Description  Devising a natural measure between any two fossil specimens of a particular genus, assessing the significance of observed 'collinearities' of standing stones, and matching the observed systems of cosmic 'voids' with the cells of given tessellations of 3-space are all questions about shape. It is not appropriate, however, to think of 'shapes' as points on a line or even in a Euclidean space. They lie in their own particular spaces, most of which have not arisen before in any context. PhD projects in this area will study these spaces and related probabilistic issues and develop for them a revised version of multidimensional statistics which takes into account their peculiar properties. This is a multidisciplinary area of research which has only recently become very active. Nottingham is one of only a handful of departments at which it is active. 
Relevant Publications 

Other information 
Title  Automated tracking and behaviour analysis 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Christopher Brignell 
Description  In collaboration with the Schools of Computer and Veterinary Science we are developing an automated visual surveillance system capable of identifying, tracking and recording the exact movements of multiple animals or people. The resulting data can be analysed and used as an early warning system in order to detect illness or abnormal behaviour. The three-dimensional targets are, however, viewed in a two-dimensional image, and statistical shape analysis techniques need to be adapted to improve the identification of an individual's location and orientation and to develop automatic tests for detecting specific events or individuals not following normal behaviour patterns. 
Relevant Publications 

Other information 
Title  Asymptotic techniques in Statistics 

Group(s)  Statistics and Probability 
Proposer(s)  Prof Andrew Wood 
Description  Asymptotic approximations are very widely used in statistical practice. For example, the large-sample likelihood ratio test is an asymptotic approximation based on the central limit theorem. In general, asymptotic techniques play two main roles in statistics: (i) to improve understanding of the practical performance of statistical procedures, and to provide insight into why some procedures perform better than others; and (ii) to motivate new and improved approximations. Some possible topics for a PhD are

Relevant Publications 

Other information 
Title  Statistical Inference for Ordinary Differential Equations 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Theodore Kypraios, Dr Simon Preston, Prof Andrew Wood 
Description  Ordinary differential equation (ODE) models are widely used in a variety of scientific fields, such as physics, chemistry and biology. For ODE models, an important question is how best to estimate the model parameters given experimental data. The common (nonlinear least squares) approach is to search parameter space for parameter values that minimise the sum of squared differences between the model solution and the experimental data. However, this requires repeated numerical solution of the ODEs and is thus computationally expensive; furthermore, the optimisation's objective function is often highly multimodal, making it difficult to find the global optimum. In this project we will develop computationally less demanding likelihood-based methods, specifically by using spline regression techniques that will reduce (or eliminate entirely) the need to solve the ODEs numerically. 
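The nonlinear least-squares approach described above can be sketched in a few lines: solve the ODE numerically for each candidate parameter value and keep the value minimising the sum of squared residuals. The exponential-decay model, grid search and function names below are illustrative simplifications chosen by the editor; they also make plain why repeated numerical solution becomes expensive for larger models.

```python
# Nonlinear least squares for an ODE model, in miniature: fit the decay
# rate k in dy/dt = -k*y by solving the ODE numerically (classical RK4)
# for each candidate k and minimising the sum of squared residuals.
def solve_ode(k, y0, times, dt=0.01):
    ys, y, t = [], y0, 0.0
    for target in sorted(times):
        while t < target - 1e-12:
            h = min(dt, target - t)
            # one classical RK4 step for dy/dt = -k*y
            k1 = -k * y
            k2 = -k * (y + 0.5 * h * k1)
            k3 = -k * (y + 0.5 * h * k2)
            k4 = -k * (y + h * k3)
            y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
            t += h
        ys.append(y)
    return ys

def fit_k(times, data, y0, grid):
    def sse(k):
        model = solve_ode(k, y0, times)
        return sum((m - d) ** 2 for m, d in zip(model, data))
    return min(grid, key=sse)   # one full ODE solve per candidate value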
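The nonlinear least-squares approach described above can be sketched in a few lines: solve the ODE numerically for each candidate parameter value and keep the value minimising the sum of squared residuals. The exponential-decay model, grid search and function names below are illustrative simplifications chosen by the editor; they also make plain why repeated numerical solution becomes expensive for larger models.

```python
# Nonlinear least squares for an ODE model, in miniature: fit the decay
# rate k in dy/dt = -k*y by solving the ODE numerically (classical RK4)
# for each candidate k and minimising the sum of squared residuals.
def solve_ode(k, y0, times, dt=0.01):
    ys, y, t = [], y0, 0.0
    for target in sorted(times):
        while t < target - 1e-12:
            h = min(dt, target - t)
            # one classical RK4 step for dy/dt = -k*y
            k1 = -k * y
            k2 = -k * (y + 0.5 * h * k1)
            k3 = -k * (y + 0.5 * h * k2)
            k4 = -k * (y + h * k3)
            y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
            t += h
        ys.append(y)
    return ys

def fit_k(times, data, y0, grid):
    def sse(k):
        model = solve_ode(k, y0, times)
        return sum((m - d) ** 2 for m, d in zip(model, data))
    return min(grid, key=sse)   # one full ODE solve per candidate value
```

Even this toy needs one full numerical solve per candidate; spline-based methods of the kind the project proposes aim to bypass exactly that cost.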
Relevant Publications 

Other information 
Title  Bayesian approaches in palaeontology 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Richard Wilkinson 
Description  Palaeontology provides a challenging source of problems for statisticians, as fossil data are usually sparse and noisy. Methods from statistics can be used to help answer scientific questions such as when did species originate or become extinct, and how diverse was a particular taxonomic group. Some of these questions are of great scientific interest  for example  did primates coexist with dinosaurs in the Cretaceous? There is no hard evidence either way, but statistical methods can be used to assess the probability that they did coexist. This project will involve building a stochastic forwards model of an evolutionary scenario, and then fitting this model to fossil data. Quantifying different sources of uncertainty is likely to play a key part in the analysis. 
Relevant Publications 

Other information  See http://www.maths.nottingham.ac.uk/personal/pmzrdw/ for more information. 
Title  Statistical shape analysis with applications in structural bioinformatics 

Group(s)  Statistics and Probability 
Proposer(s)  Dr Christopher Fallaize 
Description  In statistical shape analysis, objects are often represented by a configuration of landmarks, and in order to compare the shapes of objects, their configurations must first be aligned as closely as possible. When the landmarks are unlabelled (that is, the correspondence between landmarks on different objects is unknown) the problem becomes much more challenging, since both the correspondence and alignment parameters need to be inferred simultaneously. 
Relevant Publications 

Other information 
Title  Highdimensional molecular shape analysis 

Group(s)  Statistics and Probability 
Proposer(s)  Prof Ian Dryden 
Description  In many application areas it is of interest to compare objects 
Relevant Publications 

Other information 
Title  Statistical analysis of neuroimaging data 

Group(s)  Statistics and Probability, Mathematical Medicine and Biology 
Proposer(s)  Dr Christopher Brignell 
Description  The activity of neurons within the brain can be detected by functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG). These techniques record observations up to 1000 times a second on a 3D grid of points separated by 1-10 millimetres. The data are therefore high-dimensional and highly correlated in space and time. The challenge is to infer the location, direction and strength of significant underlying brain activity amongst confounding effects from movement and background noise levels. Further, we need to identify neural activity that is statistically significant across individuals, which is problematic because the number of subjects tested in neuroimaging studies is typically quite small and the inter-subject variability in anatomical and functional brain structures is quite large. 
Relevant Publications 

Other information 
Title  Identifying fibrosis in lung images 

Group(s)  Statistics and Probability, Mathematical Medicine and Biology 
Proposer(s)  Dr Christopher Brignell 
Description  Many forms of lung disease are characterised by excess fibrous tissue developing in the lungs. Fibrosis is currently diagnosed by human inspection of CT scans of the affected lung regions. This project will develop statistical techniques for objectively assessing the presence and extent of lung fibrosis, with the aim of identifying key factors which determine longterm prognosis. The project will involve developing statistical models of lung shape, to perform object recognition, and lung texture, to classify healthy and abnormal tissue. Clinical support and data for this project will be provided by the School of Community Health Sciences. 
Relevant Publications 

Other information 
Title  Modelling hospital superbugs 

Group(s)  Statistics and Probability, Mathematical Medicine and Biology 
Proposer(s)  Prof Philip O'Neill 
Description  The spread of socalled superbugs such as MRSA within healthcare settings provides one of the major challenges to patient welfare within the UK. However, many basic questions regarding the transmission and control of such pathogens remain unanswered. This project involves stochastic modelling and data analysis using highly detailed data sets from studies carried out in hospital, addressing issues such as the effectiveness of patient isolation, the impact of different antibiotics, and the way in which different strains interact with each other. 
Relevant Publications 

Other information 
Title  Modelling of Emerging Diseases 

Group(s)  Statistics and Probability, Mathematical Medicine and Biology 
Proposer(s)  Prof Frank Ball 
Description  When new infections emerge in populations (e.g. SARS; new strains of influenza), no vaccine is available and other control measures must be adopted. This project is concerned with addressing questions of interest in this context, e.g. What are the most effective control measures? How can they be assessed? The project involves the development and analysis of new classes of stochastic models, including intervention models, appropriate for the early stages of an emerging disease. 
Relevant Publications 

Other information 
Title  StructuredPopulation Epidemic Models 

Group(s)  Statistics and Probability, Mathematical Medicine and Biology 
Proposer(s)  Prof Frank Ball 
Description  The structure of the underlying population usually has a considerable impact on the spread of the disease in question. In recent years the Nottingham group has given particular attention to this issue by developing, analysing and using various models appropriate for certain kinds of diseases. For example, considerable progress has been made in the understanding of epidemics that are propagated among populations made up of households, in which individuals are typically more likely to pass on a disease to those in their household than to those elsewhere. Other examples of structured populations include those with spatial features (e.g. farm animals placed in pens; school children in classrooms; trees planted in certain configurations), and those with random social structure (e.g. using random graphs to describe an individual's contacts). Projects in this area are concerned with novel advances in the area, including developing and analysing appropriate new models, and methods for statistical inference (e.g. using pseudo-likelihood and Markov chain Monte Carlo methods). 
Relevant Publications 

Other information 
Title  Bayesian Inference for Complex Epidemic Models 

Group(s)  Statistics and Probability, Mathematical Medicine and Biology 
Proposer(s)  Prof Philip O'Neill 
Description  Data analysis for real-life epidemics offers many challenges; one of the key issues is that infectious disease data are usually only partially observed. For example, although numbers of cases of a disease may be available, the actual pattern of spread between individuals is rarely known. This project is concerned with the development and application of methods for dealing with these problems, and involves using Markov chain Monte Carlo (MCMC) techniques. 
Relevant Publications 

Other information 
Title  Bayesian model choice assessment for epidemic models 

Group(s)  Statistics and Probability, Mathematical Medicine and Biology 
Proposer(s)  Prof Philip O'Neill 
Description  During the last decade there has been significant progress in the area of parameter estimation for stochastic epidemic models. However, far less attention has been given to the issue of model adequacy and assessment, i.e. the question of how well a model fits the data. This project is concerned with the development of methods to assess the goodness-of-fit of epidemic models to data. 
Relevant Publications 

Other information 
Title  Epidemics on random networks 

Group(s)  Statistics and Probability, Mathematical Medicine and Biology 
Proposer(s)  Prof Frank Ball 
Description  There has been considerable interest recently in models for epidemics on networks describing social contacts. In these models one first constructs an undirected random graph, which gives the network of possible contacts, and then spreads a stochastic epidemic on that network. Topics of interest include: modelling clustering and degree correlation in the network and analysing their effect on disease dynamics; development and analysis of vaccination strategies, including contact tracing; and the effect of also allowing for casual contacts, i.e. between individuals unconnected in the network. Projects in this area will address some or all of these issues. 
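The two-stage construction described above can be sketched directly: first build an undirected random graph, then run a stochastic epidemic on it. The toy below uses an Erdos-Renyi graph and a discrete-generation SIR epidemic (a Reed-Frost-type simplification); the graph model, parameters and function name are illustrative choices by the editor, whereas the projects would study richer networks with clustering and degree correlation.

```python
import random

# Two-stage construction: build an undirected random graph (here
# Erdos-Renyi G(n, p_edge), for simplicity), then spread a
# discrete-generation SIR epidemic on it.  Each infective infects each
# susceptible neighbour independently with probability p_inf, then
# recovers (Reed-Frost style).
def sir_on_random_graph(n=200, p_edge=0.05, p_inf=0.3, seed=0):
    rng = random.Random(seed)
    nbrs = [[] for _ in range(n)]          # adjacency lists
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                nbrs[i].append(j)
                nbrs[j].append(i)
    status = ['S'] * n
    status[0] = 'I'                        # one initial infective
    infectives = [0]
    while infectives:
        new = []
        for i in infectives:
            for j in nbrs[i]:
                if status[j] == 'S' and rng.random() < p_inf:
                    status[j] = 'I'
                    new.append(j)
            status[i] = 'R'                # infectious for one generation
        infectives = new
    return status.count('R')               # final size of the epidemic
```

Comparing final-size distributions from simulations like this with analytical branching-process approximations is a standard first exercise with such models.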
Relevant Publications 

Other information 