You may contact a Proposer directly about a specific project or contact the Postgraduate Admissions Secretary with general enquiries.

Title Network performance subject to agent-based dynamical processes
Group(s) Industrial and Applied Mathematics, Statistics and Probability
Proposer(s) Dr Keith Hopcraft, Dr Simon Preston
Description

Networks – systems of interconnected elements – form structures through which information or matter is conveyed from one part of an entity to another, and between autonomous units. The form, function and evolution of such systems are affected by interactions between their constituent parts, and by perturbations from an external environment. The challenge in all application areas is to model these interactions effectively – they occur on different spatial- and time-scales – and to discover how

i)     the micro-dynamics of the components influence the evolutionary structure of the network, and

ii)    the network is affected by the external environment(s) in which it is embedded.

Activity in non-evolving networks is well characterized as having diffusive properties if the network is isolated from the outside world, or ballistic qualities if influenced by the external environment. However, the robustness of these characteristics in evolving networks is not as well understood. The projects will investigate the circumstances in which memory can affect the structural evolution of a network and its consequent ability to function.

Agents in a network will be assigned an adaptive profile of goal- and cost-related criteria that govern their response to ambitions and stimuli. An agent then has a memory of its past behaviour and can thereby form a strategy for future actions and reactions. This provides a means to generate ‘lumpiness’ or granularity in a network’s spatial structure and ‘burstiness’ in its time evolution, both of which will affect its ability to react effectively to external shocks to the system. The ability of externally introduced activists to change a network’s structure and function - or agonists to test its resilience to attack - will be investigated using the models. The project will use data on real agents’ behaviour.
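
As a flavour of the kind of mechanism involved, here is a minimal sketch (illustrative only, not the project's model) in which agents keep a short memory of past payoffs and rewire a link when remembered experience is poor; the payoff model, memory length and rewiring rule are all invented for the example. The inter-event times between rewirings give a crude measure of temporal 'burstiness'.

```python
# Minimal sketch: agents with a finite memory of payoffs rewire links when
# remembered experience is poor.  All modelling choices here are invented.
import random
from collections import deque

N, STEPS, MEMORY = 50, 2000, 10
random.seed(1)

# start from a random graph: each agent linked to 3 random others
links = {i: set(random.sample([j for j in range(N) if j != i], 3))
         for i in range(N)}
memory = {i: deque(maxlen=MEMORY) for i in range(N)}
rewiring_times = []

for t in range(STEPS):
    i = random.randrange(N)
    if not links[i]:
        continue
    payoff = random.gauss(0.0, 1.0)          # stand-in goal/cost criterion
    memory[i].append(payoff)
    # strategy: rewire one link when the remembered mean payoff is negative
    if len(memory[i]) == MEMORY and sum(memory[i]) < 0:
        links[i].discard(random.choice(tuple(links[i])))
        links[i].add(random.choice([k for k in range(N) if k != i]))
        memory[i].clear()
        rewiring_times.append(t)

# inter-event times between rewirings indicate temporal 'burstiness'
gaps = [b - a for a, b in zip(rewiring_times, rewiring_times[1:])]
print(len(rewiring_times), "rewirings; mean gap",
      sum(gaps) / len(gaps) if gaps else float("nan"))
```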

Relevant Publications
Other information
Title Fluctuation Driven Network Evolution
Group(s) Industrial and Applied Mathematics, Statistics and Probability
Proposer(s) Dr Keith Hopcraft, Dr Simon Preston
Description

A network’s growth and reorganisation affects its functioning and is contingent upon the relative time-scales of the dynamics that occur on it. Dynamical time-scales that are short compared with those characterizing the network’s evolution enable collectives to form, since each element remains connected with others in spite of external or internally generated ‘shocks’ or fluctuations. This can lead to manifestations such as synchronicity or epidemics. When the network topology and dynamics evolve on similar time-scales, a ‘plastic’ state can emerge where form and function become entwined. The interplay between fluctuation, form and function will be investigated with the aim of disentangling the effects of structural change from other dynamics and identifying robust characteristics.

Relevant Publications
Other information
Title Numerical methods for stochastic partial differential equations
Group(s) Scientific Computation, Statistics and Probability
Proposer(s) Prof Michael Tretyakov
Description

Numerics for stochastic partial differential equations (SPDEs) is one of the central topics in modern numerical analysis. It is motivated both by applications and by theoretical study. SPDEs originated essentially from filtering theory, and they are now also widely used to model spatially distributed systems from physics, chemistry, biology and finance acting in the presence of fluctuations. The primary objectives of this project include construction, analysis and testing of new numerical methods for SPDEs.
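
For concreteness, here is a minimal sketch of one standard scheme (textbook material, not a method developed in the project): an explicit finite-difference Euler-Maruyama discretisation of the stochastic heat equation with space-time white noise; the grid sizes and noise intensity are arbitrary choices.

```python
# Explicit finite-difference Euler-Maruyama scheme for the stochastic heat
# equation  du = u_xx dt + sigma dW(t,x)  on (0,1) with zero boundary values;
# the space-time white noise is approximated on the grid.
import numpy as np

rng = np.random.default_rng(0)
J, T, sigma = 100, 0.1, 0.5
dx = 1.0 / J
dt = 0.25 * dx**2                      # explicit scheme needs dt <~ dx^2 / 2
steps = int(T / dt)

x = np.linspace(0.0, 1.0, J + 1)
u = np.sin(np.pi * x)                  # initial condition, u = 0 at boundaries

for _ in range(steps):
    lap = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    noise = rng.standard_normal(J - 1) * np.sqrt(dt / dx)   # discretised noise
    u[1:-1] += dt * lap + sigma * noise
    u[0] = u[-1] = 0.0

print("mean |u| at final time:", np.abs(u).mean())
```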


Relevant Publications
Other information

Web-page http://www.maths.nott.ac.uk/personal/pmzmt

Title Uncertainty quantification for evolutionary PDEs
Group(s) Scientific Computation, Statistics and Probability
Proposer(s) Dr Kris van der Zee
Description

Uncertainty quantification for evolutionary PDEs
(Or: can we rely on our computational simulations, no matter how accurate they are?)

The field of uncertainty quantification (UQ), as applied to partial differential equations (PDEs), provides a means for understanding the effect of uncertainty in the parameters on output quantities of interest. This is, for example, extremely important in evolutionary models that describe the degradation of nuclear waste containment systems: a small uncertainty in the parameters may have a large influence on the degradation, and thus may result in unexpectedly damaged systems that leak nuclear waste. Fortunately, observational data can help reduce uncertainty in models. The models can then become predictive and allow for an assessment of their true future behaviour!
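
As a toy illustration of the two ingredients described above (forward propagation of parameter uncertainty, then reduction of that uncertainty by data), here is a sketch using an invented one-parameter degradation law y(t) = exp(-kt) and a single noisy observation; it is not a model of a containment system.

```python
# Forward UQ by Monte Carlo, then a Bayesian update via importance weights.
# The degradation law, prior and observation are all invented for the example.
import numpy as np

rng = np.random.default_rng(42)
k_prior = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=10_000)  # uncertain rate

t_pred = 50.0
y_pred = np.exp(-k_prior * t_pred)
print("prior predictive at t=50: mean %.3f, sd %.3f" % (y_pred.mean(), y_pred.std()))

# one noisy observation at t=10 reduces the uncertainty (importance weights)
t_obs, y_obs, noise_sd = 10.0, 0.35, 0.05
w = np.exp(-0.5 * ((np.exp(-k_prior * t_obs) - y_obs) / noise_sd) ** 2)
w /= w.sum()
post_mean = np.sum(w * y_pred)
post_sd = np.sqrt(np.sum(w * (y_pred - post_mean) ** 2))
print("posterior predictive at t=50: mean %.3f, sd %.3f" % (post_mean, post_sd))
```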

Challenges for students:
* How can one quantify the uncertainty in complex evolutionary models?
* Can one determine suitable observation scenarios using Bayesian experimental design principles?
* How complex does a model need to be for a predictive simulation?
* Can we employ UQ methodologies to reliably predict degradation of nuclear waste containment systems?

Depending on the interest of the student, several of these issues (or others) can be addressed.
Also, the student is encouraged to suggest a second supervisor, possibly from another group! 

Relevant Publications
  • I. Babuska, F. Nobile, and R. Tempone, A systematic approach to model validation based on Bayesian updates and prediction related rejection criteria, Comput. Meth. Appl. Mech. Engrg. 197 (2008), pp. 2517-2539
Other information
  • This work connects to a large interdisciplinary multi-university EPSRC project.
Title Index policies for stochastic optimal control
Group(s) Statistics and Probability
Proposer(s) Dr David Hodge
Description

Since the discovery of Gittins indices in the 1970s for solving multi-armed bandit processes, the pursuit of optimal policies for this very wide class of stochastic decision processes has been seen in a new light. Particular interest exists in the study of multi-armed bandits as problems of optimal allocation of resources (e.g. trucks, manpower, money) to be shared between competing projects. Another area of interest is the theoretical analysis of computational methods (for example, approximate dynamic programming) which are coming to the fore with ever-advancing computer power.


Potential project topics could include optimal decision making in the areas of queueing theory, inventory management, machine maintenance and communication networks.
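
As a flavour of index-style allocation, here is a minimal two-armed Bernoulli bandit sketch; it uses Thompson sampling as an illustrative stand-in, since computing true Gittins indices requires solving a calibration problem. The success probabilities and horizon are invented.

```python
# Index-style allocation on a two-armed Bernoulli bandit: each round, score
# both arms (here by a posterior draw) and pull the arm with the larger score.
import random

random.seed(0)
true_p = [0.45, 0.55]                   # unknown success probabilities
wins = [1, 1]; losses = [1, 1]          # Beta(1,1) priors on each arm

total = 0
for t in range(5000):
    scores = [random.betavariate(wins[a], losses[a]) for a in (0, 1)]
    a = max((0, 1), key=lambda i: scores[i])
    r = 1 if random.random() < true_p[a] else 0
    wins[a] += r; losses[a] += 1 - r
    total += r

print("average reward:", total / 5000,
      "| pulls of better arm:", wins[1] + losses[1] - 2)
```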

Relevant Publications
Other information

Keywords: multi-armed bandits, dynamic programming, Markov decision processes

Title Uncertainty quantification in palaeo-climate reconstruction
Group(s) Statistics and Probability
Proposer(s) Dr Richard Wilkinson
Description

The climate evolves slowly. Even if we stopped emitting greenhouse gases today, we wouldn't see the full effect of the damage already done for at least another 100 years. The instrumental record of past climate and weather goes back at most 300 years, and before then we have to rely on indirect (and inaccurate) data sources. Because of the slow evolution of the climate, this is like only having a very small number of accurate observations, and consequently we have very little information that can be used to assess the accuracy of climate simulators, which are the key tool used for predicting how the climate will behave in the future.

An important source of information on what the climate was like in the past comes from proxy data sources such as pollen taken from lake deposits, or measurements of the air-content (specifically the ratio of oxygen-18 to oxygen-16) stored in glaciers hundreds of thousands of years ago. Reconstructing past climate from these data sources is a difficult task as the measurements are noisy, correlated, and don't have accurate dates attached to them, yet the task is important if we are to understand how the climate evolves and hence be able to predict the future.

In this project, we will look at statistical methods for accurate palaeo-climate reconstruction, and aim to provide believable uncertainty quantifications that accurately represent our degree of confidence/ignorance about what the climate was like in the past. The complex nature of the problem means that state-of-the-art Monte Carlo methods are likely to be needed, as well as potentially the development of new methods in order to do the inference.


Relevant Publications
Other information
Title Semi-Parametric Time Series Modelling Using Latent Branching Trees
Group(s) Statistics and Probability
Proposer(s) Dr Theodore Kypraios
Description

A class of semi-parametric discrete-time series models of infinite order, in which we are able to specify the marginal distribution of the observations in advance and then build their dependence structure around it, can be constructed via an artificial process termed a Latent Branching Tree (LBT). Such models can be very useful when data are collected over a long period, where it may be relatively easy to indicate their marginal distribution but much harder to infer their correlation structure. The project is concerned with the development of such models in continuous time, as well as with efficient methods for making Bayesian inference for both the latent structure and the model parameters. Moreover, the application of such models to real data would also be of great interest.

Relevant Publications
Other information
Title Bayesian methods for analysing computer experiments
Group(s) Statistics and Probability
Proposer(s) Dr Richard Wilkinson
Description

Computer experiments (i.e. simulators) are used in nearly all areas of science and engineering. The statistical analysis of computer experiments is an exciting and rapidly growing area of statistics which looks at the question of how best to learn from computer models. Examples of the types of challenges faced, and possible areas for a Ph.D., are given below.

(i) Computer models are often process models where the likelihood function is intractable (as is common in genetics, ecology, epidemiology, etc.) and so to do inference we have to use likelihood-free techniques. Approximate Bayesian computation (ABC) methods are a new class of Monte Carlo methods that are becoming increasingly popular with practitioners, but which are largely unstudied by statisticians, and there remain many open questions about their performance. Application areas which use ABC methods are mostly biological (genetics and ecology in particular), but their use is growing across a wide range of fields.
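
A minimal rejection-ABC sketch of the generic scheme (not a method developed in the project): draw a parameter from the prior, simulate data, and keep the draw if a summary of the simulation lands within a tolerance of the observed summary. The Poisson model, summary statistic and tolerance are all invented for the example.

```python
# Rejection ABC: likelihood-free inference for a Poisson rate, accepting
# prior draws whose simulated summary lies close to the observed summary.
import numpy as np

rng = np.random.default_rng(1)
data = rng.poisson(lam=4.0, size=50)        # pretend these are observations
s_obs = data.mean()                         # summary statistic

accepted = []
for _ in range(20_000):
    lam = rng.uniform(0.0, 10.0)            # draw from the prior
    s_sim = rng.poisson(lam=lam, size=50).mean()
    if abs(s_sim - s_obs) < 0.1:            # tolerance epsilon
        accepted.append(lam)

print(f"ABC posterior mean {np.mean(accepted):.2f} from {len(accepted)} draws")
```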

(ii) Expensive simulators which take a considerable amount of time to run (e.g., climate models) present the challenge of how to learn about the model (its parameters, validity, predictions, etc.) when we only have a limited number of model evaluations available for use. A statistical tool developed in the last decade is the idea of building statistical emulators of the simulator. Emulators are cheap statistical models of the simulator (models of the model) which can be used in place of the simulator to make inferences, and are now regularly used in complex modelling situations such as in climate science. However, there are still many questions to be answered about how best to build and then use emulators. Possible application areas for these methods include climate science and engineering (such as ground-water flow problems for radioactive waste), as well as many others.
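
A minimal Gaussian-process emulator sketch in plain NumPy (a generic construction, not the emulators used in the applications above): fit to a handful of runs of a cheap stand-in "simulator", then predict, with uncertainty, at untried inputs.

```python
# Gaussian-process emulation: condition a GP on six "expensive" runs and use
# the posterior mean as a cheap surrogate; all choices here are invented.
import numpy as np

def k(a, b, ell=0.3):                       # squared-exponential kernel
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

simulator = lambda x: np.sin(3 * x) + x     # stand-in for an expensive model
X = np.linspace(0.0, 1.0, 6)                # only six affordable runs
y = simulator(X)

Xs = np.linspace(0.0, 1.0, 101)
K = k(X, X) + 1e-8 * np.eye(len(X))         # jitter for numerical stability
alpha = np.linalg.solve(K, y)
mean = k(Xs, X) @ alpha                     # emulator posterior mean
var = 1.0 - np.einsum('ij,ji->i', k(Xs, X), np.linalg.solve(K, k(X, Xs)))

err = np.max(np.abs(mean - simulator(Xs)))
print(f"max emulator error {err:.3f}; max predictive sd {np.sqrt(var.max()):.3f}")
```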

(iii) "All models are wrong, but some are useful" - In order to move from making statements about a model to making statements about the system the model was designed to represent, we must carefully quantify the model error - interest lies in what will actually happen, rather than in what your model says will happen! Failure to account for model errors can mean that different models of the same system can give different predictions (see for example the controversy regarding the differing predictions of the large climate models - none of which account for model error!). Assessing and incorporating model error is a new and rapidly growing idea in statistics, and is done by a combination of subjective judgement and statistical learning from data. The range of potential application areas is very wide, but in particular meterology and mechanical engineering are areas where these methods are needed.

Relevant Publications
  • Wilkinson, Approximate Bayesian computation (ABC) gives exact results under the assumption of model error, in submission. Available as arXiv:0811.3355.
  • Wilkinson, Bayesian calibration of expensive multivariate computer experiments. In ‘Large-scale inverse problems and quantification of uncertainty’, 2010, John Wiley and Sons, Series in Computational Statistics.
  • Wilkinson, M. Vrettas, D. Cornford and J. E. Oakley, Quantifying simulator discrepancy in discrete-time dynamical simulators. Journal of Agricultural, Biological, and Environmental Statistics: Special issue on computer models and spatial statistics for environmental science, 16(4), 554-570, 2011.
Other information

See http://www.maths.nottingham.ac.uk/personal/pmzrdw/ for more information.

Title Ion channel modelling
Group(s) Statistics and Probability
Proposer(s) Prof Frank Ball
Description

The 1991 Nobel Prize for Medicine was awarded to Sakmann and Neher for developing a method of recording the current flowing across a single ion channel. Ion channels are protein molecules that span cell membranes. In certain conformations they form pores allowing current to pass across the membrane. They are a fundamental part of the nervous system. Mathematically, a single channel is usually modelled by a continuous-time Markov chain. The complete process is not observable; rather, the state space is partitioned into two classes, corresponding to the receptor channel being open or closed, and it is only possible to observe which class of state the process is in. The aim of single channel analysis is to draw inferences about the underlying process from the observed aggregated process. Further complications include (a) the failure to detect brief events and (b) the presence of (possibly interacting) multiple channels. Possible projects include the development and implementation of Markov chain Monte Carlo methods for inference from ion channel data, Laplace-transform-based inference for ion channel data, and the development and analysis of models for interacting multiple channels.
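
A minimal sketch of the aggregated-process set-up (an invented three-state example, not a model fitted in the project): simulate a continuous-time Markov chain with one open and two closed states, recording only the open/closed class and the sojourn times in each class.

```python
# Aggregated continuous-time Markov chain: the underlying state is hidden and
# only the open/closed class is observed.  The generator matrix is invented.
import random

random.seed(2)
# generator matrix Q: state 0 is open, states 1 and 2 are closed
Q = [[-2.0,  1.0,  1.0],
     [ 3.0, -4.0,  1.0],
     [ 0.5,  0.5, -1.0]]
OPEN = {0}

state, t, sojourn_start, observed = 0, 0.0, 0.0, []
for _ in range(10_000):
    rate = -Q[state][state]
    t += random.expovariate(rate)                  # exponential holding time
    probs = [Q[state][j] / rate if j != state else 0.0 for j in range(3)]
    new = random.choices(range(3), weights=probs)[0]
    if (state in OPEN) != (new in OPEN):           # aggregated class changed
        observed.append((state in OPEN, t - sojourn_start))
        sojourn_start = t
    state = new

open_times = [d for is_open, d in observed if is_open]
print(f"mean observed open sojourn: {sum(open_times) / len(open_times):.3f}")
```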

Relevant Publications
Other information
Title Optimal control in yield management
Group(s) Statistics and Probability
Proposer(s) Dr David Hodge
Description

The mathematics of maximizing revenue from the control of price and availability of products has been a lucrative area in the airline industry since the 1960s. It is particularly visible nowadays in the seemingly incomprehensible price fluctuations of airline tickets. Many multinational companies selling perishable assets to mass markets now have large Operations Research departments in-house for this very purpose. This project would involve studying possible innovations and existing practices in areas such as customer acceptance control, dynamic pricing control and choice-based revenue management. Applications to social welfare maximization, away from pure monetary objectives, and the resulting game-theoretic problems are also topical in home energy consumption and mass online interactions.
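
As a small concrete example of the kind of control problem involved, here is a toy finite-horizon dynamic program (standard textbook material, not a model from the project) for pricing a perishable stock of seats; the price menu, purchase probabilities and horizon are invented.

```python
# Backward induction for dynamic pricing of perishable stock:
# V(t, s) = max over price p of  q(p) * (p + V(t+1, s-1)) + (1-q(p)) * V(t+1, s)
prices = [50.0, 100.0, 150.0]
demand = {50.0: 0.8, 100.0: 0.5, 150.0: 0.2}   # purchase probability per period
T, S = 30, 10                                   # periods, initial seats

V = [[0.0] * (S + 1) for _ in range(T + 1)]     # V[T][s] = 0: unsold seats perish
policy = [[None] * (S + 1) for _ in range(T)]
for t in range(T - 1, -1, -1):
    for s in range(1, S + 1):
        best, best_p = float("-inf"), None
        for p in prices:
            q = demand[p]
            val = q * (p + V[t + 1][s - 1]) + (1 - q) * V[t + 1][s]
            if val > best:
                best, best_p = val, p
        V[t][s] = best
        policy[t][s] = best_p

print("expected revenue:", round(V[0][S], 2))
print("price with 1 seat, 1 period left:", policy[T - 1][1])
print("price with 10 seats, 30 periods left:", policy[0][S])
```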

Relevant Publications
Other information
Title Stochastic Processes on Manifolds
Group(s) Statistics and Probability
Proposer(s) Dr Huiling Le
Description

As well as having a wide range of direct applications to physics, economics, etc., diffusion theory is a valuable tool for the study of the existence and characterisation of solutions of partial differential equations, and for some major theoretical results in differential geometry, such as the 'Index Theorem', previously proved by totally different means. The problems which arise in all these subjects require the study of processes not only on flat spaces but also on curved spaces or manifolds. This project will investigate the interaction between the geometric structure of manifolds and the behaviour of stochastic processes, such as diffusions and martingales, upon them.
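
A minimal numerical sketch (a standard construction, illustrative only): Brownian motion on the unit sphere S², simulated by taking Gaussian increments in the tangent plane and projecting back onto the sphere; the step size and horizon are arbitrary.

```python
# Brownian motion on the sphere via tangent-plane Euler steps and retraction.
import numpy as np

rng = np.random.default_rng(3)
dt, steps = 1e-4, 50_000
x = np.array([0.0, 0.0, 1.0])                # start at the north pole
start = x.copy()

for _ in range(steps):
    xi = rng.standard_normal(3) * np.sqrt(dt)
    xi -= np.dot(xi, x) * x                  # project noise onto tangent plane
    x = x + xi
    x /= np.linalg.norm(x)                   # retract back onto the sphere

# the geodesic distance from the start grows diffusively in t
theta = np.arccos(np.clip(x @ start, -1.0, 1.0))
print(f"geodesic distance after t = {dt * steps:.0f}: {theta:.3f} rad")
```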

Relevant Publications
Other information
Title Analytic methods in probability theory
Group(s) Statistics and Probability
Proposer(s)
Description

My research focuses on interactions between probability theory, combinatorics and topology. Topics of particular interest include:

  • dependence, limit theorems for dependent variables with applications to dynamical systems, examples and counterexamples to limit theorems;
  • modern analytic methods in probability theory, such as stochastic orderings, Stein-type operators, contraction, and Poincaré-Hardy-Sobolev-type inequalities and their applications;
  • stochastic analysis of rare events, such as Poisson approximation and large deviations;
  • random groups, Hausdorff dimension and related topics.
Relevant Publications
Other information
Title Statistical Theory of Shape
Group(s) Statistics and Probability
Proposer(s) Dr Huiling Le
Description

Devising a natural measure of distance between any two fossil specimens of a particular genus, assessing the significance of observed 'collinearities' of standing stones, and matching the observed systems of cosmic 'voids' with the cells of given tessellations of 3-space are all questions about shape.

It is not appropriate, however, to think of 'shapes' as points on a line or even in a Euclidean space. They lie in their own particular spaces, most of which have not arisen before in any context. PhD projects in this area will study these spaces and related probabilistic issues, and develop for them a revised version of multidimensional statistics which takes into account their peculiar properties. This is a multi-disciplinary area of research which has only recently become very active. Nottingham is one of only a handful of departments at which it is active.

Relevant Publications
Other information
Title Automated tracking and behaviour analysis
Group(s) Statistics and Probability
Proposer(s) Dr Christopher Brignell
Description

In collaboration with the Schools of Computer Science and Veterinary Science we are developing an automated visual surveillance system capable of identifying, tracking and recording the exact movements of multiple animals or people.  The resulting data can be analysed and used as an early warning system in order to detect illness or abnormal behaviour.  The three-dimensional targets are, however, viewed in a two-dimensional image, and statistical shape analysis techniques need to be adapted to improve the identification of an individual's location and orientation, and to develop automatic tests for detecting specific events or individuals not following normal behaviour patterns.

Relevant Publications
Other information
Title Asymptotic techniques in Statistics
Group(s) Statistics and Probability
Proposer(s) Prof Andrew Wood
Description

Asymptotic approximations are very widely used in statistical practice. For example, the large-sample likelihood ratio test is an asymptotic approximation based on the central limit theorem. In general, asymptotic techniques play two main roles in statistics: (i) to improve understanding of the practical performance of statistical procedures, and to provide insight into why some procedures perform better than others; and (ii) to motivate new and improved approximations. Some possible topics for a Ph.D. are listed below; a small worked example of the first follows the list.

  • Saddlepoint and related approximations
  • Relative error analysis
  • Approximate conditional inference
  • Asymptotic methods in parametric and nonparametric Bayesian Inference
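
As the worked example promised above (classical material, not specific to the project), the saddlepoint approximation to the density of the mean of n Exp(1) variables can be checked numerically against the exact Gamma density; the relative error shrinks like 1/n.

```python
# Saddlepoint approximation for the mean of n Exp(1) variables.
# K(t) = -log(1-t); the saddlepoint equation K'(t) = 1/(1-t) = x gives
# t_hat = 1 - 1/x, and the approximate density is
#   f(x) ~ sqrt(n / (2 pi K''(t_hat))) * exp(n * (K(t_hat) - t_hat * x)).
import math

def saddlepoint_density(x, n):
    t_hat = 1.0 - 1.0 / x                   # solves K'(t) = x
    K = -math.log(1.0 - t_hat)
    K2 = 1.0 / (1.0 - t_hat) ** 2           # K''(t_hat)
    return math.sqrt(n / (2 * math.pi * K2)) * math.exp(n * (K - t_hat * x))

def exact_density(x, n):                    # mean of n Exp(1) is Gamma(n, n)
    return n ** n * x ** (n - 1) * math.exp(-n * x) / math.gamma(n)

for n in (5, 20, 80):
    x = 1.3
    rel_err = saddlepoint_density(x, n) / exact_density(x, n) - 1.0
    print(f"n = {n:3d}: relative error {rel_err:+.4f}")
```
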
Relevant Publications
Other information
Title Statistical Inference for Ordinary Differential Equations
Group(s) Statistics and Probability
Proposer(s) Dr Theodore Kypraios, Dr Simon Preston, Prof Andrew Wood
Description

Ordinary differential equation (ODE) models are widely used in a variety of scientific fields, such as physics, chemistry and biology. For ODE models, an important question is how best to estimate the model parameters given experimental data. The common (non-linear least squares) approach is to search parameter space for parameter values that minimise the sum of squared differences between the model solution and the experimental data. However, this requires repeated numerical solution of the ODEs and is thus computationally expensive; furthermore, the optimisation's objective function is often highly multi-modal, making it difficult to find the global optimum. In this project we will develop computationally less demanding likelihood-based methods, specifically by using spline regression techniques that will reduce (or eliminate entirely) the need to solve the ODEs numerically.
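
A minimal gradient-matching sketch of the spline idea (one common variant, not necessarily the method the project will develop): fit a smoothing spline to noisy data from dy/dt = -ky and estimate k from the spline and its derivative, with no numerical ODE solves. The model, noise level and smoothing parameter are invented.

```python
# Gradient matching: replace ODE solves by a smoothing spline and its slope.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(4)
k_true = 0.7
t = np.linspace(0.0, 5.0, 40)
y = np.exp(-k_true * t) + rng.normal(0.0, 0.02, t.size)   # noisy observations

spline = UnivariateSpline(t, y, k=4, s=t.size * 0.02**2)  # smooth the data
dy = spline.derivative()(t)                               # spline slope
yhat = spline(t)

# least squares for k in dy/dt = -k y, using spline values instead of solves
k_est = -np.sum(dy * yhat) / np.sum(yhat ** 2)
print(f"k true {k_true}, estimated {k_est:.3f}")
```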

Relevant Publications
Other information
Title Bayesian approaches in palaeontology
Group(s) Statistics and Probability
Proposer(s) Dr Richard Wilkinson
Description

Palaeontology provides a challenging source of problems for statisticians, as fossil data are usually sparse and noisy. Methods from statistics can be used to help answer scientific questions such as when did species originate or become extinct, and how diverse was a particular taxonomic group. Some of these questions are of great scientific interest: for example, did primates coexist with dinosaurs in the Cretaceous? There is no hard evidence either way, but statistical methods can be used to assess the probability that they did coexist.

This project will involve building a stochastic forwards model of an evolutionary scenario, and then fitting this model to fossil data. Quantifying different sources of uncertainty is likely to play a key part in the analysis.
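
As a toy illustration of the forwards-model-plus-inference pattern (invented numbers, far simpler than a real evolutionary scenario): treat fossil finds as points scattered uniformly over an unknown lifetime (0, T) and infer the extinction time T.

```python
# Toy model: n fossil finds, i.i.d. uniform on the unknown lifetime (0, T).
# Conditional on n, the likelihood is T^(-n) for T >= the latest find, so a
# flat prior on (0, T_MAX) gives a posterior proportional to T^(-n) there.
import numpy as np

rng = np.random.default_rng(5)
T_true, n, T_MAX = 10.0, 15, 30.0
finds = np.sort(rng.uniform(0.0, T_true, n))     # simulated fossil dates

T_grid = np.linspace(finds.max(), T_MAX, 2000)
post = T_grid ** (-float(n))
dT = T_grid[1] - T_grid[0]
post /= post.sum() * dT                          # normalise on the grid
post_mean = (T_grid * post).sum() * dT
print(f"latest find {finds.max():.2f}, posterior mean extinction {post_mean:.2f}")
```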

Relevant Publications
  • Dating primate divergences through an integrated analysis of palaeontological and molecular data, Wilkinson, M. Steiper, C. Soligo, R.D. Martin, Z. Yang, and S. Tavare, Systematic Biology, 60(1): 16-31, 2011.
Other information

See http://www.maths.nottingham.ac.uk/personal/pmzrdw/ for more information

Title Statistical shape analysis with applications in structural bioinformatics
Group(s) Statistics and Probability
Proposer(s) Dr Christopher Fallaize
Description

In statistical shape analysis, objects are often represented by a configuration of landmarks, and in order to compare the shapes of objects, their configurations must first be aligned as closely as possible. When the landmarks are unlabelled (that is, the correspondence between landmarks on different objects is unknown) the problem becomes much more challenging, since both the correspondence and alignment parameters need to be inferred simultaneously.

An example of the unlabelled problem comes from the area of structural bioinformatics, where we wish to compare the 3-d shapes of protein molecules. This is important, since the shape of a protein is vital to its biological function. The landmarks could be, for example, the locations of particular atoms, and the correspondence between atoms on different proteins is unknown. This project will explore methods for unlabelled shape alignment, motivated by the problem of protein structure alignment (a minimal sketch of the simpler labelled case is given after the list below). Possible topics include development of:
i) efficient MCMC methods to explore complicated, high-dimensional distributions, which may be highly multimodal when considering large proteins;
ii) fast methods for pairwise alignment, needed when a large database of structures is to be searched for matches to a query structure;
iii) methods for the alignment of multiple structures simultaneously, which greatly exacerbates the difficulties faced in pairwise alignment.
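
The sketch below covers only the simpler labelled special case (ordinary Procrustes alignment via the standard SVD solution); the project's unlabelled problem additionally requires inference over the correspondence. The configuration sizes and noise level are invented.

```python
# Labelled special case only: optimally rotate configuration X onto Y
# (ordinary Procrustes / Kabsch solution via the SVD).
import numpy as np

rng = np.random.default_rng(6)
X = rng.standard_normal((12, 3))                 # 12 labelled landmarks in 3-d
theta = np.pi / 5
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Y = X @ R_true.T + rng.normal(0.0, 0.01, X.shape)  # rotated, noisy copy

Xc, Yc = X - X.mean(0), Y - Y.mean(0)            # remove location
U, _, Vt = np.linalg.svd(Xc.T @ Yc)              # orthogonal Procrustes problem
A = U @ Vt                                       # rotation taking Xc towards Yc
if np.linalg.det(A) < 0:                         # guard against a reflection
    U[:, -1] *= -1
    A = U @ Vt
rmsd = np.sqrt(((Xc @ A - Yc) ** 2).mean())
print(f"post-alignment RMSD: {rmsd:.4f}")
```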

Relevant Publications
  • Green, P.J. and Mardia, K.V. (2006) Bayesian alignment using hierarchical models, with applications in protein bioinformatics. Biometrika, 93(2), 235-254.
  • Mardia, K.V., Nyirongo, V.B., Fallaize, C.J., Barber, S. and Jackson, R.M. (2011). Hierarchical Bayesian modeling of pharmacophores in bioinformatics. Biometrics, 67(2), 611-619.
Other information
Title High-dimensional molecular shape analysis
Group(s) Statistics and Probability
Proposer(s) Prof Ian Dryden
Description

In many application areas it is of interest to compare objects and to describe the variability in shape as an object evolves over time. For example, in molecular shape analysis it is common to have several thousand atoms and a million time points. It is of great interest to reduce the data to a relatively small number of dimensions, and to describe the variability in shape and coverage properties over time. Techniques from manifold learning will be explored, to investigate whether the variability can be effectively described by a low-dimensional manifold. A recent method for spheres and planar shapes, called principal nested spheres, will be adapted for 3D shapes and surfaces. Also, other non-linear dimension reduction techniques, such as multidimensional scaling, will be explored, which approximate the geometry of the higher-dimensional manifold. The project will involve collaboration with Dr Charlie Laughton of the School of Pharmacy.
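
A minimal classical multidimensional scaling (MDS) sketch (a generic dimension-reduction construction, not the project's method): embed points, described only through their pairwise distances, into two dimensions via the double-centred Gram matrix; the toy "trajectory" is invented.

```python
# Classical MDS: from squared pairwise distances to low-dimensional coordinates.
import numpy as np

rng = np.random.default_rng(7)
# toy "trajectory": 200 conformations near a 1-d closed curve in 30-d space
s = np.sort(rng.uniform(0, 2 * np.pi, 200))
X = np.stack([np.cos(s), np.sin(s)], axis=1) @ rng.standard_normal((2, 30))
X += rng.normal(0.0, 0.05, X.shape)

D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)    # squared distances
J = np.eye(200) - np.ones((200, 200)) / 200            # centring matrix
B = -0.5 * J @ D2 @ J                                  # double-centred Gram matrix
vals, vecs = np.linalg.eigh(B)
coords = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0.0))  # top 2 dimensions

print("top 5 eigenvalues:", np.round(vals[-5:][::-1], 1))
```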

Relevant Publications
  • Jung, S., Dryden, I.L. and Marron, J.S. (2012). Analysis of principal nested spheres. Biometrika, 99, 551–568.
Other information
Title Uncertainty quantification for models with bifurcations
Group(s) Statistics and Probability, Industrial and Applied Mathematics
Proposer(s) Prof Ian Dryden
Description

The project will consider Uncertainty Quantification (UQ) when there are bifurcations or discontinuities in the models. Gaussian Process Emulators (GPE) and Generalised Polynomial Chaos (gPC) techniques will be used to construct fast approximations to high-cost deterministic models. Also, an important component of Bayesian UQ is the difficult task of elicitation of the prior distributions of the parameters of interest, which will be investigated. We will exploit the flexibility in the choice of GPE covariance function to deal with cases where the dependence on the inputs is not smooth. Lack of smoothness can be handled by dividing the parameter space into elements and using gPC or GPE on each element, but this is difficult to do automatically. We propose to compute the hypersurfaces at which discontinuities occur, using techniques from numerical bifurcation theory, as preparation for discretising with gPC or GPE methods. Bifurcations arise in carbon sequestration applications, and radioactive waste disposal is another area where elicitation and Bayesian emulation are useful.
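
For the smooth case, a minimal generalised polynomial chaos sketch (a standard construction, illustrative only; the project concerns the harder non-smooth, bifurcating case): expand an invented one-input model in probabilists' Hermite polynomials and read off the output mean and variance.

```python
# Smooth-case gPC sketch: expand f(xi), xi ~ N(0,1), in probabilists' Hermite
# polynomials He_k and recover the output mean and variance from coefficients.
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

f = lambda xi: np.exp(0.3 * xi)              # cheap stand-in for a costly model
nodes, weights = hermegauss(20)              # Gauss quadrature, weight e^(-x^2/2)
weights = weights / np.sqrt(2.0 * np.pi)     # normalise to N(0,1) expectations

P = 6                                        # truncation degree
coeffs = [np.sum(weights * f(nodes) * hermeval(nodes, [0.0] * k + [1.0]))
          / factorial(k) for k in range(P + 1)]   # c_k = E[f He_k] / k!

mean_gpc = coeffs[0]
var_gpc = sum(c * c * factorial(k) for k, c in enumerate(coeffs) if k > 0)
print(f"gPC mean {mean_gpc:.5f} (exact {np.exp(0.045):.5f}); variance {var_gpc:.5f}")
```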

Relevant Publications
Other information
Title Statistical analysis of neuroimaging data
Group(s) Statistics and Probability, Mathematical Medicine and Biology
Proposer(s) Dr Christopher Brignell
Description

The activity of neurons within the brain can be detected by functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG). The techniques record observations up to 1000 times a second on a 3D grid of points separated by 1-10 millimetres. The data are therefore high-dimensional and highly correlated in space and time. The challenge is to infer the location, direction and strength of significant underlying brain activity amongst confounding effects from movement and background noise levels. Further, we need to identify neural activity that is statistically significant across individuals, which is problematic because the number of subjects tested in neuroimaging studies is typically quite small and the inter-subject variability in anatomical and functional brain structures is quite large.

Relevant Publications
Other information
Title Identifying fibrosis in lung images
Group(s) Statistics and Probability, Mathematical Medicine and Biology
Proposer(s) Dr Christopher Brignell
Description

Many forms of lung disease are characterised by excess fibrous tissue developing in the lungs.  Fibrosis is currently diagnosed by human inspection of CT scans of the affected lung regions.  This project will develop statistical techniques for objectively assessing the presence and extent of lung fibrosis, with the aim of identifying key factors which determine long-term prognosis.  The project will involve developing statistical models of lung shape, to perform object recognition, and lung texture, to classify healthy and abnormal tissue.  Clinical support and data for this project will be provided by the School of Community Health Sciences.

Relevant Publications
Other information
Title Modelling hospital superbugs
Group(s) Statistics and Probability, Mathematical Medicine and Biology
Proposer(s) Prof Philip O'Neill
Description

The spread of so-called superbugs such as MRSA within healthcare settings provides one of the major challenges to patient welfare within the UK. However, many basic questions regarding the transmission and control of such pathogens remain unanswered. This project involves stochastic modelling and data analysis using highly detailed data sets from studies carried out in hospital, addressing issues such as the effectiveness of patient isolation, the impact of different antibiotics, and the way in which different strains interact with each other.

Relevant Publications
Other information
Title Modelling of Emerging Diseases
Group(s) Statistics and Probability, Mathematical Medicine and Biology
Proposer(s) Prof Frank Ball
Description

When new infections emerge in populations (e.g. SARS; new strains of influenza), no vaccine is available and other control measures must be adopted. This project is concerned with addressing questions of interest in this context, e.g. What are the most effective control measures? How can they be assessed? The project involves the development and analysis of new classes of stochastic models, including intervention models, appropriate for the early stages of an emerging disease.
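
A minimal stochastic SIR sketch (a generic Markov model, not one of the project's intervention models): simulate the embedded jump chain of an outbreak and compare final sizes with and without a crude control measure that lowers the contact rate; all rates are invented.

```python
# Markovian SIR final sizes via the embedded jump chain: each event is an
# infection (rate beta*S*I/N) or a removal (rate gamma*I).
import random

def outbreak(beta, gamma=1.0, N=1000, seed=0):
    rng = random.Random(seed)
    S, I = N - 1, 1
    while I > 0:
        inf_rate = beta * S * I / N
        total = inf_rate + gamma * I
        if rng.random() < inf_rate / total:
            S -= 1; I += 1
        else:
            I -= 1
    return N - 1 - S                   # new infections beyond the initial case

for beta, label in [(2.0, "no control"), (0.8, "control (reduced contacts)")]:
    sizes = [outbreak(beta, seed=s) for s in range(200)]
    big = sum(sz > 100 for sz in sizes) / 200
    print(f"{label}: mean final size {sum(sizes)/200:.0f}, "
          f"P(major outbreak) {big:.2f}")
```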

Relevant Publications
Other information
Title Structured-Population Epidemic Models
Group(s) Statistics and Probability, Mathematical Medicine and Biology
Proposer(s) Prof Frank Ball
Description

The structure of the underlying population usually has a considerable impact on the spread of the disease in question. In recent years the Nottingham group has given particular attention to this issue by developing, analysing and using various models appropriate for certain kinds of diseases. For example, considerable progress has been made in the understanding of epidemics that are propagated among populations made up of households, in which individuals are typically more likely to pass on a disease to those in their household than to those elsewhere. Other examples of structured populations include those with spatial features (e.g. farm animals placed in pens; school children in classrooms; trees planted in certain configurations), and those with random social structure (e.g. using random graphs to describe an individual's contacts). Projects in this area are concerned with novel advances in the area, including developing and analysing appropriate new models, and methods for statistical inference (e.g. using pseudo-likelihood and Markov chain Monte Carlo methods).

Relevant Publications
Other information
Title Bayesian Inference for Complex Epidemic Models
Group(s) Statistics and Probability, Mathematical Medicine and Biology
Proposer(s) Prof Philip O'Neill
Description

Data-analysis for real-life epidemics offers many challenges; one of the key issues is that infectious disease data are usually only partially observed. For example, although numbers of cases of a disease may be available, the actual pattern of spread between individuals is rarely known. This project is concerned with the development and application of methods for dealing with these problems, and involves using Markov chain Monte Carlo (MCMC) techniques.

Relevant Publications
Other information
Title Bayesian model choice assessment for epidemic models
Group(s) Statistics and Probability, Mathematical Medicine and Biology
Proposer(s) Prof Philip O'Neill
Description

During the last decade there has been significant progress in the area of parameter estimation for stochastic epidemic models. However, far less attention has been given to the issue of model adequacy and assessment, i.e. the question of how well a model fits the data. This project is concerned with the development of methods to assess the goodness-of-fit of epidemic models to data.

Relevant Publications
Other information
Title Epidemics on random networks
Group(s) Statistics and Probability, Mathematical Medicine and Biology
Proposer(s) Prof Frank Ball
Description

There has been considerable interest recently in models for epidemics on networks describing social contacts.  In these models one first constructs an undirected random graph, which gives the network of possible contacts, and then spreads a stochastic epidemic on that network.  Topics of interest include: modelling clustering and degree correlation in the network and analysing their effect on disease dynamics; development and analysis of vaccination strategies, including contact tracing; and the effect of also allowing for casual contacts, i.e. between individuals unconnected in the network.  Projects in this area will address some or all of these issues.
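
A minimal sketch of the two-stage construction described above (standard ingredients, far simpler than the project's models): build a configuration-model random graph from a given degree sequence, then run a discrete-time SIR epidemic on it; the degree distribution and transmission probability are invented.

```python
# Configuration-model graph plus a discrete-time (Reed-Frost style) SIR epidemic.
import random

random.seed(8)
N = 2000
degrees = [random.choice([2, 3, 4]) for _ in range(N)]
if sum(degrees) % 2:                       # need an even number of half-edges
    degrees[0] += 1

stubs = [i for i, d in enumerate(degrees) for _ in range(d)]
random.shuffle(stubs)
nbrs = {i: set() for i in range(N)}
for a, b in zip(stubs[::2], stubs[1::2]):
    if a != b:                             # discard self-loops (keeps it simple)
        nbrs[a].add(b); nbrs[b].add(a)

# each infective infects each susceptible neighbour with probability p
p, infected, recovered = 0.4, {0}, set()
while infected:
    new = {j for i in infected for j in nbrs[i]
           if j not in recovered and j not in infected and random.random() < p}
    recovered |= infected
    infected = new

print(f"final size: {len(recovered)} of {N}")
```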

Relevant Publications
  • Ball F G and Neal P J (2008) Network epidemic models with two levels of mixing. Math Biosci 212, 69-87.
  • Ball F G, Sirl D and Trapman P (2009) Threshold behaviour and final outcome of an epidemic on a random network with household structure. Adv Appl Prob 41, 765-796.
  • Ball F G, Sirl D and Trapman P (2010) Analysis of a stochastic SIR epidemic on a random network incorporating household structure. Math Biosci 224, 53-73.
Other information
Title Robustness-performance optimisation for automated composites manufacture
Group(s) Statistics and Probability, Scientific Computation
Proposer(s) Prof Frank Ball, Prof Michael Tretyakov
Description

Multidisciplinary collaborations are a critical feature of material science research enabling integration of data collection with computational and/or mathematical modelling. This PhD study provides an exciting opportunity for an individual to participate in a project spanning research into composite manufacturing, stochastic modelling and statistical analysis, and scientific computing. The project is integrated into the EPSRC Centre for Innovative Manufacturing in Composites, which is led by the University of Nottingham and delivers a co-ordinated programme of research at four of the leading universities in composites manufacturing, the Universities of Nottingham, Bristol, Cranfield and Manchester.

This project focuses on the development of a manufacturing route for composite materials capable of producing complex components in a single process chain based on advancements in the knowledge, measurement and prediction of uncertainty in processing. The necessary developments comprise major manufacturing challenges. These are accompanied by significant mathematical problems, such as numerical solution of coupled non-linear partial differential equations with randomness, the inverse estimation of composite properties and their probability distributions based on real-time measurements and the formulation and solution of a stochastic model of the variability in fibre arrangements. The outcome of this work will enable a step change in the capabilities of composite manufacturing technologies to be made, overcoming limitations related to part thickness, component robustness and manufacturability as part of a single process chain, whilst yielding significant developments in mathematics with generic application in the fields of stochastic modelling and inverse problems.

The specific aims of this project are: (i) stochastic simulation of multi-dimensional non-linear stochastic problems; (ii) stochastic and statistical modelling of fibre variability in Automated Fibre Placement, to permit the predictive simulation of a range of potential outcomes conditional on monitoring observations made during the process; (iii) solution of the anisotropic conductivity inverse problem under uncertainty, to translate monitoring and simulation of observable parameters into uncertainty quantification of critical unobservable variables.

Relevant Publications
Other information

The PhD programme contains a training element, which includes research work as well as traditional taught material. The exact nature of the training will be mutually agreed by the student and their supervisors and will comprise a minimum of 30 credits (approximately ¼ of the taught component of an MSc course) of assessed training. The graduate programmes at the School of Mathematical Sciences and the EPSRC Centre for Innovative Manufacturing in Composites provide a variety of appropriate training courses.

We require an enthusiastic graduate with a 1st class degree in Mathematics (in exceptional circumstances a 2(i) class degree can be considered), preferably at MMath/MSc level, with good programming skills and willing to work as part of an interdisciplinary team. A candidate with a solid background in statistics and stochastic processes will have an advantage.

The studentship is available for a period of three and a half years and provides an annual stipend of £13,726 and full payment of Home/EU Tuition Fees. Students must meet the EPSRC eligibility criteria.