Technical Programme

The programme for the week is grouped into themes spread over 5 days. The technical talks range from an introduction to the field at the start of the week, through the details behind some typical methods and approaches, to examples of industrial applications in the last few days.

In addition to the lectures, there will be two sessions looking at complementary skills necessary for working in R&D, such as project and research planning and management.

Single page programme schedule


Mon 20 Nov : Introduction

The day will cover details of the UTOPIAE network, its research goals and work programme over the next 4 years, with lectures introducing the basic principles behind optimisation and uncertainty quantification.

09:00-10:30
Introduction to UTOPIAE: What to expect
Massimiliano Vasile, University of Strathclyde
The first talk of the school introduces everyone to the UTOPIAE network and to the idea behind applying a mix of uncertainty treatment and optimisation to topics within the realm of aerospace engineering. Prof Vasile will discuss the mission statement and vision for the next 4 years, and the aims and expectations for the research and training that form the cornerstone of this European innovative training network.
10:30-12:00
Introduction to Optimisation
Edmondo Minisci, Annalisa Riccardi, Kerem Akartunali, University of Strathclyde
This lecture provides an introduction to optimisation. In particular, the formulation of continuous and combinatorial optimisation problems, single- and multi-objective, will be presented in their mathematical formulation and fundamental theory, and an overview of the available algorithms for local and global continuous optimisation, and for network and combinatorial optimisation, will be given. At the end of the lecture students will have acquired the terminology and a basic understanding of the theory and techniques needed to treat optimisation problems.
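As a minimal illustration of the kind of problem formulation covered here (a hedged Python sketch; the test function and solver choice are ours, not the lecturers'): a continuous, single-objective problem with bound constraints, solved by a local method.

```python
# A continuous single-objective problem min f(x) subject to simple bounds,
# solved with a local quasi-Newton method (illustrative only).
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Rosenbrock function: a classic non-convex continuous test problem
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

res = minimize(f, x0=np.array([-1.2, 1.0]), method="L-BFGS-B",
               bounds=[(-2.0, 2.0), (-2.0, 2.0)])
print(res.x, res.fun)  # a local optimum reached from this starting point
```

A global method would instead restart or sample across the whole box; the distinction between local and global strategies is one of the themes of this lecture.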
13:30-15:30
Introduction to uncertainty quantification
Pietro Congedo, INRIA
This course will introduce some basic concepts in the field of uncertainty quantification. First, we will provide some notions illustrating the importance of incorporating uncertainties in order to make reliable computational predictions of physical systems. Then, we will review some basic concepts of probability theory (probability measures, statistical moments, stochastic processes, etc.) that are of interest for uncertainty quantification. Finally, we will illustrate the main steps for propagating uncertainties through a given computational model.
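As a minimal sketch of that final step, forward propagation of an input uncertainty through a model by Monte Carlo sampling (the model and distribution below are hypothetical):

```python
# Forward uncertainty propagation by Monte Carlo: sample the uncertain
# input, push each sample through the model, and summarise the output.
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    return np.sin(x) + 0.5 * x**2   # stand-in for a computational model

x = rng.normal(loc=1.0, scale=0.2, size=10_000)   # uncertain input ~ N(1, 0.2^2)
y = g(x)
print("output mean:", y.mean(), " output std:", y.std())
```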
15:30-17:30
Introduction to polynomial chaos
Olivier Le Maître, CNRS
This short course will introduce stochastic spectral and Polynomial Chaos (PC) methods to approximate model outputs depending on some uncertain input parameters. We shall start by reviewing the essentials of spectral expansions, the properties and the construction of PC bases, before discussing alternative computational strategies for the determination of the expansion coefficients. The discussion on solution methods will emphasise the class of so-called non-intrusive methods (e.g., regression techniques, non-intrusive projection, sparse grid methods), which are well suited to the case of large-scale model applications. We shall finally review the standard sensitivity analyses that can be carried out using the PC expansion.
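A minimal non-intrusive PC sketch in one Gaussian variable, using regression to determine the coefficients (the model and truncation order are hypothetical):

```python
# Polynomial chaos by least-squares regression: expand a model output in
# probabilists' Hermite polynomials He_k of a standard normal germ xi.
import numpy as np
from math import factorial

rng = np.random.default_rng(1)

def model(xi):
    return np.exp(0.3 * xi)   # stand-in for an expensive model

xi = rng.standard_normal(200)                         # samples of the germ
V = np.polynomial.hermite_e.hermevander(xi, deg=4)    # He_0..He_4 at samples
c, *_ = np.linalg.lstsq(V, model(xi), rcond=None)     # expansion coefficients

# mean = c_0; variance = sum_k k! c_k^2 since E[He_k^2] = k! under N(0, 1)
var = sum(factorial(k) * c[k]**2 for k in range(1, 5))
print("PC mean:", c[0], " PC variance:", var)
```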
Tues 21 Nov : Optimisation

The morning will look at more advanced optimisation techniques, while the afternoon will introduce optimisation under uncertainty and statistical methods.

08:30-10:30
An introduction to multi-objective evolutionary optimisation
Boris Naujoks, Technische Hochschule Köln
Most real-world optimisation problems originally face multiple objectives. These could be as simple as quality and cost in production, or lift and drag in aerodynamics, and many more examples are available. As a rather easy way to handle multiple objectives, it is common either to ignore some of them and optimise just with respect to the remaining ones, or to somehow aggregate the objectives into a single one. Ending up with only one objective has the advantage that all well-known optimisation methods are available for solving the problem. However, there are alternative approaches as well.
One rather elegant alternative is set-based optimisation methods like evolutionary algorithms, which handle a set of different solutions according to their performance with respect to all considered objectives. The talk will introduce different ways multi-objective optimisation problems can be handled with evolutionary algorithms, present some results and open issues, and will end with an outlook towards coupling the methods with surrogate models and handling many objectives. Both of these directions are expected to play a major role in UTOPIAE.
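To make the set-based view concrete, a small sketch of Pareto dominance, the relation such methods are built on (the numbers are hypothetical):

```python
# Extract the non-dominated (Pareto) subset of candidate solutions for a
# problem with two objectives, both to be minimised.
import numpy as np

def non_dominated(F):
    """Boolean mask of the non-dominated rows of the objective matrix F."""
    mask = np.ones(F.shape[0], dtype=bool)
    for i in range(F.shape[0]):
        # j dominates i if it is no worse in every objective and better in one
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominates_i.any():
            mask[i] = False
    return mask

F = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(F[non_dominated(F)])   # [3, 3] is dominated by [2, 2] and drops out
```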
10:30-12:30
Sequential parameter optimisation: Concepts and applications
Thomas Bartz-Beielstein, Technische Hochschule Köln
Real-world optimisation problems often have very high complexity, due to multi-modality, constraints, noise or other crucial problem features. Sequential parameter optimisation (SPO) is a heuristic that combines classical and modern statistical techniques for the purpose of efficient optimisation. It can be applied in two ways: (a) to efficiently tune and select the parameters of other search algorithms, or (b) to optimise expensive-to-evaluate problems directly, by shifting the load of evaluations to a surrogate model.
SPO is especially useful in scenarios where no experience of how to choose the parameter settings of an algorithm is available, a comparison with other algorithms is needed, an optimisation algorithm has to be applied effectively and efficiently to a complex real-world optimisation problem, or the objective function is a black box and expensive to evaluate. The SPO Toolbox (SPOT) provides enhanced statistical techniques, such as design and analysis of computer experiments and different methods for surrogate modelling and optimisation, to use SPO effectively in the above-mentioned scenarios. The current SPOT version is a complete redesign and rewrite of the original R package, to which new developments have been added. A Kriging model implementation, based on earlier Matlab code by Forrester et al., has been extended to allow the use of categorical inputs. Additionally, it is now possible to use stacking for the construction of ensemble learners. In this presentation, we show how the new interface of SPOT can be used to efficiently optimise the geometry of an industrial dust filter (cyclone).
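As a generic sketch of the surrogate-assisted idea behind SPO (this is not the SPOT interface; the objective and settings are hypothetical):

```python
# Surrogate-assisted optimisation loop: fit a Gaussian-process model to a
# few expensive evaluations, then let cheap surrogate searches propose the
# next point to evaluate. Real SPO uses infill criteria that also reward
# exploration; here we simply minimise the surrogate prediction.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(2)

def expensive(x):
    return float((x - 0.3)**2 + 0.1 * np.sin(20 * x))   # costly black box

X = rng.uniform(0, 1, size=(5, 1))                      # small initial design
y = np.array([expensive(v) for v in X.ravel()])

for _ in range(10):
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    res = minimize(lambda x: gp.predict(x.reshape(1, -1))[0],
                   x0=rng.uniform(0, 1, 1), bounds=[(0, 1)])
    X = np.vstack([X, res.x.reshape(1, -1)])            # evaluate the proposal
    y = np.append(y, expensive(res.x[0]))

print("best found:", X[np.argmin(y)].item(), y.min())
```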
13:30-14:30
Control parameters in evolutionary optimisation
Gregor Papa, Jožef Stefan Institute
Evolutionary computation algorithms, i.e. algorithms inspired by biological evolution, take an important position among optimisation techniques designed to solve difficult optimisation problems that are nonlinear, non-convex, multi-modal and non-differentiable in the continuous parameter space, and that might even require solving multiple contradictory objectives. Evolutionary algorithms (EAs) apply principles of evolution found in nature, such as reproduction, mutation, recombination, and selection. EAs are driven by control parameters, which are crucial for their efficient performance. The best control parameter values depend on the problem, and with smart encoding the level of problem difficulty might change. In addition, EAs should be robust. While fine-tuned algorithm parameters sometimes allow robust behaviour, adaptive control is required to exploit and explore the search space more effectively. This includes adaptive mechanisms on control parameters, since the best control parameter values depend on the current state of the optimisation process and thus change over time. As the adaptation of the control parameters depends on different scenarios, one should consider the influence of deterministic, adaptive and self-adaptive rules. Automatic setup of algorithms is one of the prerequisites for ease of use of complex industrial optimisation tools.
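A minimal sketch of the simplest of these scenarios, a deterministic control rule (the schedule and test function are hypothetical):

```python
# A (1+1) evolution strategy whose mutation step size follows a fixed,
# time-dependent (deterministic) control rule.
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    return float(np.sum(x**2))        # sphere function, to be minimised

x = rng.uniform(-5, 5, size=5)
for t in range(1, 501):
    sigma = 1.0 / np.sqrt(t)          # deterministic rule: step size decays
    child = x + sigma * rng.standard_normal(5)
    if f(child) <= f(x):              # plus-selection: keep the better point
        x = child

print("final fitness:", f(x))
```

Adaptive rules would instead update sigma from feedback such as the success rate, while self-adaptive rules encode sigma in the individuals themselves.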
14:30-15:30
Multi-level approach in multi-objective optimisation
Peter Korošec, Jožef Stefan Institute
Algorithms need to find high-quality solutions to time-consuming problems quickly. Most real-world optimisation problems require time-consuming evaluations, which leaves optimisation algorithms very few evaluations to find a high-quality solution in reasonable time. A multi-level approach is an efficient way to speed up the search process. With a multi-level approach, we reduce the accuracy of the algorithm at the beginning of the search and gradually increase it throughout the search process. The accuracy is reduced by coarsening search parameters or simplifying the problem evaluations (e.g., by using quick but less accurate surrogate models). During the search, the accuracy is improved by refining search parameters or improving the quality of problem evaluations. Both approaches, which can be combined, lead to a quicker and more efficient search process.
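A minimal sketch of this coarse-to-fine schedule (the noisy evaluation and the search itself are hypothetical stand-ins):

```python
# Multi-level search: start with cheap, low-accuracy evaluations and refine
# both the evaluation accuracy and the step size as the search progresses.
import numpy as np

rng = np.random.default_rng(4)

def evaluate(x, n_samples):
    # accuracy grows with n_samples (stand-in for a tunable-fidelity model)
    return (x - 2.0)**2 + 0.5 * rng.standard_normal(n_samples).mean()

x = 0.0
for level, n_samples in enumerate([10, 100, 10_000]):   # coarse -> fine
    best = evaluate(x, n_samples)      # re-assess the incumbent at this level
    for _ in range(50):                # simple random search at this level
        cand = x + rng.normal(scale=1.0 / (level + 1))
        val = evaluate(cand, n_samples)
        if val < best:
            x, best = cand, val

print("final x:", x)   # should approach the optimum at x = 2
```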
15:30-16:30
Bayesian techniques for epistemic UQ of RANS models
Wouter Edeling, Stanford University
Despite the increasing availability of high-performance computational resources, Reynolds-Averaged Navier-Stokes (RANS) closure models are projected to remain a widely used option for the prediction of turbulent flows. However, it is well known that the resulting predictions are potentially sensitive to parametric and model-form uncertainty.
The former concerns imperfectly known closure coefficients, while the latter deals with the uncertainty due to assumptions made in the mathematical formulation of the model itself. We will investigate various ways in which Bayesian data assimilation can be used to combine (experimental) data with the RANS models, with the goal of obtaining flow predictions with quantified uncertainty. As a first step we will outline the various components required for a Bayesian calibration of a single closure model, which results in a posterior probability density function of the closure coefficients. This includes a short discussion of Bayes’ theorem, likelihood specification and Markov chain Monte Carlo sampling. Once the calibration is finished, a critical next step is the use of the obtained posterior distribution in a predictive setting, when no reference data is available. However, performing a calibration on a single model, using only one data set, introduces a bias which might not extrapolate well to flow scenarios with a very different topology. Ensemble methods can be used to reduce the bias introduced by the choice of model and calibration scenario. We will discuss a Bayesian method able to combine multiple closure models and data sets into a single stochastic model. While we focus on turbulence models in this lecture, the presented methods are general and can in principle be applied to all simulation codes subject to uncertainty.
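As a minimal sketch of the calibration step (a toy stand-in for a RANS closure, with synthetic data and a random-walk Metropolis sampler):

```python
# Bayesian calibration of a single hypothetical closure coefficient theta:
# Gaussian likelihood, uniform prior, random-walk Metropolis sampling.
import numpy as np

rng = np.random.default_rng(5)

def model(theta, x):
    return theta * x**2                       # stand-in quantity of interest

x_obs = np.linspace(0, 1, 20)
y_obs = model(1.7, x_obs) + rng.normal(scale=0.05, size=20)  # synthetic data

def log_post(theta):
    if not 0.0 < theta < 5.0:                 # uniform prior on (0, 5)
        return -np.inf
    r = y_obs - model(theta, x_obs)
    return -0.5 * np.sum(r**2) / 0.05**2      # Gaussian log-likelihood

theta, chain = 1.0, []
for _ in range(20_000):
    prop = theta + rng.normal(scale=0.05)     # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                          # accept
    chain.append(theta)

print("posterior mean of theta:", np.mean(chain[5_000:]))   # drop burn-in
```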
16:30-17:30
Confidence in measurement: standards, traceability and uncertainty quantification
Alistair Forbes, UK National Physical Laboratory
Wed 22 Nov : Statistical methods

Introduction to imprecise probability (IP) theories and statistical modelling, and some applications.

08:30-09:30
Introduction to Imprecise Probability and IP-based statistical methods
Frank Coolen, University of Durham
This lecture will introduce basic ideas of imprecise probability (IP), generalizing the classical (precise) theory of probability as a tool for uncertainty quantification. IP will be motivated from several perspectives, including practical aspects like limited or conflicting information and decision support. Some main advances in the field will be highlighted, together with the many challenges remaining. Some statistical methods based on IP will be introduced with examples to illustrate advantages.
09:30-10:30
Introduction to system reliability with imprecise probability
Frank Coolen, University of Durham
This lecture will introduce basic concepts of the theory of system reliability, including the survival signature, which enables uncertainty quantification for larger systems. The use of IP-based statistical methods for system reliability will be introduced. Several open problems will be discussed, in particular in relation to uncertainties to be considered in UTOPIAE projects, e.g. what theory and methods may be required for meaningful uncertainty quantification of system reliability at early design stages.
10:30-11:30
Statistical methods for system reliability
Louis Aslett, University of Durham
This lecture will introduce both non-parametric and parametric Bayesian modelling of system reliability, making use of the survival signature. This will be strongly grounded in implementation using the R programming language, showing how a system design can be expressed so that the survival signature may be computed automatically and component reliabilities or parameter posteriors inferred from test data. This will then be extended to an imprecise Bayesian nonparametric setting where flexible prior sets may be specified over component reliabilities.
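As a small illustration of the survival signature itself (a Python sketch rather than the R implementation discussed in the lecture; the system is a hypothetical 2-out-of-3 arrangement of exchangeable components):

```python
# System reliability from a survival signature with one component type:
# phi[l] = P(system works | exactly l of the m components work).
from math import comb

def system_reliability(phi, m, p):
    """Mix phi over the binomial number of working components (reliability p)."""
    return sum(phi[l] * comb(m, l) * p**l * (1 - p)**(m - l)
               for l in range(m + 1))

phi = [0.0, 0.0, 1.0, 1.0]                    # 2-out-of-3 survival signature
print(system_reliability(phi, m=3, p=0.9))    # 0.972
```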
11:30-12:30, 13:30-14:30
Imprecise Markov chains: from basic theory to applications - Gert de Cooman
Imprecise Markov chains: from basic theory to applications - Jasper de Bock
Gert de Cooman, Jasper de Bock, Ghent University
This lecture provides an initiation into the theory of imprecise Markov chains. A Markov chain is a stochastic process that can be, at any time, in a number of possible states. It is special in that it satisfies a conditional independence (or Markov) condition: given the present state, the future states are stochastically independent of the past states. This guarantees that the behaviour of such a stochastic process is completely determined by (i) a probability distribution for the initial states, and (ii) a ‘transition’ matrix describing the probabilities of going from one state to the next at a given time. In an imprecise Markov chain, this is modified or weakened in three ways: (a) the initial and transition probabilities are allowed to be imprecisely specified, (b) the transition probabilities are allowed to depend on time, and (c) the stochastic independence in the Markov condition is weakened to a so-called epistemic irrelevance condition.

In the first part, we discuss the basics of imprecise Markov chains in discrete time. We show how to describe such a system mathematically, and how to efficiently perform inferences about the time evolution of such systems. We discuss the similarities and differences with (precise) Markov chains. We also study the so-called stationary, or long-term, behaviour of imprecise Markov chains, and its relation to the notion of ergodicity.
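As a small sketch of such an inference (hypothetical numbers; the credal sets are described by interval bounds on the transition probabilities, and the lower transition operator is applied backwards in time):

```python
# Lower expectation in a two-state imprecise Markov chain: iterate the lower
# transition operator, solving a small linear programme per state and step.
import numpy as np
from scipy.optimize import linprog

# interval bounds on the transition probabilities out of each state
lo = np.array([[0.6, 0.2], [0.3, 0.5]])
hi = np.array([[0.8, 0.4], [0.5, 0.7]])

def lower_transition(f):
    """Apply the lower transition operator to f (one value per state)."""
    out = np.empty_like(f)
    for s in range(len(f)):
        # minimise p . f over {p : lo[s] <= p <= hi[s], sum(p) = 1}
        res = linprog(c=f, A_eq=[[1.0, 1.0]], b_eq=[1.0],
                      bounds=list(zip(lo[s], hi[s])))
        out[s] = res.fun
    return out

f = np.array([1.0, 0.0])        # indicator of state 0
for _ in range(3):
    f = lower_transition(f)
print(f)   # lower probability of being in state 0 three steps ahead
```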
14:30-15:30
Response surface methods
Mariapia Marchi, ESTECO
Response Surface Models (RSMs) or metamodels are statistical and numerical models that approximate the relationship between multiple input variables and an output (response) variable. This lecture introduces RSMs and their importance for engineering design optimisation. Basic concepts such as approximating versus interpolating models and overfitting are presented. After discussing some metrics to assess metamodel accuracy (performance indices), a few RSM methods are presented in more detail: least squares (linear and non-linear), radial basis functions, Kriging and neural networks. As an application, an aerodynamic guidance study of an SUV-type vehicle is shown.
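A minimal interpolating-metamodel sketch with radial basis functions (the true response and design are hypothetical):

```python
# Fit an RBF response surface to a small design of experiments, then compare
# metamodel predictions against the true response at new points.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(6)

def response(X):
    return np.sin(3 * X[:, 0]) + X[:, 1]**2    # stand-in for the true response

X_train = rng.uniform(-1, 1, size=(30, 2))     # design of experiments
rsm = RBFInterpolator(X_train, response(X_train))

X_test = rng.uniform(-1, 1, size=(5, 2))
print(np.c_[rsm(X_test), response(X_test)])    # prediction vs truth
```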
Thu 23 Nov : Optimisation under uncertainty with applications

Robust and reliability based design optimisation, model reduction and examples within the aviation sector.

08:30-09:30
Statistical modelling and regularisation
Jochen Einbeck, University of Durham
In regression models where the number of input variables is large (possibly larger than the number of observations), standard estimation techniques such as least squares or maximum likelihood become infeasible or impractical. This lecture discusses a modern regression technique, known under the term LASSO, which produces sparse solutions by shrinking the coefficients of non-relevant variables to 0, hence implicitly performing variable selection. This is achieved through a regularisation term which imposes a penalty on the total size of the coefficients. A Bayesian version of the LASSO is also introduced which allows for uncertainty quantification, both in terms of the variable selection and the parameter estimation. Possible connections to IP are briefly touched upon.
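A minimal LASSO sketch with more variables than observations (data simulated for illustration):

```python
# The L1 penalty shrinks coefficients of irrelevant inputs to exactly zero,
# performing variable selection implicitly.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
n, p = 50, 100                                   # fewer observations than inputs
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]                      # only three relevant inputs
y = X @ beta + rng.normal(scale=0.5, size=n)

fit = Lasso(alpha=0.1).fit(X, y)
print("selected variables:", np.flatnonzero(fit.coef_))   # ideally [0 1 2]
```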
09:30-10:30
Robust design optimisation, uncertainty quantification and design for reliability
Mariapia Marchi, ESTECO
This lecture reviews the motivation of using uncertainty quantification methods in the context of engineering design optimisation and presents some of the challenges of optimisation under uncertainty. A theoretical introduction to robust and reliability-based design optimisation problems and (sparse) polynomial chaos approaches is provided. A few practical examples are shown.
10:30-12:30
Model reduction for high dimensional robust design
Stefan Goertz, DLR
This lecture will start with some aerodynamic examples that motivate the use of efficient non-intrusive methods for quantifying output uncertainty using numerical simulation. Both operational and geometrical input uncertainties will be considered. The parameterisation of geometrical uncertainties for airfoils and wings is discussed, leading to high-dimensional UQ problems. Dimensionality reduction and reduced-order modelling methods are introduced to tackle problems with high-dimensional input and output uncertainties. Building on the UQ methods, stochastic optimisation techniques are applied to allow geometric variability to be considered in the design process and to permit the robust design of airfoils, i.e., designs that are less sensitive to small random perturbations.
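A minimal sketch of one such reduction, proper orthogonal decomposition of a snapshot matrix (the snapshots are synthetic):

```python
# POD / PCA by SVD: the leading left singular vectors of centred snapshots
# give a low-dimensional basis for high-dimensional outputs.
import numpy as np

rng = np.random.default_rng(8)
# hypothetical snapshots: 500-dimensional outputs for 40 sampled inputs
S = np.outer(np.sin(np.linspace(0, np.pi, 500)), rng.standard_normal(40))
S += 0.01 * rng.standard_normal(S.shape)

Sc = S - S.mean(axis=1, keepdims=True)
U, sv, _ = np.linalg.svd(Sc, full_matrices=False)

energy = np.cumsum(sv**2) / np.sum(sv**2)
r = int(np.searchsorted(energy, 0.99)) + 1   # modes capturing 99% of variance
basis = U[:, :r]                             # reduced basis for new outputs
print("reduced basis size:", r)              # one dominant mode in this toy case
```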
13:30-14:30
Using expert knowledge elicitation to quantify uncertainty
Abigail Colson, University of Strathclyde
Quantifying uncertainty is unavoidable and something that must be done for almost every realistic problem involving decision-making. In particular, quantifying uncertainty for future events or where no data has been observed is becoming increasingly unavoidable and presents many challenges. For over thirty years, researchers from mathematics, philosophy, psychology, statistics, etc., have been working on developing protocols that ensure that expert judgement is treated as scientific data subjected to methodological rules for quality and consistency. This presentation aims to give an overview of the methodological requirements for eliciting structured expert judgements and the different contexts when expert judgement may be necessary. It will also discuss the challenge of validating expert judgements.
14:30-16:30
In-flight icing: modelling, prediction and uncertainties
Alberto Guardone, Politecnico di Milano
An overview of the modelling, numerical and experimental techniques for in-flight icing in aeronautical applications is given. First, the physical background on ice accretion over wings, engine nacelles, rotor blades and appendages is set out. Modelling of in-flight ice accretion is then presented, with particular reference to the ice accretion process described by the Stefan problem. Different ice accretion models are discussed and compared. Particle tracking techniques to compute the impact point of the cloud droplets are also presented, together with numerical issues related to the icing solver. Experimental activities in the field are also reported. State-of-the-art anti-ice, de-ice and ice mitigation systems are briefly discussed, together with open modelling issues related to icing prediction, including the modelling of Supercooled Large Droplets (SLD), of super-hydrophobic surfaces and of ice shedding. A preliminary assessment of the uncertainties influencing the ice accretion problem is given.
16:30-17:30
Uncertainty management application to design load setting & robust wing twist MDA on A330 NEO
Sanjiv Sharma, Airbus Operations
The demands on the design and engineering of aircraft are increasing on several fronts: for example, responding to market situations in a timely manner; addressing requirements and expectations for ever-increasing performance; and evaluating the risks, opportunities and values of embedding new technologies into the product. To address these demands, a Set-Based Design approach under uncertainty for early design evaluations is presented. The challenges associated with these early evaluations are illustrated using an industrial case study based on the A330-NEO design loads and wing twist analyses. The talk concludes with topics that require further research to address early analyses in a multi-disciplinary systems engineering context.
Fri 24 Nov : Optimisation under uncertainty with applications

Uncertainty quantification methods in CFD and optimisation under uncertainty with imprecise probabilities.

08:30-10:30
Risk measures in the context of robust and reliability based optimisation
Domenico Quagliarella, CIRA
Many industrial optimisation processes must take account of the stochastic nature of the systems and processes to be designed or re-designed, and have to consider the random variability of some of the parameters that describe them. Thus, it is necessary to characterise the system under study from various points of view related to the treatment of uncertainty. This talk concerns the use of various risk measures in the context of robust and reliability-based optimisation. We start from the definition of a risk measure and its formal setting, and then show how different risk functional definitions can lead to different approaches to the problem of optimisation under uncertainty. In particular, the application of value-at-risk (VaR) and conditional value-at-risk (CVaR), also called quantiles and superquantiles, is illustrated. These risk measures originated in financial engineering, but they are naturally suited to robust and reliability-based design optimisation problems and represent a possible alternative to more traditional robust design approaches. We will then discuss the implementation of an efficient risk-measure-based optimisation algorithm built on the introduction of the Weighted Empirical Cumulative Distribution Function (WECDF) and on the use of methods for changing the probability measure. We will also discuss the problems related to the error in the estimation of the risk function and illustrate the “bootstrap” computational statistics technique for estimating the standard error of VaR and CVaR. Finally, we will report some application examples of this approach to robust and reliability-based optimisation.
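A minimal sketch of the empirical estimators and the bootstrap standard error mentioned above (the loss samples are synthetic):

```python
# Empirical VaR (alpha-quantile) and CVaR (tail mean beyond VaR), with a
# bootstrap estimate of the standard error of both risk measures.
import numpy as np

rng = np.random.default_rng(9)
losses = rng.lognormal(mean=0.0, sigma=0.5, size=2_000)   # hypothetical QoI
alpha = 0.95

def var_cvar(x):
    v = np.quantile(x, alpha)
    return v, x[x >= v].mean()

v, c = var_cvar(losses)
boots = np.array([var_cvar(rng.choice(losses, size=losses.size))
                  for _ in range(500)])                   # resample w/ replacement
print(f"VaR={v:.3f} (se {boots[:, 0].std():.3f}), "
      f"CVaR={c:.3f} (se {boots[:, 1].std():.3f})")
```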
10:30-12:30
UQ challenges in high enthalpy flow ground testing
Thierry Magin, VKI
13:30-14:30
Evidence-Based Robust Optimisation: An introduction
Massimiliano Vasile, University of Strathclyde
This lecture introduces basic concepts of Dempster-Shafer Theory of Evidence and how to pair it with global optimisation algorithms to compute robust and reliable solutions under epistemic uncertainty. The lecture focuses on the computational aspects, with particular emphasis on techniques to reduce the computational complexity and obtain an approximated solution in reasonable time. After introducing the basic concepts of Belief and Plausibility and their formal computation, the lecture will formulate three different optimisation problems under uncertainty: evidence-based robust optimisation, evidence-based reliability optimisation and resilience optimisation under epistemic uncertainty.
The lecture will then present a few algorithms to compute worst-case scenario solutions and to optimise the Belief in the realisation of a particular optimal solution. A few practical examples will conclude the lecture.
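As a minimal sketch of the formal computation of Belief and Plausibility (interval focal elements with hypothetical basic probability assignments):

```python
# Bel(A) sums the masses of focal elements contained in A; Pl(A) sums the
# masses of focal elements that merely intersect A.
focal = [((0.0, 1.0), 0.5),    # (interval focal element, mass)
         ((0.5, 2.0), 0.3),
         ((1.5, 3.0), 0.2)]

def bel_pl(a, b):
    bel = sum(m for (lo, hi), m in focal if a <= lo and hi <= b)
    pl = sum(m for (lo, hi), m in focal if lo <= b and a <= hi)
    return bel, pl

print(bel_pl(0.0, 2.0))   # (0.8, 1.0): Bel <= P(A) <= Pl for A = [0, 2]
```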
14:30-15:30
Filtering and state estimation: Basic concepts
Massimiliano Vasile, University of Strathclyde
The lecture will present some basic concepts of sequential filtering and state estimation. Starting from the idea underlying particle filtering and the associated inference process, the lecture will then present other types of sequential filtering, including linear and unscented Kalman filtering and H-infinity filtering. The lecture will revisit some of these state estimation techniques from an Uncertainty Quantification perspective and will show some examples of applications to space problems.
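A minimal linear Kalman filter sketch in this spirit (a 1-D constant-velocity model with noisy position measurements; all numbers illustrative):

```python
# Linear Kalman filter: alternate a model-based prediction with a
# measurement update weighted by the Kalman gain.
import numpy as np

rng = np.random.default_rng(10)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
H = np.array([[1.0, 0.0]])              # only position is measured
Q = 0.01 * np.eye(2)                    # process noise covariance
R = np.array([[0.25]])                  # measurement noise covariance

x, P = np.zeros(2), np.eye(2)           # initial estimate and covariance
truth = np.zeros(2)
for _ in range(50):
    truth = F @ truth + rng.multivariate_normal(np.zeros(2), Q)
    z = H @ truth + rng.normal(scale=0.5, size=1)   # noisy measurement
    x, P = F @ x, F @ P @ F.T + Q                   # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x = x + K @ (z - H @ x)                         # update
    P = (np.eye(2) - K @ H) @ P

print("estimate:", x, " truth:", truth)
```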