Specialisation Stochastics AM 2023 en
Year 2023/2024
Organization Electrical Engineering, Mathematics and Computer Science
Education Master Applied Mathematics
All of the compulsory specialisation courses can also be chosen as a recommended specialisation course.
Compulsory Courses
WI4455 Statistical Inference 6
Responsible Instructor Prof.dr.ir. G. Jongbloed
Co-responsible for assignments T.F.W. van der Jagt
Contact Hours / Week 4/4/0/0
Education Period 1, 2
Start Education 1
Exam Period 2, 3
Course Language English
Expected prior knowledge - Introductory statistics and probability course at bachelor level (such as: various probability distributions, central limit theorem,
estimators, maximum likelihood estimation, frequentist hypothesis testing).
- Familiarity with concepts from measure theoretic probability.
Course Contents This course aims to provide a concise but comprehensive account of the essential elements of statistical inference and theory.
The course focusses on a clear presentation of the main concepts and results underlying different frameworks of inference.
Study Goals The topics correspond to chapters in the lecture notes.
1. Preliminaries:
- The student is able to reproduce the definition of a statistical model, possibly defined in terms of densities.
- The student is able to explain the unifying concept of exponential families (the precise form of the density will be given at the exam).
- The student is able to assess fundamental differences between the classical and Bayesian approaches to problems in the given examples.
- The student is able to distinguish between the different notions of stochastic convergence and is able to apply the Delta method.
4. Bayesian analysis:
- The student is able to explain the Bayesian approach towards statistical inference and can apply Bayes' rule to do Bayesian computations in case a closed-form solution is available (a minimal illustrative sketch follows this list).
- The student is able to describe the concept of exchangeability and its relation to justifying Bayesian analysis.
- The student can derive Jeffreys prior in a given problem setting.
- The student can construct hierarchical models.
- The student can explain how empirical Bayes methods are defined, and can compute empirical Bayes estimators in a given problem setting.
- The student can state the Bernstein-von Mises theorem and its implications.
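Purely as an illustration of the closed-form Bayesian computation mentioned in the study goals above (not part of the official course material), a minimal conjugate Beta-Binomial sketch in Python, with hypothetical prior parameters and data:

```python
# Illustrative sketch only (not official course material): conjugate
# Beta-Binomial updating, a closed-form Bayesian computation of the kind
# referred to in the study goals. Prior parameters and data are hypothetical.
from scipy import stats

a, b = 2.0, 2.0             # Beta(a, b) prior on the success probability
successes, trials = 7, 10   # observed binomial data

# Conjugacy: the posterior is Beta(a + successes, b + failures).
posterior = stats.beta(a + successes, b + (trials - successes))

print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```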
Education Method Each week there will be a 2-hour lecture and a scheduled meeting where exercises from the lecture notes are discussed.
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
WI4052 Risk Analysis 6
Responsible Instructor Dr. D. Kurowicka
Contact Hours / Week 2/2/0/0
Education Period 1, 2
Start Education 1
Exam Period 2, 3
Course Language English
Expected prior knowledge Basic Probability and Statistics
Summary The risk analysis course treats problems that often appear in applications. Some generic methods that are applied in many different areas of application will be discussed. We will not concentrate on any particular application; rather, we will discuss important assumptions made in different models and their applicability to different real-life problems. Moreover, we show how the parameters of these models can be obtained from data and discuss methods for validating the obtained model.
Course Contents Risk modelling, life distributions, event trees, fault trees, Bayesian Belief Nets, reliability data bases, dependence modelling,
software reliability, expert judgment.
Course Contents Continuation Introduction & Uncertainty
Probability
Multivariate, Copula
Stochastic processes
Bayesian and classical estimation & testing
Fault trees
Bayesian belief nets
Dependent failures
Expert Judgment
Reliability data bases
Software reliability
Study Goals Students must be able to justify use of probabilistic methods in risk analysis and apply techniques like fault trees and Bayesian
belief nets. They should know how to estimate parameters of models for dependent failure rates. Moreover they should be able to
perform competing risks analysis for simple reliability data sets. Students should understand basic techniques of incorporating
expert knowledge into risk models.
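As a purely illustrative sketch of the fault-tree technique mentioned above (assuming independent basic events; the probabilities are hypothetical and not taken from the course material):

```python
# Illustrative sketch only: top-event probability of a small fault tree
# with independent basic events (an assumption), combining AND/OR gates.

def and_gate(*probs):
    # AND gate: all inputs must fail, so multiply failure probabilities.
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    # OR gate: at least one input fails, i.e. 1 - P(none fails).
    none_fail = 1.0
    for p in probs:
        none_fail *= (1.0 - p)
    return 1.0 - none_fail

# Hypothetical basic-event failure probabilities.
pump_a, pump_b, valve = 0.01, 0.02, 0.005

# Top event: both redundant pumps fail, or the valve fails.
top_event = or_gate(and_gate(pump_a, pump_b), valve)
print(f"top-event probability: {top_event:.6f}")
```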
Study Goals continuation The goal of the course is to expose students to problems appearing in applications. During lectures the theory will be treated and exercises will be provided. Students will be required to complete an assignment in which some practical problems have to be solved.
The theoretical exercises are not required. However, making them allows students to get feedback and extra explanation on how they should be solved. These exercises, together with the course material, constitute the basis for the exam in January.
Practical exercises are required and will form part of your final grade.
Education Method Lectures and Assignments
Literature and Study Materials Parts of the book "Probabilistic Risk Analysis", Bedford & Cooke, Cambridge University Press, 2001. Extra material on Brightspace.
Reader Brightspace
Assessment Written exam (100%).
In case of an insufficient result, repair opportunities may be offered in accordance with TER Implementation Regulations Art 5,
sub 5
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
Permitted Materials during Tests Calculator and a self-made A4 sheet with formulas.
Tags Mathematics
Stochastics
WI4230 Time Series & Extreme Value Theory 6
Responsible Instructor Dr. rer. nat. F. Mies
Contact Hours / Week 0/0/2/2
Education Period 3, 4
Start Education 3
Exam Period 4, 5
Course Language English
Expected prior knowledge Knowledge of probability and statistics at the level of the BSc Applied Mathematics courses on these topics (AM1080,
AM2080).
Course Contents The course consists of two parts: Extreme Value Theory and Time Series.
The course considers questions such as: What is the chance that the price of oil will be higher than $120/barrel in the coming week? Can we expect UK interest rates to rise next year? How can we assess whether global warming is occurring? What is the chance that the daily loss on the stock price of ABN AMRO is larger than, say, 20%, which has never occurred before? How do we determine the height of a dike such that the probability of a flood, i.e., the water level exceeding the dike, in a given year is 1/10,000? What is the impact of a dramatic decline in the US stock market on the DAX 30 index?
In this course students will build up some theoretical basis, mainly probabilistic results within the area of time series and extreme
value theory. Moreover, students will learn statistical methods for dealing with non-stationary data, multivariate data and rare
events and assessing extreme risks for data from various fields such as finance, actuarial science, hydrology and environmental
science.
Furthermore, the student will be able to perform a statistical analysis with the statistical package R.
The Extreme Value Theory part considers topics such as max-domains of attraction, the peaks-over-threshold approach, estimation of the extreme value index, tail probabilities and high quantiles, and tail dependence and independence (a minimal peaks-over-threshold sketch follows this course description).
The Time Series part will cover topics such as ARMA models, spectral analysis, non-stationarity, GARCH models, state space
models.
Implementations of statistical methods in R will be part of the course. Examples of real applications from environmental science
and finance will also be discussed.
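A minimal peaks-over-threshold sketch (written in Python rather than the course's R, on simulated data; purely illustrative and not taken from the course material):

```python
# Illustrative sketch only: peaks-over-threshold estimation of a high
# quantile by fitting a generalized Pareto distribution to exceedances.
# The data are simulated heavy-tailed observations (hypothetical).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
x = rng.standard_t(df=4, size=10_000)     # hypothetical heavy-tailed sample

u = np.quantile(x, 0.95)                  # threshold: empirical 95% quantile
exc = x[x > u] - u                        # exceedances over the threshold

# Fit the GPD to the exceedances (location fixed at 0).
xi, _, sigma = genpareto.fit(exc, floc=0)

# POT estimator of the quantile with exceedance probability p,
# with k exceedances out of n observations.
p = 1e-4
n, k = len(x), len(exc)
q_hat = u + sigma / xi * ((k / (n * p)) ** xi - 1.0)
print(f"estimated quantile with exceedance probability {p}: {q_hat:.2f}")
```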
Study Goals Specific learning objectives for the Extreme Value Theory part are:
1. Validate the max-domain of attraction conditions, that is, to show whether a given distribution is in a max-domain of attraction or not.
3. Develop estimators for tail probabilities, high quantiles and the m-year return level, using different methods.
5. Evaluate the performance of estimation methods by simulation study; and apply the estimation methods to a real data set.
6. Establish the properties of exponent measure and spectral measure. Establish the link between the bivariate extreme value
distribution and these two measures.
Specific learning objectives for the Time Series part are:
1. Discern whether a given time series data set can be associated with a stationary time series, and if not, perform suitable transformations so that the resulting data correspond to a stationary time series.
2. Understand and interpret theoretical properties of AR(I)MA time series models, such as the existence and uniqueness of solutions, causality and invertibility.
3. Understand and determine descriptive statistics of time series models, such as best linear predictors, autocorrelation function,
partial autocorrelation function, estimators for mean and variance, and spectral density function.
4. Have general fluency in dealing with standard manipulations in time series, such as working with linear filters, computing
conditional expectations and solving recurrence relations.
5. Fit a suitable time series model to time series data and assess the quality of a time series fit.
6. Have basic knowledge of advanced time series models used in financial mathematics, such as GARCH and stochastic
volatility models.
Education Method Lectures, exercises and a computer assignment.
Books The lectures will be mainly based on:
- Time Series Analysis and Its Applications: With R Examples
By Robert H. Shumway, David S. Stoffer
- Time Series: Theory and Methods
By Brockwell, Peter J. and Davis, Richard A.
- Extreme Value Theory: An Introduction
By Laurens de Haan and Ana Ferreira (Chapters 1, 3, 4, 6 and 7)
Assessment There will be a final assessment for the extreme value theory part at the end of Q3 and a final (written or oral) assessment for the time series part at the end of Q4. (If there are at least 10 students, we will organize a written exam.)
There will also be assignments handed out during the course. The final grade can be determined by a weighted sum, with the following weights:
0.35 exam time series
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
WI4465 Advanced Topics in Probability 6
Responsible Instructor Prof.dr. F.H.J. Redig
Contact Hours / Week 2/2/0/0
Education Period 1, 2
Start Education 1
Exam Period 1, 2
Course Language English
Expected prior knowledge A first course in probability theory (discrete random variables, law of large numbers, central limit theorem) suffices. Knowledge
of stochastic processes like Markov processes and martingales is helpful.
General mathematical maturity (e.g. Bachelor in (Applied) Mathematics) is highly advised.
Course Contents Even years: Random graphs and complex networks.
Random graphs have become one of the most studied models in probability theory. Understanding the behaviour of large networks (e.g. social media, financial transactions, communication networks) has become of paramount importance for practitioners and scientists alike. In this course, we aim at providing a general theory to answer the most immediate questions about large random graphs.
We will begin by recalling some tools of discrete probability (moment methods, Markov's and Chebyshev's inequalities), and learn some new tools: coupling and stochastic domination. We will then study Erdős–Rényi graphs, one of the first examples of random networks in probability. We will then move on to more modern models of complex networks: the configuration model, preferential attachment graphs and random geometric graphs.
Odd years: Interacting particle systems.
This course provides an introduction to the theory of interacting particle systems, a family of interacting Markov processes used to model many real-world phenomena such as the spread of an infection, the evolution of opinions in a population, traffic, the transport phenomena of interacting molecules and the temperature-dependent behaviour of magnetic systems.
The course treats the following subjects:
1. Markov processes in continuous time, generators, semigroups, martingales.
2. Basic techniques of interacting particle systems:
a) Coupling, monotonicity and positive correlations.
b) Duality.
3. These techniques will be applied in the study of the exclusion process, a basic interacting particle system used in non-equilibrium statistical physics, biology and the modelling of traffic.
Course Contents Continuation Random Graphs part: The following topics will be discussed. Note that this is only a tentative schedule, and the pace at which the material will be covered will also depend on the class.
Real life complex networks and their properties, main modelling approaches.
Probabilistic toolkit: Notions of convergence, coupling, stochastic ordering, Probabilistic bounds (1 lecture)
Branching processes (0.5-1 lecture)
Phase transition in the Erdos-Renyi random graph (1-2 lectures)
Configuration model (intro 0.5 lecture):
loops and multiple edges (0.5 lecture)
erased configuration model (0.5 lecture)
generating simple graphs
connection to number of simple graphs with prescribed degree sequence (0.5 lecture)
ultra-small world property with power-law degrees. (1 lecture)
Additional topics, if time allows, might include the small-world phenomenon, etc.
New models: GIRGs, scale-free percolation, etc.
Study Goals Random Graphs part: The general goal is to make the student familiar with the probabilistic toolkit of studying random graphs
and branching processes, and to get to learn the most commonly used network models.
0) The student is able to apply the following probabilistic tools in the setting of random graphs: introducing indicator variables,
Markov's and Chebyshev's inequality, first and second moment method, notions of convergence of random variables, coupling
and stochastic domination.
1) The student knows the definition of Galton-Watson trees and is able to derive properties of the graph by identifying its regime (subcritical, critical, supercritical).
2) The student can recognise an Erdős–Rényi random graph and determine the size of its largest components (an illustrative simulation sketch follows these study goals). They are able to use stochastic domination and branching processes to compute expectations and variances of observables of the graph.
3) The student recognizes a configuration model, and knows efficient algorithms to generate them. They can determine and
estimate basic observables: number of edges, self loops, probability of simplicity, and the relation to the number of simple
graphs with given degree sequence.
4) The student knows the definition of preferential attachment models. They can apply martingale methods to determine the
growth of individual degrees of vertices and give a recursion-based proof to determine the degree distribution.
Learning objectives
1. Learn the basic techniques of interacting particle systems and Markov process theory:
generators, semigroups, ergodicity, monotonicity, duality, coupling, graphical representations.
2. Being able to apply these techniques in the symmetric exclusion process and related models.
3. Being able to read a research paper with up to date techniques in this area and give a presentation about it.
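An illustrative simulation sketch for study goal 2 (assuming the networkx package; not part of the official course material): the fraction of vertices in the largest component of G(n, c/n) jumps as the mean degree c passes 1.

```python
# Illustrative sketch only: largest component of the Erdos-Renyi graph
# G(n, c/n) around the phase transition at mean degree c = 1.
import networkx as nx

n = 20_000
for c in (0.5, 1.0, 1.5, 2.0):
    G = nx.fast_gnp_random_graph(n, c / n, seed=42)
    giant = max(nx.connected_components(G), key=len)
    print(f"c = {c:3.1f}: largest component fraction = {len(giant) / n:.3f}")
```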
Study Goals continuation
Each of the projects requires a deeper understanding and simulation of a random graph model (or a process on it). Each project
has a minimal component that is required for passing grade, but beyond that, the students have the freedom to choose which
direction of the project to explore. In particular, they can run simulations, or go deeper into the theory and study some papers, do
calculations themselves on (partly) open questions related to random graph models.
The students have to hand in a report at the end of the course, at least 5 but no longer than 12 pages. The students also present their work at the end of the course in a 10-minute presentation. (Each additional page above 12 results in a 0.25 grade deduction.)
Projects are graded based on content and professional (writing) skills. An empty evaluation form can be found on Brightspace. Presentations will be (partially) peer-graded.
Education Method Random Graphs: One lecture per week, to discuss the theory. We will spend every third or fourth lecture on the exercises.
Exercises are provided to get familiar with the material. Beyond this, students are expected to work in pairs or triplets on their project, write a 10-page report and give a 10-minute presentation (in pairs or triplets).
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
Co-Instructor J. Komjáthy
WI4615 Stochastic Calculus 6
Responsible Instructor Dr. K. Kirchner
Contact Hours / Week 0/0/2/2
Education Period 3, 4
Start Education 3
Exam Period 4
Course Language English
Expected prior knowledge Basics of measure theory and analysis (as given in the course "AM2090 Real Analysis").
Martingales, Brownian motion (as taught in the master course "WI4430 Martingales, Brownian Motion").
Course Contents In this course the following subjects will be studied:
1. Ito integration
2. Ito calculus including Ito's formula
3. Stochastic differential equations
4. Feynman-Kac formula
5. Introduction to numerical methods for stochastic differential equations
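As a minimal illustration of item 5 (not the course's own material; the model and parameters are hypothetical), an Euler-Maruyama discretisation of an Ornstein-Uhlenbeck SDE in Python:

```python
# Illustrative sketch only: Euler-Maruyama discretisation of the SODE
# dX_t = -theta * X_t dt + sigma dW_t (Ornstein-Uhlenbeck process),
# with hypothetical parameters.
import numpy as np

rng = np.random.default_rng(1)
theta, sigma, x0 = 1.0, 0.5, 2.0
T, n_steps = 5.0, 1000
dt = T / n_steps

x = np.empty(n_steps + 1)
x[0] = x0
for k in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))                 # Brownian increment
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * dW  # Euler-Maruyama step

print(f"X_T = {x[-1]:.3f}; long-run standard deviation ~ {sigma / np.sqrt(2 * theta):.3f}")
```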
Study Goals After the course, you should be able to
LO1. Explain the rigorous construction of the stochastic Ito integral.
LO2. Demonstrate the derivation of Ito's formula.
LO3. Calculate explicit stochastic Ito integrals by
a) using the construction of the Ito integral,
b) applying Ito's formula.
LO4. Prove the existence and uniqueness theorem for stochastic ordinary differential equations (SODEs).
LO5. Assess solvability of concrete SODEs by means of
a) verifying all conditions of the existence and uniqueness theorem,
b) explicitly calculating solutions.
LO6. Establish links between Markov processes and partial differential equations.
Education Method Lectures (2 hours per week).
Literature and Study Materials The course will be based on the following sources:
1. P.J.C. Spreij, Stochastic Integration, lecture notes available at https://staff.fnwi.uva.nl/p.j.c.spreij/onderwijs/master/si.pdf
2. Le Gall, Martingales, Brownian motion and stochastic calculus, Springer 2016.
3. Schilling, Partzsch, Brownian Motion: An Introduction to Stochastic Processes, De Gruyter, 2010.
4. Karatzas, Shreve, Brownian Motion and Stochastic Calculus, Springer 1991.
5. Øksendal, Stochastic Differential Equations, Springer 2003.
6. Revuz, Yor, Continuous Martingales and Brownian Motion, Springer 1999.
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
Permitted Materials during Tests The exam is closed book.
WI4630 Statistical Learning 6
Responsible Instructor Dr.ir. G.N.J.C. Bierkens
Contact Hours / Week 0/0/2/2
Education Period 3, 4
Start Education 3
Exam Period 4, 5
Course Language English
Expected prior knowledge Required:
* Bachelor level knowledge and ability in calculus, probability, statistics, optimization, linear algebra.
* Programming skills in Python (AM1090). If you do not have this background, consult the book Think Python,
https://greenteapress.com/wp/think-python/, and familiarize yourself with the Numpy and Matplotlib packages.
Recommended:
* Linear Algebra and Optimization for Machine Learning (WI4635)
* Statistical Inference (WI4455)
* Knowledge of measure theory (as, for example, in AM2090)
Course Contents Statistical learning provides a statistical perspective on machine learning.
The overarching goal of the course is to develop methods for estimating (or `learning') an unknown function from data or making
predictions for unseen function outputs. The course aims to empower the student to make a justified decision in adopting
machine learning approaches for practical problems and even design their own machine learning methodology using sound
mathematical principles. This is achieved by studying key ideas, models, algorithms and theories related to the subject of
statistical learning.
In the first half of the course a collection of essential models, algorithms and techniques is introduced. In the second half of the
course we adopt a Bayesian perspective on machine learning and study associated methods for analysing models and developing
computational methodology.
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
Study Goals By the end of the course, a student should be able to:
1. Explain the challenges of statistical learning (e.g., curse of dimensionality, overfitting, bias-variance trade-off).
2. Implement machine learning algorithms and computational methods on artificial and real-world data sets (e.g., linear regression, logistic regression, sparse regression, neural networks, decision trees); a minimal illustrative sketch follows this list.
3. Compare the theoretical properties and practical aspects of machine learning approaches.
4. Validate machine learning approaches on arbitrary data sets.
5. Design tailor-made computational approaches for new learning problems by adapting, combining and/or extending methods
and/or by employing a Bayesian approach.
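A minimal sketch of the kind of implementation exercise in goal 2, using plain NumPy and synthetic data (the course's own assignments may use different tooling): polynomial ridge regression, where the penalty controls overfitting.

```python
# Illustrative sketch only: polynomial ridge regression on synthetic data,
# showing how the penalty strength trades off over- and under-fitting.
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 30))
y = np.sin(np.pi * x) + rng.normal(0.0, 0.3, x.size)   # noisy synthetic target

degree = 12
X = np.vander(x, degree + 1, increasing=True)          # polynomial features

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X^T X + lam * I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

x_test = np.linspace(-1, 1, 200)
X_test = np.vander(x_test, degree + 1, increasing=True)
y_true = np.sin(np.pi * x_test)

for lam in (1e-8, 1e-2, 1.0):   # near-zero penalty tends to overfit the noise
    w = ridge_fit(X, y, lam)
    mse = np.mean((X_test @ w - y_true) ** 2)
    print(f"lambda = {lam:g}: test MSE = {mse:.3f}")
```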
Education Method Lectures, exercises, practical assignments, practice exam
Literature and Study Materials Lecture notes and slides will be made available.
The main supporting reference is:
Assessment Final grade calculation: (0.3 * practical assignments + 0.7 * final written exam)
Note: For both components, a sufficient grade (5.8) is required.
Repair possibilities:
* Assignments: resubmit deliverable
* Written exam: written resit
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
WI4665 Advanced Topics in Statistics 6
Responsible Instructor Prof.dr. A.W. van der Vaart
Contact Hours / Week 0/0/2/2
Education Period 3, 4
Start Education 3
Exam Period 4, 5
Course Language English
Course Contents The subtitle of the course is Causality and Graphical Models.
A graphical model expresses dependence relations between variables. Each node of a graph represents a variable and each edge or arrow a relationship. Next to this structure there is a quantitative model in the form of a joint probability distribution of all variables. The graph expresses conditional (in)dependencies between the variables, given other variables, in the sense of probability. Graphical models with directed graphs are also called Bayesian networks, as Bayes' rule gives the conditional probabilities. They are the basis of probabilistic expert systems, as applied in many contexts.
In practice a graphical model must be learned from data, guided by prior information, and hence there
is a strong statistical content. The graph helps to form a likelihood, and one can apply various statistical
methods, including maximum likelihood and Bayesian methods, to estimate the graph or parameters.
Because there usually are many variables, statistical methods for high-dimensional data and
models can be necessary. This includes for instance efficiency theory for semiparametric models and
penalized likelihood methods.
A main application considered in this course is the discovery and estimation of causal effects from data.
Cause and effect relationships between variables can be graphically represented as arrows in a
graphical model for the variables. Causal discovery is concerned with inferring such a graph from
observed data, and causal estimation with quantifying the effects. The challenge is to perform this
with data that is not produced by a designed experiment, but observational, a situation that abounds
in epidemiological and social science research. As correlation is not causation, corrections must
be made for other possible explanations that might cause a correlation.
Causal reasoning can alternatively be cast in terms of "potential outcomes": the outcome that would have been realised had a random individual or unit received a treatment of interest. Statistical methods for estimating causal effects can be justified in terms of potential outcomes. A given causal graph allows one to define the relationships of such potential outcomes to other variables, and can be the basis for verifying necessary assumptions on observed data.
The subject of the course is at the intersection of statistics/probability and artificial intelligence.
The style will be mathematical, although only a modest amount of specialised prior knowledge is required, mostly a good grasp of probability and statistical methods, and of course linear algebra
and analysis. We shall provide short introductions to conditioning, asymptotic methods, Bayesian methods, Gaussian models,
etc. when needed.
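As one standard identity behind the causal estimation described above (quoted here as a generic textbook fact, not from the course notes): if a set of covariates Z blocks all back-door paths from X to Y, the interventional distribution is identified from observational data by adjustment.

```latex
% Illustrative example (standard back-door adjustment identity, not quoted
% from the course notes): for a covariate set Z satisfying the back-door
% criterion for the effect of X on Y,
\[
  P\bigl(Y = y \mid \mathrm{do}(X = x)\bigr)
  \;=\; \sum_{z} P\bigl(Y = y \mid X = x,\, Z = z\bigr)\, P(Z = z).
\]
```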
Study Goals Understand statistical graphical models and their applications, in particular in causal inference.
Education Method Lectures.
Homework problems.
Assessment The final grade of the course consists of the following components:
- Written exam 70 %
- Average grade of homework assignments 30 %
Repair possibilities:
- Written exam: repeat exam.
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
Tags Artificial intelligence
Mathematics
Stochastics
Recommended Courses
WI4050 Uncertainty and Sensitivity Analysis 6
Responsible Instructor Dr. D. Kurowicka
Contact Hours / Week 0/0/2/2
Education Period 3, 4
Start Education 3
Exam Period none
Course Language English
Expected prior knowledge Basic Probability and Statistics (containing knowledge about joint distributions)
Course Contents During this course we concentrate on the most challenging part of uncertainty analysis: dependence modelling, in particular on copula models and their applications in finance and insurance. The first part of the course is concerned with the theory of copulas and their use in regression, time series analysis and factor models. The second part is designed to apply these models to financial and insurance problems.
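A minimal illustrative sketch of the copula idea described above (Python with hypothetical parameters; not part of the course material): sample a Gaussian copula, attach arbitrary margins, and check the induced rank correlation.

```python
# Illustrative sketch only: sampling from a bivariate Gaussian copula and
# checking the induced rank correlation. Parameters are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rho = 0.7                                  # correlation of the latent Gaussians
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
u = stats.norm.cdf(z)                      # copula sample: uniform margins on [0, 1]^2

x = stats.expon.ppf(u[:, 0])               # attach an exponential margin
y = stats.lognorm.ppf(u[:, 1], s=1.0)      # attach a lognormal margin

rho_s, _ = stats.spearmanr(x, y)           # rank correlation is preserved by the copula
print(f"Spearman rank correlation: {rho_s:.3f}")
```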
Study Goals Students must be able to explain, using their own examples, concepts such as uncertainty, independence and dependence. They should understand proofs of basic properties of different dependence measures and dependence models. In particular, students should be able to apply concepts such as univariate and multivariate uncertainty distributions and copulas. They should be able to justify which model should be used in a particular situation and explain the advantages and disadvantages of the different constructions from the applied point of view.
Students will apply the theory studied in the first part of the course on an individual project.
Education Method Lectures and Projects
Literature and Study Materials Study material on Brightspace.
Assessment Oral exam after the first part; individual projects (report, presentation) for the second part.
In case of an insufficient result, repair opportunities may be offered in accordance with TER Implementation Regulations Art 5,
sub 5
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
Course Contents Representation of uncertainty as rational preference, subjective probability, utility, Savage's representation theorem, exchangeability, De Finetti's theorem, expert judgment, scoring rules, performance evaluation, applications.
Study Goals 1. Can explain when and why expert judgement is needed.
2. Acquire an in-depth understanding of rational decision theory and the mathematical foundations for the use of expert opinion in science.
3. Be capable of designing an expert elicitation study following the Classical Model setting.
4. Be capable of conducting a real-life structured expert judgment elicitation.
5. Be capable of analyzing the expert data (a minimal pooling sketch follows the workload list below).
6. Be capable of summarizing the findings in a concise and accessible report and presentation for peers.
Expected workload:
1. Attending lectures and following the online material: 28 hours
2. Oral examination: 1 hour
3. Preparing for the oral examination: 36 hours
4. Project: 70 hours
5. Oral presentation of projects: 5 hours
6. Preparing for lectures, homework assignments: 20 hours
7. Reading papers: 8 hours
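A minimal pooling sketch for study goal 5 (purely illustrative; the experts' distributions and weights are hypothetical, whereas the Classical Model derives performance-based weights from calibration questions):

```python
# Illustrative sketch only: a simple linear opinion pool of two experts'
# distributions for an uncertain quantity. Weights are hypothetical; in the
# Classical Model they would come from calibration/information scores.
import numpy as np
from scipy import stats

experts = [stats.norm(loc=10.0, scale=2.0),   # expert 1's assessment (hypothetical)
           stats.norm(loc=14.0, scale=4.0)]   # expert 2's assessment (hypothetical)
weights = [0.6, 0.4]                          # hypothetical weights, summing to 1

x = np.linspace(0.0, 30.0, 2001)
pooled_cdf = sum(w * e.cdf(x) for w, e in zip(weights, experts))

# Read off the pooled 5%, 50% and 95% quantiles from the mixture CDF.
for q in (0.05, 0.50, 0.95):
    print(f"pooled {int(q * 100)}% quantile: {x[np.searchsorted(pooled_cdf, q)]:.2f}")
```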
Literature and Study Materials R.M. Cooke, Experts in Uncertainty, Oxford University Press, 1991.
Assessment The final grade of the course consists of the following components:
- Oral examination after Q3 (40% of the final grade)
- Project and report (40% of the final grade)
- oral presentation after Q4 (10% of the final grade) and short report comparing expert judgment methods in Q4 (10%).
Final grade calculation: (0.4 * oral exam) + (0.4 * project and report) + (0.1 * oral presentation) + (0.1 * short report)
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
WI4156(TU) Game theory 6
Responsible Instructor Dr. R.J. Fokkink
Contact Hours / Week 2/2/0/0
Education Period 1, 2
Start Education 1
Exam Period 2, 3
Course Language English
Required for Open to all MSc students with a strong command of mathematics. Ideally suited for math students with an interest in finance.
Expected prior knowledge Standard BSc math courses: calculus, linear algebra, probability.
Course Contents This is a course in "mathematical" game theory, which is the counterpart of "algorithmic" game theory (agent-based modelling; this part of game theory is taught in the computer science department). We cover all subjects from combinatorial games to cooperative games.
Study Goals Understanding the mathematical theory of rational behaviour/conflicts/negotiations/social choice. Game theory is one of the
most prominent economic theories: three recent Nobel prizes have been awarded to game theorists.
We will cover a very wide area, ranging from pure mathematics to economics, with an emphasis on the mathematical aspects.
This is a course in the Finance Track of the Applied Maths programme, but it is open to all MSc students (math and non-math).
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
Permitted Materials during Tests Standard non-graphical calculator.
Special Information If you are not a mathematics student, feel free to join the course (no need to ask), but be aware that this course is highly formalized and relies on a good mathematical foundation. This is a course of the Master in Applied Mathematics!
WI4614 Stochastic Simulation 6
Responsible Instructor Dr.ir. L.E. Meester
Contact Hours / Week 0/0/2/2 lecture & 0/0/2/2 projects.
Education Period 3, 4
Start Education 3
Exam Period 4, 5
Course Language English
Expected prior knowledge Martingales, Brownian Motion and Stochastic processes (wi4430 or equivalent course), in particular measure theoretic
probability, and basic statistics, including linear regression. Familiarity with R, Matlab, or equivalent software for simulation, is
recommended.
Course Contents Stochastic simulation: algorithms and analysis. The emphasis in this course is not in the first place on doing simulations, but on understanding (the workings of) simulation techniques and how simulation can be used to provide insight into stochastic problems; the most important aspect of the "doing simulations" part is that you can derive the simulation steps/algorithm that results when one of the covered methods is applied to a problem. Since the key to efficient simulation almost always lies with the stochastic specifics of the problem, we focus on the stochastic methods that play a role in this process. Some simulations will be done in the mini-projects, to supplement and illustrate the theory. Here, you are encouraged to "bring your own problem to work on".
Subjects: introduction, general aspects of stochastic simulation; generating random objects, univariate and multivariate random
variables, and stochastic processes; analysis of simulation output: how to obtain estimates for quantities of interest, as well as
confidence intervals; bias and small sample issues; variance reduction methods, especially their stochastic background and optimization;
in addition, a selection of some of the following: Quasi-Monte Carlo, discretization methods, estimating derivatives via
simulation.
Study Goals After completing this course successfully, you will be able to
1. explain the general principles of pseudo random-number generators;
2. mention some of their important properties and how they may be assessed;
3. explain and apply the inverse transform and acceptance-rejection methods (a minimal sketch follows this list);
4. simulate, i.e., generate realizations of, random variables and random vectors;
5. do the same for some stochastic processes, among others Brownian motion and the Brownian bridge;
6. process simulation output into estimates plus standard errors/CIs; both for simple and more complex simulation schemes;
7. explain the principles of a number of variance reduction methods: control variates, antithetic variates, stratified sampling,
importance sampling;
8. and apply them to problems of moderate size and complexity;
9. explain what quasi-Monte Carlo methods are and how the van der Corput and Halton sequences are generated;
10. explain some of the advantages and disadvantages of QMC methods, and what can be done about the disadvantages;
11. explain three methods of estimating parameter sensitivities or derivative estimation: finite-difference approximation,
pathwise derivative estimates, the likelihood ratio method;
12. for each of them, work out the specific details of their application for small problems;
In addition:
1. you will acquire some simulation experience, first-hand, including processing and presenting the results;
2. and second-hand by sharing others' experience;
3. you will practice developing a critical attitude towards presented simulation results,
4. as well as asking the right questions;
5. you will acquire some insight into the question "Where might I use/deploy what (method or trick) from the simulation toolbox to my advantage?"
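A minimal sketch combining goals 3 and 7 (illustrative only; not course material): inverse-transform sampling of an exponential random variable, and a control-variate estimator of E[exp(U)] for U ~ Uniform(0,1), whose exact value is e - 1.

```python
# Illustrative sketch only: inverse-transform sampling and a simple
# control-variate estimator, matching goals 3 and 7 above.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
u = rng.random(n)

# Goal 3: inverse transform. If U ~ Uniform(0,1), then -log(1-U)/rate ~ Exp(rate).
exp_rate = 2.0
exp_sample = -np.log(1.0 - u) / exp_rate
print(f"Exp(2) sample mean (theory: 0.5): {exp_sample.mean():.4f}")

# Goal 7: control variate for E[exp(U)], using U itself (known mean 1/2).
y = np.exp(u)
c = np.cov(y, u, ddof=1)[0, 1] / np.var(u, ddof=1)   # (near-)optimal coefficient
cv_estimate = (y - c * (u - 0.5)).mean()
print(f"crude: {y.mean():.5f}, control variate: {cv_estimate:.5f}, exact: {np.e - 1:.5f}")
```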
Education Method Lectures and a bi-weekly discussion/presentation/problem-solving session; homework assignments; self-study.
Literature and Study Materials Paul Glasserman, Monte Carlo Methods in Financial Engineering, 2003, ISBN 978-0-387-00451-8.
Additional reference: Stochastic Simulation by Søren Asmussen and Peter Glynn, ISBN 978-0-387-69033-9 (somewhat higher level, but also more depth).
Assessment The final grade of the course consists of the following components:
- Mini-projects (25%);
- written exam (closed book; 75%);
the grade for the final must be 6 or higher to pass the course.
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
WI4640 High-dimensional Probability 6
Responsible Instructor Prof.dr. F.H.J. Redig
Contact Hours / Week 0/0/2/2
Education Period 3, 4
Start Education 3
Exam Period 4, 5
Course Language English
Expected prior knowledge Basic analysis and linear algebra
Introduction to Probability and Statistics
Basic measure theory (as taught in the course of real analysis)
Course Contents High dimensional probability is the study of random vectors in R^n where the dimension n is large.
This modern research area is important in various applications such as
-Large random structures: random matrices, random networks.
-Statistics and machine learning
-Statistical physics
-Randomized algorithms
-Mixing times and other phenomena in high-dimensional Markov chains
An important aspect of this theory is non-asymptotic probabilistic bounds (concentration inequalities), i.e., inequalities which are either dimension-free or in which the role of the dimension is explicit.
The course will mainly focus on various concentration inequalities and applications.
Attention will also be given to Markov semigroups and interpolation methods based on them.
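As one concrete example of a dimension-free concentration inequality of the kind referred to above (a standard statement, not quoted from the lecture notes), Hoeffding's inequality:

```latex
% Hoeffding's inequality, a standard example of a dimension-free
% concentration bound (illustrative; not quoted from the course notes).
% For independent X_1, ..., X_n with a_i <= X_i <= b_i and S_n = sum_i X_i:
\[
  \mathbb{P}\bigl( S_n - \mathbb{E}[S_n] \ge t \bigr)
  \;\le\; \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right),
  \qquad t > 0.
\]
```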
Study Goals 1. Getting acquainted with various concentration inequalities and their use in random graphs, randomized algorithms and data science (tested via the homework).
2. Being able to study, and make a report and a presentation on, a research paper or a chapter of a book on theoretical or applied aspects of the theory provided during the course (tested via the presentation).
Education Method Quarter 3: lectures based on the books by Vershynin and Van Handel. Lecture notes are provided.
Quarter 4: self-study of the material for the presentation; assistance is provided in the form of office hours (around the usual lecture time slot).
Literature and Study Materials The basic material of the course will be based on the following two sources:
Besides, various papers and books will be made available for the presentations of the students
(these will appear on the brightspace page of the course).
Assessment The final grade of the course consists of the following components:
- Homework
- presentation (or oral exam).
The weighting is 0.4 homework + 0.6 presentation (or oral exam).
Disclaimer: information may change depending on unforeseen circumstances or measures (see: TER Art 29, sub 4).
Dr.ir. G.N.J.C. Bierkens
Department Statistics
Telephone +31 15 27 84593
Room 36.HB 06.300
Prof.dr.ir. G. Jongbloed
Unit Electrical Engineering, Mathematics & Computer Science
Department Statistics
Telephone +31 15 27 85111
Room 36.HB 06.080
Dr. K. Kirchner
Department Analysis
Telephone +31 15 27 85399
Room 36.HB 08.030
J. Komjáthy
Department Applied Probability
Dr. D. Kurowicka
Unit Electrical Engineering, Mathematics & Computer Science
Department Applied Probability
Telephone +31 15 27 85756
Room 36.HB 07.280
G.F. Nane
Unit Electrical Engineering, Mathematics & Computer Science
Department Applied Probability
Telephone +31 15 27 84563
Room 36.HB 07.080
Dr. F. Yu
Department Applied Probability
Telephone +31 15 27 85901