Objective: The Bachelier courses are doctoral training courses, also open to researchers and practitioners wishing to follow a high-level specialized course in the field of mathematical finance.
Schedule and venue: The courses take place at the Institut Henri Poincaré from 9:00 to 11:00, according to the schedule indicated under Programme, before the Bachelier seminar.
Fridays 08/11, 15/11 and 29/11
Abstract of Mini-Course 1: The search theory of OTC Markets
Some of the largest, and most important, financial markets do not organize trade via a centralized exchange running continuous all-to-all auctions, but rather operate through a decentralized or over-the-counter (OTC) structure that entails uncertain terms of trade and execution times. For example, the market value of U.S. assets traded OTC in 2018—which include treasury and corporate bonds, as well as interest rate swaps, repurchase agreements, and Federal Funds—amounted to $50 trillion, close to 1.5 times the corresponding figure for centrally traded securities.
Understanding how these markets work and adapt to new regulation has become increasingly important, as illiquidity in certain OTC markets has appeared among the first signs of trouble—if not the cause—of the last two financial crises. In this course, I will take students on a guided tour of the search and bargaining approach to the modelling of OTC markets that was pioneered by Duffie, Garleanu, and Pedersen (2005, 2007). In particular, I plan to cover equilibrium models of semi-centralized markets where trade is intermediated by dealers, equilibrium models of purely decentralized markets where trade only occurs in bilateral meetings, and extensions of these frameworks that allow for asymmetric information and/or alternative price-setting mechanisms such as request-for-quotes and directed search.
The course is based on the forthcoming textbook The economics of over-the-counter markets (2025), co-authored with Ben Lester (Federal Reserve Bank of Philadelphia) and Pierre-Olivier Weill (UCLA). It can be followed by any student with a working knowledge of Poisson processes and continuous-time stochastic optimal control.
Detailed contents:
Lecture 1. (November 8, 2024)
Lecture 2. (November 15, 2024)
Lecture 3. (November 29, 2024)
Fridays 12/01, 26/01, 02/02, 09/02 and 07/06
Abstract of Mini-Course 1: Signature methods in finance
Path signatures were introduced by K. T. Chen in the 1950s for smooth paths and later extended to rough paths by T. Lyons in the 1990s -- in fact, they form the basic building block of Lyons' theory of rough paths. Signatures are a convenient and efficient way to encode paths, i.e., functions from, say, [0,T] to R^d. Indeed, 1. a path's signature essentially characterizes the underlying path, and 2. the set of linear functionals of the signature forms a sub-algebra of the continuous functions of paths. As a consequence, (linear or non-linear functionals of) signatures form a natural set of basis functions for the approximation of functions on path space, similar to the role of polynomials on finite-dimensional Euclidean space. In particular, they are universal approximators. In this minicourse, we will provide a gentle introduction to signatures and hint at their role in rough path theory. After that, we will study the signature's role in statistical learning applications for time series, in particular introducing the signature kernel. Finally, we will show how standard numerical algorithms for solving optimal stopping problems when the underlying process has the Markov property can be extended to the non-Markovian case using the signature.
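As a small, self-contained illustration of the encoding described above (the code and function names are ours, not course material), the depth-2 signature of a piecewise-linear path in R^d can be assembled segment by segment with Chen's identity; the shuffle relation S^2_{ij} + S^2_{ji} = S^1_i S^1_j, a special case of point 2, can then be checked numerically:

```python
import numpy as np

def segment_sig(dx):
    """Depth-2 signature of a single linear segment with increment dx:
    level 1 is dx, level 2 is the tensor dx (x) dx / 2."""
    return dx.copy(), np.outer(dx, dx) / 2.0

def path_signature(path):
    """Depth-2 signature of a piecewise-linear path (n_points, d),
    assembled with Chen's identity for concatenation a * b:
        S(a*b)^1 = S(a)^1 + S(b)^1
        S(a*b)^2 = S(a)^2 + S(b)^2 + S(a)^1 (x) S(b)^1."""
    d = path.shape[1]
    s1, s2 = np.zeros(d), np.zeros((d, d))
    for k in range(len(path) - 1):
        b1, b2 = segment_sig(path[k + 1] - path[k])
        s2 = s2 + b2 + np.outer(s1, b1)   # Chen's identity, level 2
        s1 = s1 + b1                      # Chen's identity, level 1
    return s1, s2
```

For the L-shaped path (0,0) -> (1,0) -> (1,1), the antisymmetric part of the level-2 term recovers the Lévy area, and the symmetric part satisfies the shuffle relation exactly.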
Fridays 22/03 and 29/03
Abstract of Mini-Course 2: Signature methods in finance
Signature methods provide a non-parametric way of extracting characteristic features from time-series data, which is essential in machine learning tasks. This explains why these techniques have become increasingly popular in econometrics and mathematical finance. Indeed, signature-based approaches allow for data-driven, and thus more robust, model selection mechanisms, while first principles such as absence of arbitrage can still be easily guaranteed. In this course the main focus lies on the use of the signature as a universal linear regression basis for certain continuous functionals of paths in financial applications. In the applications we have in mind, one key quantity is the expected signature of some underlying stochastic process, which has to be computed efficiently. Surprisingly, this can be achieved for generic classes of diffusions, called signature-SDEs (with possibly path-dependent characteristics), via techniques from affine and polynomial processes. More precisely, we show how the signature process of these diffusions can be embedded in the framework of affine and polynomial processes. Following this line, we show that the infinite-dimensional Feynman-Kac PDE of the signature process can generically be reduced to an infinite-dimensional ODE, either of Riccati or linear type. In terms of concrete financial applications, we shall treat two main topics: stochastic portfolio theory and signature-based asset price models. In the context of stochastic portfolio theory we introduce a novel class of portfolios which we call linear path-functional portfolios. These are portfolios determined by certain transformations of linear functions of a collection of feature maps that are non-anticipative path functionals of an underlying semimartingale. As the main example of such feature maps we consider the signature of the (ranked) market weights.
Relying on the universal approximation theorem, we show that every continuous (possibly path-dependent) portfolio function of the market weights can be uniformly approximated by signature portfolios. Besides these universality features, the main numerical advantage lies in the fact that several optimization tasks, such as maximizing expected logarithmic utility or mean-variance optimization within the class of linear path-functional portfolios, reduce to convex quadratic optimization problems, making them computationally highly tractable. We apply our method to real market data, indicating out-performance on the considered out-of-sample data even under transaction costs. In terms of asset price models, we consider a stochastic volatility model where the dynamics of the volatility are described by a linear function of the signature of a primary process, which is supposed to be some multidimensional continuous semimartingale. Under the additional assumption that this primary process is of polynomial type, we can express both the log-price and the squared VIX as linear functions of the signature of the appropriately augmented process. This feature can then be efficiently used for pricing and calibration purposes. Indeed, as the signature samples can easily be precomputed, the calibration task can be split into an offline sampling step and a standard optimization. For both SPX and VIX options we obtain highly accurate calibration results, showing that this model class allows one to solve the joint calibration problem without adding jumps or rough volatility. In the final part of the course we rely on the approximation properties of the signature to derive a (rough) functional Itô formula for non-anticipative path functionals. This leads to a functional extension of the classical Itô formula for rough paths which coincides with the functional change-of-variable formula of Cont and Fournié (2010), whenever the notions of integration coincide.
As a byproduct, we show that sufficiently regular non-anticipative path functionals admit a functional Taylor expansion, leading to an extension of the recently established results of Dupire and Tissot-Daguette (2022). The course is based on several joint works with Guido Gazzani, Xin Guo, Janka Möller, Francesca Primavera, Sara Svaluto-Ferro and Josef Teichmann.
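As a hedged sketch of the "universal linear regression basis" idea (a toy example of ours, with simulated data, not a result from the course), one can regress a path-dependent functional — here the running maximum of a random walk — on depth-2 signature features of the time-augmented path by ordinary least squares:

```python
import numpy as np

def sig2(path):
    """Depth-2 signature of a piecewise-linear path, assembled via
    Chen's identity, returned as a flat feature vector (d + d^2 entries)."""
    d = path.shape[1]
    s1, s2 = np.zeros(d), np.zeros((d, d))
    for k in range(len(path) - 1):
        dx = path[k + 1] - path[k]
        s2 += np.outer(dx, dx) / 2.0 + np.outer(s1, dx)
        s1 += dx
    return np.concatenate([s1, s2.ravel()])

rng = np.random.default_rng(0)
n_paths, n_steps = 200, 20
t = np.linspace(0.0, 1.0, n_steps + 1)

X, y = [], []
for _ in range(n_paths):
    w = np.concatenate([[0.0], np.cumsum(rng.normal(0, t[1] ** 0.5, n_steps))])
    path = np.column_stack([t, w])   # time-augmented path (t, W_t)
    X.append(sig2(path))
    y.append(w.max())                # a path-dependent target functional
X = np.column_stack([np.ones(n_paths), np.array(X)])   # intercept + 6 features
y = np.array(y)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
sse_sig = ((X @ beta - y) ** 2).sum()
sse_const = ((y - y.mean()) ** 2).sum()   # intercept-only benchmark
```

Including the constant column guarantees that the signature fit is at least as good, in sample, as the best constant predictor; higher truncation levels tighten the approximation, as the universal approximation theorem suggests.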
Fridays 26/04, 03/05 and 17/05
Abstract of Mini-Course 3: Functional convex ordering of stochastic processes: a constructive approach with applications to Finance
Convex order between two integrable random vectors U and V with values in R^d is defined by E[f(U)] <= E[f(V)] for every convex function f: R^d --> R (with some variants, like monotonic convex order in dimension d=1, where f is also assumed monotonic). After recalling the deep connections of such an ordering with martingality through the Kellerer and Strassen theorems and "p.c.o.c." (processus croissants pour l'ordre convexe, a.k.a. "peacocks", following the terminology introduced by Yor et al.), we will present some first applications to Finance (sensitivity to volatility of options with convex payoffs, risk measures, etc.). Then we will come to the core of the course, which is to introduce functional convex order, i.e. to extend the above definition, when dealing with stochastic processes (X_t), to (convex) functionals F((X_t)_t) of the whole trajectory. The comparison is also based on functional "hyper-parameters" (typically the diffusion coefficient \sigma in the case of martingale or scaled Brownian diffusions). We will investigate various classes of stochastic processes, first Brownian diffusions and then other classes of processes such as diffusions with jumps, diffusions of McKean-Vlasov type, but also non-Markovian processes, such as the solutions of Volterra equations with (possibly) singular kernels like those appearing in rough volatility modeling in Finance. On our way we will revisit, unify and extend or discuss former results from the literature, like Hajek's classical theorems for the monotone marginal convex ordering of one-dimensional diffusions, or more recent contributions by Rüschendorf and coauthors, Hobson, Schied and Stadje, among others. A typical result, for scalar martingale Brownian diffusions with respective non-negative diffusion coefficients \sigma and \theta, reads: if \sigma or \theta is convex and \sigma <= \theta, then E[F((X^{\sigma}_t)_t)] <= E[F((X^{\theta}_t)_t)] for every lower semi-continuous convex functional F.
Such results make it possible to compare/bound European option prices in different models in Finance, in the spirit of the seminal paper by El Karoui-Jeanblanc-Shreve. Similar results hold for nonlinear problems such as American options, by adapting our approach to optimal stopping theory, stochastic control (and swing option pricing in energy markets), or to mean-field games when dealing with McKean-Vlasov equations. In the first two cases we rely on a Backward Dynamic Programming Principle. We will highlight how convex ordering is closely related to a problem of (at least) equivalent importance for applications: the propagation of convexity, which can be summed up as the fact that, if a functional F is convex, then x --> E[F((X^x_t)_t)] is also convex in the starting value x (under appropriate assumptions). We will systematically establish both our comparison and propagation results starting from a discrete-time approximation procedure of Euler scheme type, generally simulable, and conclude by appropriate strong or weak functional limit theorems à la Jacod-Shiryaev. Among other virtues, this approach makes it possible in Finance to ensure that the prices of derivative products computed by simulation cannot give rise to convexity arbitrages, since our approximations share the same convexity properties. Time permitting, we will focus on one-dimensional Brownian diffusions, where the convexity assumption on (one of) the diffusion coefficients can be relaxed and the class of admissible functionals extended to directionally convex functionals. A bibliography will be appended to the slides.
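The scalar comparison result above can be illustrated with a minimal Monte Carlo experiment (our toy example, not from the course): take X^sigma_T = sigma W_T and X^theta_T = theta W_T with constant, hence convex, coefficients sigma <= theta, and a convex call payoff. With common random numbers, the ordering of the two prices even holds pathwise in this simple case:

```python
import numpy as np

rng = np.random.default_rng(42)
z = rng.standard_normal(100_000)        # common random numbers for both models
T, K = 1.0, 0.5
f = lambda x: np.maximum(x - K, 0.0)    # convex (call) payoff

sigma, theta = 0.2, 0.4                 # sigma <= theta, both constant hence convex
price_sigma = f(sigma * np.sqrt(T) * z).mean()   # E[f(X^sigma_T)]
price_theta = f(theta * np.sqrt(T) * z).mean()   # E[f(X^theta_T)]
```

Here f(sigma*sqrt(T)*z) <= f(theta*sqrt(T)*z) for every draw z, so the Monte Carlo estimates inherit the convex-order inequality exactly, not just up to sampling error.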
Fridays 14/04 and 21/04
Abstract of Mini-Course 1: From Diamonds to Signatures (14 April 2023, 9h00-11h00):
Hairer's perturbative approach to the KPZ equation (with mollified noise), known as Wild's expansion, has a non-Markovian pendant that relates non-trivially to the so-called diamond expansions of Alos-Gatheral-Radoičić (20) and the Lacoin-Rhodes-Vargas cumulant formula (22). We will provide a common generalization of these works, clarify their relation through forest reordering, and give an application to rough forward variance models. (Based on F-Gatheral-Radoičić, "Forests, cumulants, martingales." Ann. Probab. 2022.) The second part is devoted to signatures, a beautiful piece of mathematics that has gained much recent attention in finance (Buehler, Cuchiero, Dupire, Lyons, Teichmann ...). More specifically, I will present a non-commutative generalization of the aforementioned expansions, based on a type of infinite-dimensional Riccati equation. This allows one to compute recursively the logarithm of the expected signature, a.k.a. the signature cumulants, in general semimartingale models. (Based on F-Hager-Tapia, "Unified signature cumulants and generalized Magnus expansions." Forum Math. Sigma 2022.) Time permitting, I will also report on related ongoing work with A. Seigal (Harvard) and T. Lyons (Oxford) on the non-polynomial nature of such expansions.
Abstract of Mini-Course 2: Rough Path Pricing in Local Stochastic Volatility Models (21 April 2023, 9h00-11h00):
LSV models, widely used in the industry, are specified by their (possibly non-Markovian) SV process together with a leverage function. We describe partially conditioned dynamics via an explicit (Markovian) rough stochastic differential equation. The conditional pricing problem can then be treated by rough PDE methods. Upon suitable randomization of the driving rough paths and subsequent averaging, we return to the original pricing problem. This can be seen as an extension/refinement of Romano-Touzi (97) and of a recent SPDE pricing method for rough volatility by Bayer et al. Some rough idea about rough paths will be helpful, but we will introduce in the lecture everything we need. Ongoing joint work with P. Bank, C. Bayer and L. Pelizzari (all Berlin).
Tuesdays at the Jussieu campus, room 104, corridor 15-25.
Introduction to Entropic Optimal Transport
| | 19 October | 26 October | 23 November | 30 November |
|---|---|---|---|---|
| 9h00-10h00 | Marcel Nutz C1 | Marcel Nutz C3 | Marcel Nutz C5 | Marcel Nutz C7 |
| 10h15-10h55 | Pierre Bras | Katarina Eichinger | Roberta Flengh | Mehdi Talbi |
| 10h55-11h35 | Mohan Yang | Geoffrey Derchu | Songbo Wang | Giacomo Greco |
| 11h50-12h50 | Marcel Nutz C2 | Marcel Nutz C4 | Marcel Nutz C6 | Marcel Nutz C8 |
| 13h00-14h30 | Lunch | Lunch | Lunch | Lunch |
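For orientation, the computational core of entropic optimal transport is Sinkhorn's alternating scaling iteration; the following minimal sketch (our illustration, with a toy two-point example, not course material) computes the entropic coupling between two discrete marginals:

```python
import numpy as np

def sinkhorn(mu, nu, C, eps, n_iter=500):
    """Entropic OT: regularize min <P, C> over couplings P of (mu, nu) by an
    entropy penalty with strength eps. The optimizer has the scaling form
    P = diag(u) K diag(v) with K = exp(-C/eps); Sinkhorn's algorithm finds
    u, v by alternately matching the row and column marginals."""
    K = np.exp(-C / eps)
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)   # match column marginals
        u = mu / (K @ v)     # match row marginals
    return u[:, None] * K * v[None, :]   # entropic optimal coupling

mu = np.array([0.5, 0.5])
nu = np.array([0.25, 0.75])
C = np.array([[0.0, 1.0], [1.0, 0.0]])   # cost of moving mass between points
P = sinkhorn(mu, nu, C, eps=0.5)
```

After the final row update the row marginals of P match mu exactly, and the column marginals converge to nu geometrically fast (Hilbert-metric contraction), which is what makes the scheme so attractive in practice.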
Fridays 01/04, 08/04 (morning and afternoon) and 15/04
Recent developments in interest rate modelling
Introduction:
In the last 15 years the fixed-income markets have witnessed some profound changes. The credit crisis in 2007–2008 and the Eurozone sovereign debt crisis in 2009–2012 had a strong impact on these markets: the emergence of multiple interest rate curves depending on the tenor and the presence of persistently low and negative interest rates have irreversibly changed the way the fixed-income markets function in practice, as well as the way their theoretical models were developed. Moreover, a few years ago a major reform was undertaken, and is still ongoing, with the goal of providing more robust and more reliable benchmark rates to replace the LIBOR rates as reference rates in financial transactions.
Developing pertinent and up-to-date models for the dynamics of the term structure of interest rates not only presents mathematical challenges, due to the complexity and high dimensionality of the problem, but is also of utmost importance for the financial industry. Fixed-income instruments represent by far the largest portion of the global financial market: according to statistics provided by the Bank for International Settlements, the notional amounts outstanding for over-the-counter (OTC) interest rate derivatives sum up to several hundred trillion US dollars, more than three quarters of the total trade volume in OTC derivatives. The goal of this course is to provide an overview of the state-of-the-art techniques used in modern interest rate modeling, addressing the above-mentioned challenges.
Outline of the course:
Lecture notes: Cours 01/04 ⬇
Abstract of the lecture by Fabio Mercurio (8 April 2022, 15.15 - 17.15):
Interbank offered rates (IBOR) were, and still are, the key reference rates in many financial products, with a total market exposure worldwide of hundreds of trillions of US dollars. Because of the inverted pyramid effect, as well as the reported cases of LIBOR manipulation, global authorities a few years ago prompted the creation of new, more robust risk-free rate (RFR) benchmarks to be used in the majority of financial transactions. In the first part of this seminar, we'll review some of the main decisions and actions taken by regulators and market participants to transition away from LIBOR.
In the second part of the seminar, we'll define and model forward risk-free term rates, which appear in the payoff definition of derivatives and cash instruments based on the new RFR benchmarks. We show that the classic interest-rate modeling framework can be naturally extended to describe the evolution of both the forward-looking (IBOR-like) and backward-looking (setting-in-arrears) term rates using the same stochastic process. We then introduce an extension of the LIBOR Market Model (LMM) to backward-looking rates. This extension, which we call the generalized forward market model (FMM), completes the LMM by providing additional information about the rate dynamics between fixing/payment times, and by implying dynamics of forward rates under the classic money-market measure.
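To fix ideas on backward-looking (setting-in-arrears) term rates, the following sketch implements the standard daily-compounding convention (an illustration of ours: the ACT/360 day count and the one-day accrual periods are simplifying assumptions, not a statement of any particular benchmark's exact rules):

```python
def compounded_rate(daily_rates, day_counts, year_fraction=None):
    """Backward-looking term rate: compound the realized daily RFR fixings
    over the accrual period and annualize,
        R = ( prod_i (1 + r_i * d_i / 360) - 1 ) / delta,
    where d_i is the number of days each fixing applies and delta is the
    year fraction of the whole period (ACT/360 here, for illustration)."""
    if year_fraction is None:
        year_fraction = sum(day_counts) / 360.0
    growth = 1.0
    for r, d in zip(daily_rates, day_counts):
        growth *= 1.0 + r * d / 360.0
    return (growth - 1.0) / year_fraction

# constant 3% fixing over a 90-day period, one-day accruals
R = compounded_rate([0.03] * 90, [1] * 90)
```

Unlike a forward-looking IBOR-type rate, R is only known at the end of the accrual period, which is exactly the feature the FMM extension above is designed to capture; compounding also makes R slightly exceed the flat 3% fixing.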
Fridays 03/06, 10/06, 17/06 and 24/06
Quantum Computing for Mathematical Finance
Introduction :
Quantitative Finance is a rapidly changing environment, and the financial industry is always on the lookout for new techniques and new technologies able to harness the rise of big data and the availability of computing power.
Quantum computing, though not a recent field, has gained huge popularity in the past few years with the development of small-scale quantum computers and quantum annealers. These have in turn set directions for new algorithms, hybrid between classical and quantum, and tailored for such computers. The financial industry is now looking at such developments and there is common agreement that this will be one of the leading advances in the coming decade.
The goal of this Bachelier course is to provide an introduction to this new technology and these new algorithms, and to show how they can be used to solve financial problems, in particular for portfolio optimisation, in the context of machine learning and neural networks, and for PDE solving.
Plan: The course will tentatively proceed along the following steps:
Each lecture will focus on the theoretical aspects of the problem and the related algorithms and will further show some numerical examples in Jupyter notebooks.
Fridays 10/01, 24/01, 07/02 and 14/02
Economics of climate change and Green Finance
Fridays 28/02, 06/03, 13/03 and 20/03
Longevity
Part 1: Point processes in random environment and application to the study of longevity risk, Sarah Kaakai (LMM, Université du Mans).
Point processes have received much attention in recent years. Indeed, their flexibility allows for the modeling of a wide range of phenomena in various fields, including for instance biology, finance, insurance, population dynamics and neuroscience.
In this course, I will first present a general overview of representations of point processes with stochastic intensity, with a particular focus on pathwise representations as solutions of stochastic differential equations with random coefficients, driven by Poisson random measures. This viewpoint is particularly well-suited to the study of interacting events occurring in a random environment, and of non-Markovian systems. I will show how strong comparison results can be derived from pathwise representations, as well as straightforward tightness results.
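A minimal illustration of such a pathwise construction (our sketch, not course material) is simulation by thinning: candidate times are drawn from a dominating Poisson process of rate lam_max and accepted with probability intensity/lam_max, which is precisely a pathwise solution driven by a Poisson random measure on [0, T] x [0, lam_max]:

```python
import numpy as np

def simulate_thinning(intensity, lam_max, T, rng):
    """Simulate a point process whose stochastic intensity depends on the
    event history, by thinning a dominating rate-lam_max Poisson process:
    a candidate at time t is kept with probability intensity(t, history) / lam_max.
    Requires intensity <= lam_max along every path."""
    events, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)          # next candidate time
        if t > T:
            break
        if rng.uniform(0.0, lam_max) < intensity(t, events):
            events.append(t)                          # accepted event
    return events

# illustrative self-damping intensity: each past event halves the rate,
# so the intensity is bounded by lam_max = 2
lam = lambda t, history: 2.0 * 0.5 ** len(history)
rng = np.random.default_rng(1)
ev = simulate_thinning(lam, lam_max=2.0, T=10.0, rng=rng)
```

The same acceptance region {mark < intensity} is what appears under the Poisson-random-measure integral in the SDE representation, so the simulation scheme and the pathwise representation are two readings of one formula.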
In a second part, I will apply these results in order to introduce a new class of heterogeneous population dynamics in random environment. Individuals (or companies) can enter or exit the population, as well as change characteristics, at stochastic rates depending on the whole population and on the random environment. I will show on this model how averaging results can be obtained for point processes in random environment, here when changes of characteristics occur on a faster timescale than entries into and exits from the population. In particular, I will give a brief overview of stable convergence, a powerful tool naturally extending convergence in distribution in the presence of a random environment.
Finally, I will illustrate how such averaging results allow us to generate more realistic mortality models than standard demographic tools based on linear models, reflecting the heterogeneity of the underlying population and taking into account the macro environment.
Part 2: Longevity risk and quickest detection problem: from theory to practice, Nicole El Karoui (LPSM, Sorbonne Université) and Stéphane Loisel (ISFA, Université Lyon 1).
After briefly recalling key features of longevity risk, we explain how to detect as quickly as possible the date at which the actuarial assumptions related to longevity risk are no longer valid.
The problem is cast as a quickest detection problem. We introduce the so-called CUSUM process and show its optimality for a generalized Lorden criterion. We then explain how to design Key Risk Indicators based on the CUSUM process. We analyze its advantages and drawbacks for longevity risk monitoring, as well as for some other insurance risks. The method is illustrated on simulated and real-world case studies.
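For concreteness, the CUSUM recursion in its simplest Gaussian mean-shift form can be sketched as follows (an illustration of ours, not the course's actuarial setting):

```python
import numpy as np

def cusum(x, mu0, mu1, sigma):
    """CUSUM statistic for detecting an upward shift in the mean of Gaussian
    observations from mu0 to mu1: S_n = max(0, S_{n-1} + l_n), where l_n is
    the log-likelihood ratio of observation n. An alarm is raised the first
    time S_n exceeds a threshold trading off the false-alarm rate against
    the detection delay (Lorden-type criteria)."""
    llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2.0)
    s, path = 0.0, []
    for l in llr:
        s = max(0.0, s + l)
        path.append(s)
    return np.array(path)

# five observations at the old mean 0, then five at the new mean 1:
# the statistic stays at 0 before the shift, then climbs by 0.5 per step
S = cusum(np.array([0.0] * 5 + [1.0] * 5), mu0=0.0, mu1=1.0, sigma=1.0)
```

The reflection at 0 is what makes the statistic forget old data and react quickly once the regime actually changes — the property exploited when turning it into a Key Risk Indicator.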
Lecture notes: Cours 28/02 ⬇, Cours 06/03 ⬇, Cours 13/03 ⬇
Cancelled
Fridays 08/02 and 15/02
Rough volatility
Fridays 29/03, 05/04, 12/04 and 19/04
Auctions in the Energy Sector: An Introduction and Survey
OUTLINE:
Fridays 07/06 (G. Peyré) and 21/06 (M. Cuturi)
Optimal Transport & Machine Learning
Fridays 26/01, 02/02, 09/02 and 16/02
XVA Analysis
Lecture notes: Part 1, Part 2.
ABSTRACT: Since the crisis, derivative dealers charge their clients various add-ons, dubbed X-valuation adjustments (XVAs), meant to account for counterparty risk and its capital and funding implications.
XVAs deeply affect the derivative pricing task by making it global, nonlinear, and entity-dependent. However, before the technical implications, the fundamental points are to understand what deserves to be priced and what does not, and to establish not only the pricing, but also the corresponding collateralization, dividend, and accounting policy of a bank.
If banks cannot replicate jump-to-default related cash flows, deals trigger wealth transfers from bank shareholders to creditors, and shareholders need to set capital at risk. On this basis, we devise a theory of XVAs, whereby so-called contra-liabilities and the cost of capital are sourced from bank clients at trade inception, on top of the fair valuation of counterparty risk, in order to compensate shareholders for the wealth transfer and the risk on their capital.
The resulting all-inclusive XVA add-on, to be sourced from clients incrementally at every new deal, reads (CVA + FVA + MVA + KVA), where C stands for credit, F for funding, M for (initial) margin, and where the KVA is a cost-of-capital risk premium. This formula corresponds to the cost of the possibility for the bank to go into run-off, while staying in line with shareholder interest, from any point in time onward if wished.
Moreover, economic capital (EC) can be used as a funding source by banks, at a risk-free cost instead of the additional credit spread of the bank in the case of unsecured borrowing. This intertwining of EC and FVA leads to an anticipated BSDE (backward stochastic differential equation of the McKean type) for the FVA, with a coefficient entailing a conditional risk measure of the one-year-ahead increment of the martingale part of the FVA itself.
Our XVA equations are solved by projection on a reduced filtration myopic to the default of the bank, the latter being assumed to be an invariance time as per Crépey and Song (2017). This assumption, which covers mainstream immersion setups (but not only), expresses the consistency of valuation across different trading desks with different focuses within the bank: the XVA desks versus the different business desks.
Finally, we present a nested Monte Carlo approach to XVA computations, implemented on graphics processing units (GPUs). The overall XVA suite involves five compound layers of nested dependence. Higher layers are launched first and trigger nested simulations on the fly whenever required, in order to compute a metric from a lower layer. With GPUs, error-controlled nested Monte Carlo XVA computations are within reach. This is illustrated on XVA computations involving equity, interest rate, and credit derivatives, for both bilateral and central clearing XVA metrics.
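The layered structure can be caricatured in a few lines (a two-layer toy Gaussian model of ours, far from a real XVA suite): an outer simulation of the time-t state launches inner simulations on the fly to estimate a conditional value, whose positive part is then averaged:

```python
import numpy as np

def nested_epe(n_outer=4000, n_inner=200, seed=3):
    """Two-layer nested Monte Carlo for an expected positive exposure
    E[ max(E[V_T | F_t], 0) ] in a toy Gaussian model: the outer layer
    simulates the time-t state X ~ N(0,1); whenever the metric needs the
    conditional value, an inner simulation of V_T = X + N(0,1) is
    launched on the fly, mimicking the layered XVA computations."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_outer):
        x = rng.standard_normal()                          # outer state at t
        v_t = (x + rng.standard_normal(n_inner)).mean()    # inner estimate of E[V_T | X=x]
        total += max(v_t, 0.0)                             # positive exposure
    return total / n_outer

epe = nested_epe()   # true value here is E[X^+] = 1/sqrt(2*pi), about 0.4
```

Because the positive part is convex, the inner-simulation noise biases the estimate upward (Jensen), which is one reason the error control emphasized above is essential in multi-layer settings.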
Course material: Related papers on https://math.maths.univ-evry.fr/crepey/.
Fridays 23/03 and 30/03
Rank-based mean-field interacting diffusions
ABSTRACT: Diffusive particles interacting through their ranks can be used to model market capitalizations in Fernholz's stochastic portfolio theory. We will focus on the particular case where the interaction is moreover of mean-field type: the drift and diffusion coefficients of each coordinate (or particle) depend on the empirical distribution function of the whole particle system evaluated at that coordinate. We will first study the mean-field limit, where the number of coordinates tends to infinity. The empirical distribution function then converges to the distribution function of the marginal law of the limiting stochastic differential equation, which is nonlinear in the sense of McKean. We will then investigate the long-time behavior of this nonlinear diffusion, exploiting the fact that the order statistics of the particles form a diffusion with constant coefficients, normally reflected at the boundary of the simplex.
We will interpret this long-time limit in terms of portfolio theory. Finally, we will show that the small-noise limit of the particle system is given by the sticky particle dynamics, and we will study the limit of these dynamics as the number of particles tends to infinity.
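A basic Euler scheme makes the rank-based mean-field interaction concrete (our illustrative sketch; the coefficient choices are arbitrary): each particle's drift is a function of the empirical distribution function of the system evaluated at that particle, i.e. of its rank:

```python
import numpy as np

def rank_based_euler(n=100, n_steps=200, T=1.0, seed=5):
    """Euler scheme for rank-based mean-field particles: the drift and
    diffusion coefficient of particle i are functions b, s of the empirical
    distribution function of the system evaluated at X_i, that is, of the
    particle's normalized rank. Illustrative choice: low-ranked particles
    drift up, high-ranked ones drift down, with unit diffusion."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    b = lambda u: 1.0 - 2.0 * u   # drift as a function of the rank in [0, 1]
    s = lambda u: 1.0             # constant diffusion coefficient
    x = rng.standard_normal(n)
    for _ in range(n_steps):
        ranks = np.argsort(np.argsort(x)) / (n - 1.0)   # empirical CDF at each particle
        x = x + b(ranks) * dt + s(ranks) * np.sqrt(dt) * rng.standard_normal(n)
    return x

x = rank_based_euler()
```

With this choice the rank-dependent drifts sum to zero at every step, so the system's center of mass is driven by noise alone, while the mean-reverting pull of the ranks keeps the cloud of particles together — a caricature of the stabilization mechanism behind the long-time results above.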
Fridays 06/04 and 13/04
Control of McKean-Vlasov equations
ABSTRACT: This lecture is concerned with the optimal control of McKean-Vlasov equations, which has seen a surge of interest since the emergence of mean-field game theory. Such problems are originally motivated by large-population stochastic control under mean-field interaction, and find various applications in economics, finance, and the social sciences for modelling the motion of socially interacting individuals and herd behavior. They are also relevant for dealing with intermittence questions arising typically in risk management.
In the first part, I focus on the important class of linear-quadratic McKean-Vlasov (LQMcKV) control problems, which provides a major source of examples and applications. We present a direct and elementary method for solving LQMcKV problems explicitly, based on a mean version of the well-known martingale optimality principle in optimal control and on the completion-of-squares technique. Variations and extensions to the case of infinite horizon, random coefficients and common noise are also addressed. Finally, we illustrate our results with an application to a model of interaction between centralised and distributed generation.
The second part is devoted to the presentation of the dynamic programming approach (in other words, the time-consistency approach) for the control of general McKean-Vlasov dynamics. In particular, we introduce the recent mathematical tools that have been developed in this context: differentiability in the Wasserstein space of probability measures, the Itô formula along a flow of probability measures, and the Master Bellman equation. Extensions to stochastic differential games of McKean-Vlasov type are also discussed.
Fridays 18/05, 25/05, 01/06 and 15/06 (10h-11h)
Game theory: basic tools and applications to wireless networks and power grids
Fridays 18/11, 25/11 and 2/12
Financial Intermediation at Any Scale for Quantitative Modelling
ABSTRACT: During this series of lectures, we will go from the role of the financial system, described as a large network of intermediaries, down to a fine description of high-frequency market makers. The role of regulation in the recent transformations of participants' practices will also be discussed. The viewpoint taken is that of a practitioner or researcher who has to put models in place. Existing models will be reviewed, and new challenges and the stakes of possible improvements will be discussed. Important stylized facts and mechanisms that models should reproduce will be presented.
Outline:
MAIN REFERENCES:
ABOUT THE AUTHOR:
Charles-Albert Lehalle is Senior Research Advisor at Capital Fund Management (CFM, Paris) and a member of the CFM-Imperial Institute of Quantitative Finance. He was formerly Global Head of Quantitative Research at Crédit Agricole Cheuvreux, and Global Head of Quantitative Research on Market Microstructure in the Equity Brokerage and Derivative Department of Crédit Agricole Corporate Investment Bank.
With a Ph.D. and an HDR in applied mathematics, Charles-Albert Lehalle lectures in the Université Pierre et Marie Curie "Probability and Finance" and MASEF/ENSAE Masters in Paris.
Since the financial crisis, Charles-Albert has studied market microstructure evolution and regulatory changes in Europe and the US, and has provided research and expertise on these topics to investors, intermediaries and policy-makers such as the European Commission, the French Senate and the UK Foresight Committee. He has been a member of the Consultative Workgroup on Financial Innovation of the European Securities and Markets Authority (ESMA) and is part of the Scientific Committee of the French regulator (AMF). In addition, he chairs Euronext's Index Advisory Group, working on topics like Smart Beta and Factor Investing.
He has published many academic papers on the use of stochastic control and stochastic algorithms to optimize trading flows with respect to flexible constraints. He has also authored papers on post-trade analysis, market impact estimation and modelling the dynamics of limit order books. He co-authored the book "Market Microstructure in Practice" and co-edited the book "Market Microstructure: Confronting Many Viewpoints", and is co-organizer of the eponymous conference taking place every even year in December in Paris. Charles-Albert is one of the managing editors of the "Market Microstructure and Liquidity" academic journal.
Fridays 7/4, 21/4, 28/4 and 5/5
Continuous time contract theory models
ABSTRACT: We consider a number of models involving two parties, a principal and an agent. In practice, the principal can be the owner of a firm and the agent a manager hired to run the firm's operations. The two parties may or may not share the same information. The first of these two cases gives rise to a risk-sharing problem in which the principal optimally determines the precise actions that the agent has to follow. The second gives rise to a problem that may involve moral hazard, in the sense that the agent can take actions that are not in the best interest of the principal. We develop a complete analysis of the models we consider, with emphasis on the important underlying ideas. The four lectures are structured to be relatively independent.
Fridays 12/5 and 19/5
Equilibrium models with frictions
ABSTRACT:
Part I: Equilibrium Liquidity Premia (joint work with Bruno Bouchard, Masaaki Fukasawa, and Martin Herdegen) We study equilibrium returns in a continuous-time model, where heterogeneous mean-variance investors trade subject to quadratic transaction costs. We show that the unique equilibrium is characterised by a system of coupled but linear forward-backward stochastic differential equations. Explicit solutions are obtained in a number of concrete settings. The corresponding liquidity premia compared to the frictionless case are mean-reverting; they are positive if the more risk-averse agents are net sellers.
Part II: A Risk-Neutral Equilibrium Leading to Uncertain-Volatility Pricing (joint work with Marcel Nutz)
We study the formation of derivative prices in equilibrium between risk-neutral agents with heterogeneous beliefs about the dynamics of the underlying. Under the condition that the derivative cannot be shorted, we prove the existence of a unique equilibrium price and show that it incorporates the speculative value of possibly reselling the derivative. This value typically leads to a bubble; that is, the price exceeds the autonomous valuation of any given agent. Mathematically, the equilibrium price operator is of the same nonlinear form that is obtained in single-agent settings with strong aversion against model uncertainty. Thus, our equilibrium leads to a novel interpretation of this price.
Propagation of uncertainty
Fridays 15/01, 22/01, 29/01 and 12/02
An introduction to Feynman-Kac integration and genealogical tree based particle models
Slides: Lecture 1, Lecture 2, Lecture 3, Lecture 4.
Labs (.sce): Lab 1, Lab 2.