de Gruyter Studies in Mathematics 2 Editors: Heinz Bauer · Peter Gabriel
Michel Métivier
Semimartingales: A Course on Stochastic Processes
W DE G Walter de Gruyter · Berlin · New York · 1982
Author: Michel Métivier, Professor at the École Polytechnique, Centre de Mathématiques Appliquées, Palaiseau Cedex
Library of Congress Cataloging in Publication Data. Métivier, Michel, 1931- Semimartingales : a course on stochastic processes. (De Gruyter studies in mathematics ; 2) Bibliography: p. Includes indexes. 1. Stochastic processes. I. Title. II. Series. QA274.M47 1982 519.2'87 82-17986 ISBN 3-11-008674-3
CIP short-title record (CIP-Kurztitelaufnahme) of the Deutsche Bibliothek. Métivier, Michel: Semimartingales : a course on stochastic processes / Michel Métivier. - Berlin ; New York : de Gruyter, 1982. (De Gruyter studies in mathematics ; 2) ISBN 3-11-008674-3 NE: GT
© Copyright 1982 by Walter de Gruyter & Co., Berlin. All rights reserved, including those of translation into foreign languages. No part of this book may be reproduced in any form - by photoprint, microfilm, or any other means - nor transmitted nor translated into a machine language without written permission from the publisher. Typesetting and Printing: Fa. Tutte, Salzweg. - Binding: Lüderitz & Bauer Buchgewerbe GmbH, Berlin. - Cover design: Rudolf Hübler, Berlin. Printed in Germany.
Preface
This book has its origin in courses given by the author in Erlangen in 1976, in lectures given in Berkeley during the summer of 1979, and in a course in München in the second semester of 1980. Until recently, many important results in the general theory of stochastic processes, in particular those developed by the "Strasbourg school", were considered by many probabilists as devices only for specialists in the field. It turns out, however, that the growing interest in non-Markovian processes and point processes, for example, because of their importance in modelling complex systems, makes it more and more important for "non-specialists" to be acquainted with concepts such as martingales, semimartingales, predictable projection, stochastic integrals with respect to semimartingales, etc. Fortunately, the mathematical thinking of the past ten years has not only produced new and sophisticated results but also makes it possible to present in a quite concise way a corpus of basic notions and tools, which may be regarded as essential for what is, after all, the goal of many: the description of stochastic systems, the ability to study their behaviour and the possibility of writing formulas and computational algorithms to evaluate and identify them (without mentioning their optimization!). Over the years, the description of stochastic processes was based on the consideration of moments and in particular covariance. A more modern trend is to give a "dynamical" description based on the consideration of the evolution law of the processes. This is perfectly appropriate for the study of Markov processes. In this case the "dynamical structure" of the process leads to equations providing users with formulas and equations to describe and compute its evolution.
But more generally one may give a "dynamical description" of a process, Markovian or not, by considering its relation with an increasing family (F_t)_{t∈R_+} of σ-algebras of events, where F_t expresses the information theoretically available up to time t. The notion of generator of a Markov process has, in the case of non-Markovian processes, a kind of substitute, which may be expressed in terms of a "dual predictable projection". In this general setting, the notions of martingales, semimartingales, stopping times and predictability play a fundamental role. Stochastic equations are also appropriate tools for describing general stochastic systems, and the stochastic calculus cannot be developed without the same notions of martingales, semimartingales, predictability and stopping times.
The purpose of this book is precisely to present these fundamental concepts in their full force in a rather concise way and to show, through exercises and paragraphs devoted to applications, what they are useful for. The first part of the book, which contains Chapters 1 to 4, is devoted to the exposition of fundamental notions and results for processes. Chapter 1, after giving examples of processes, deals mainly with the two basic concepts of stopping times and predictability. Chapter 2 gives a quite extensive treatment of martingales and quasimartingales as to their regularity and convergence properties. Applications to the study of some stochastic algorithms give some indication of the power of this tool in the study of asymptotic properties. Chapter 3 studies the properties of processes admitting a σ-additive Doléans measure. This point of view provides a general and synthetic understanding of many properties, particularly those tied to Dellacherie's predictable projection and dual predictable projection. The Doob-Meyer decomposition theorem finds here its most natural and easiest exposition. We are also able to give the version of this theorem for Banach-valued processes. The introduction of point processes as measure-valued processes at the end of this chapter is an illustration of the interest of such an extension. Chapter 4 concludes this part of the book with the study of the Hilbert space of square integrable martingales. This is in some sense the "modern dynamic analogue" of the classical theory of second order processes (in the non-stationary situation). Semimartingales are then introduced as the sum of a process which is "locally" a square integrable martingale and a process with paths of finite variation. A recent characterization of these processes through a quite useful domination property closes the chapter. The way is thus opened to stochastic calculus in Chapters 5 and 6. The second part is actually devoted to stochastic calculus.
Chapter 5 constructs the stochastic integral with respect to a semimartingale. This integral, which is a mapping from processes into processes, is defined as a continuous extension of the integral of elementary predictable processes for seminorms associated with the control processes of the semimartingale. The main properties of the stochastic integral are immediate from this definition. As in the previous chapters, we consider separately the real case and the vector case. The transformation formula, which is the core of stochastic calculus, is then proved in its various forms. Chapter 6 presents the most classical and elementary applications of the transformation formula. It introduces typical "martingale problems" and questions of absolute continuity. The stochastic integral with respect to point processes (more generally, random measures) is studied in Chapter 7. The consideration of the point process of jumps of a semimartingale leads naturally to the concept of the local characteristics of a semimartingale. Martingale problems are introduced in their general form at the end of this chapter. Stochastic equations are the subject of Chapter 8. The equations considered are generalizations of the Itô-Skorokhod equations, where the driving terms are general
semimartingales and "white random measures". Strong solutions are proved to exist and to be unique under a quite general Lipschitz-type hypothesis. Non-explosion criteria are given. A pathwise regularity result for strong solutions depending on a parameter is proved. This chapter, and the book, close with a short introduction to the problem of weak solutions of a stochastic equation in simple cases. It is the intention of this book to provide the reader with many examples as close as possible to situations of practical interest and to help him through a bibliography covering many extensions and technical developments. To the first goal correspond the numerous exercises and, to the second, a few bibliographical notes at the end of the chapters.
Acknowledgements: The author wishes to express his gratitude to Professors H. Bauer, P. Gabriel and W. Schuder, who accepted this book in the new collection edited by de Gruyter. He feels much indebted to the many people who helped very much during the preparation of the manuscript, especially Professors E. Wong and H. Kellerer who, in Berkeley and München respectively, offered opportunities to deliver advanced courses on the subject. Thanks are due to many colleagues who expressed valuable criticism and encouragement on successive drafts. It is my pleasure to mention in a very special way Professors K. L. Chung, J. Jacod, J. Pellaumail, C. Dellacherie, P. A. Meyer, G. Letta, G. Da Prato, J. Groh and S. Orey. A special mention is deserved by Mrs J. Bailleul, who did most of the successive typings of the manuscript. My thanks are finally due to the staff at de Gruyter for displaying a lot of patience in our final cooperation. Paris, July 1982
Michel Metivier
Contents
Part I: Martingales - Quasimartingales - Semimartingales

Chapter 1: Basic notions on stochastic processes
1. Stochastic basis - Stochastic processes  3
2. Examples and construction of stochastic processes  5
3. Well-measurable (or optional) and predictable processes  9
4. Stopping times  11
5. The σ-algebras F_T and F_{T−}  17
6. Admissible measures  22
7. Decomposition theorems for stopping times  28
Exercises and supplements  32
Historical and bibliographical notes  38

Chapter 2: Martingales and quasimartingales - Basic inequalities and convergence theorems - Application to stochastic algorithms
8. Martingales, submartingales, supermartingales, quasimartingales: elementary properties  40
9. Doob's inequalities for real quasimartingales and the almost sure convergence theorem  46
10. Uniform integrability - Convergence in L^p - Regularity properties of trajectories  53
11. Convergence theorems for vector-valued quasimartingales  60
12. A typical application of quasimartingale convergence theorems: convergence of stochastic algorithms  66
Exercises and supplements  72
Historical and bibliographical notes  79

Chapter 3: Quasimartingales of class [L.D] - Predictable and dual predictable projection of processes
13. Doléans measure of an [L.D]-quasimartingale  80
14. Predictable projection of a process and dual predictable projection of an admissible measure  88
15. The predictable F.V. process of an admissible measure and the Doob-Meyer decomposition of a quasimartingale  94
Exercises and supplements  102
Historical and bibliographical notes  110

Chapter 4: Square integrable martingales and semimartingales
16. Spaces of real L²-martingales  112
17. The first increasing process and orthogonality of L²-martingales  115
18. The L²-stochastic integral and the quadratic variation of an L²-martingale  120
19. Stopped martingales - Inequalities  128
20. Spaces of Hilbert-valued martingales  132
21. The process ⟨M⟩ of a square integrable Hilbert-valued martingale  136
22. The isometric stochastic integral with respect to Hilbert-valued martingales  142
23. Localisation of processes and semimartingales  148
Exercises and supplements  158
Historical and bibliographical notes  163

Part II: Stochastic Calculus

Chapter 5: Stochastic integral with respect to semimartingales and the transformation formula
24. Stochastic integral in the real case  168
25. Quadratic variation and the transformation theorem  175
26. Stochastic integral with respect to multidimensional semimartingales and tensor quadratic variation  182
27. The transformation formula in the multidimensional case  188
Exercises and supplements  191
Historical and bibliographical notes  196

Chapter 6: First applications of the transformation theorem
28. Characterizations of Brownian and Poisson processes  198
29. Exponential formulas and linear stochastic differential equations  202
30. Absolutely continuous changes of probability  207
Exercises and supplements  212
Historical and bibliographical notes  215

Chapter 7: Random measures and local characteristics of a semimartingale
31. Stochastic integral with respect to white random measures  217
32. Local characteristics of a semimartingale - Diffusions - Martingale problems  231
Exercises and supplements  235
Historical and bibliographical notes  238

Chapter 8: Stochastic differential equations
33. Examples of stochastic equations - Definitions  240
34. Strong solutions under Lipschitz hypotheses  243
35. Conditions for non-explosion  252
36. Pathwise regularity of solutions of equations depending on a parameter  255
37. Weak solutions of some stochastic differential equations  262
Exercises and supplements  267
Historical and bibliographical notes  271

Bibliography  273
Index of notation  284
Index  286
Part I

Martingales - Quasimartingales - Semimartingales
Chapter 1 Basic notions on stochastic processes
The aim of this chapter is to give the minimal account of stochastic processes necessary to deal with quasimartingales and their applications. The notions presented here are developed in the books by Meyer and Dellacherie [Mey 1], [Del 1], [DeM 1]. In some sense, what follows is our own self-contained abbreviated account and is extremely close to our presentation in [Met 3].
1. Stochastic basis - Stochastic processes

Stochastic basis

A stochastic basis is a quadruple (Ω, (F_t)_{t∈T}, 𝔄, P) where Ω is a set, T a subset of R_+, 𝔄 a σ-algebra of subsets of Ω, (F_t)_{t∈T} an increasing family of sub-σ-algebras of 𝔄 (increasing means: s ≤ t, s, t ∈ T ⇒ F_s ⊆ F_t), and P a probability on (Ω, 𝔄). When 𝔄 is not specified, it is assumed that 𝔄 equals the σ-algebra generated by ∪_{t∈T} F_t, denoted by ⋁_{t∈T} F_t. When T = R_+, the σ-algebra ⋁_{t∈R_+} F_t is denoted F_∞. The family (F_t)_{t∈T} is often called a filtration of Ω. When the filtration is clear, we will abbreviate and write (F_t) or F. instead of (F_t)_{t∈T}. The "physical" meaning of F_t is the following: F_t is the σ-algebra of events occurring up to time t - the "past events up to t". With the family (F_t)_{t∈R_+} are associated the following families (F_{t+})_{t∈R_+} and (F_{t−})_{t∈R_+} of σ-algebras:

F_{t+} := ∩_{s>t} F_s,
F_{t−} := ⋁_{s<t} F_s  (the σ-algebra which is generated by ∪_{s<t} F_s).
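The notion of an increasing family of σ-algebras can be made concrete in the simplest discrete setting. The sketch below is our own toy illustration (the coin-toss space and all names are our assumptions, not from the text): each F_t is represented by the partition of Ω that generates it, and the increasing property F_s ⊆ F_t appears as the statement that the later partition refines the earlier one.

```python
from itertools import product

# Toy filtration: Omega = {0,1}^3 (three coin tosses), and F_t is generated
# by the first t coordinates. Each sigma-algebra is represented here by the
# partition of Omega into its atoms.
n = 3
omega = list(product((0, 1), repeat=n))

def atoms(t):
    """Atoms of F_t: outcomes grouped by their first t coordinates."""
    blocks = {}
    for w in omega:
        blocks.setdefault(w[:t], []).append(w)
    return sorted(blocks.values())

def refines(p, q):
    """True if every block of partition p lies inside some block of q."""
    return all(any(set(b) <= set(c) for c in q) for b in p)

# Increasing family: s <= t implies F_s ⊆ F_t,
# i.e. the partition at time t refines the partition at time s.
assert all(refines(atoms(t), atoms(s))
           for s in range(n + 1) for t in range(s, n + 1))
```

Here F_0 is trivial (one atom, all of Ω) while F_3 separates all outcomes, matching the idea of F_t as "the past events up to t".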
Given a semigroup (T_t)_{t>0} of transition probabilities and a point x ∈ 𝒳, we can always construct a stochastic basis and, on this basis, a Markov process (X_t)_{t∈R_+} with transitions (T_t) and such that X_0 = x a.s. We can indeed always take Ω = 𝒳^{R_+}, for (X_t)_{t≥0} the projection process, and for the associated filtration the one defined in 2.1. By virtue of Kolmogorov's theorem quoted above, a probability P_x on (Ω, 𝔄), where 𝔄 is the σ-algebra generated by {X_t : t ∈ R_+}, is uniquely defined by the iterated integrals (2.7.2)
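A discrete-time, finite-state analogue of this canonical construction can be carried out by direct simulation. A minimal sketch, with a hypothetical two-state transition matrix P of our own choosing (not the book's): the law P_x is realised on path space by iterating the kernel, and X_0 = x holds by construction.

```python
import random

# state -> list of (next_state, probability); an arbitrary illustrative kernel
P = {0: [(0, 0.5), (1, 0.5)],
     1: [(0, 0.2), (1, 0.8)]}

def step(state, rng):
    """Draw one transition from the kernel P."""
    u, acc = rng.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if u < acc:
            return nxt
    return P[state][-1][0]

def sample_path(x, n, rng):
    """One coordinate-process trajectory (X_0, ..., X_n) with X_0 = x a.s."""
    path = [x]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

rng = random.Random(0)
paths = [sample_path(0, 5, rng) for _ in range(10_000)]
# Every path starts at x = 0, matching the requirement X_0 = x a.s.
assert all(p[0] == 0 for p in paths)
```

The empirical one-step frequencies of such samples approximate the kernel entries, which is the finite analogue of the iterated-integral formula determining P_x.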
⊇ σ(ℳ). We consider ]s,t] × F ∈ ℳ. There exists a sequence (φ_n) of continuous real functions φ_n on R_+ such that 1_{]s,t]} = lim_n φ_n. The predictable σ-algebra is therefore also generated by the family

{]s,t] × F : s ≤ t, F ∈ ∪_r F_r} ∪ {{0} × F : F ∈ F_0}.

As a consequence, we obtain

(3.3.1)  σ(ℛ_1) ⊆ σ(ℛ_2) = σ(ℛ_2') ⊆ σ(ℛ_1').
Since the process 1_{[0,T[} is right-continuous, it is moreover regular iff it is adapted. This last property is equivalent to: {T ≤ t} ∈ F_t for every t. Therefore, the proposition is clear. □

4.3 Proposition. Let S and T be two stopping times. Then S ∨ T and S ∧ T are stopping times. If (S_n) is a sequence of stopping times, then sup_n S_n is a stopping time.

Proof: This follows immediately from Proposition 4.2 and the relations

1_{[0, S∨T[} = sup(1_{[0,T[}, 1_{[0,S[}),
1_{[0, S∧T[} = inf(1_{[0,T[}, 1_{[0,S[}),
1_{[0, sup_n S_n[} = sup_n 1_{[0,S_n[}.  □
4.4 Proposition. Every stopping time relative to the family of σ-algebras (F_t) is a stopping time with respect to (F_{t+}). T is a stopping time with respect to (F_{t+}) iff 1_{[0,T]} is predictable or, equivalently, iff {T < t} ∈ F_t for every t.

Proof: The first statement of the proposition follows immediately from the definition and the inclusion F_t ⊆ F_{t+}. Let us remark now that, according to the definition of (F_{t+}), the following properties are equivalent:

(a) {T ≤ t} = ∩_{n∈ℕ} {T < t + 1/n} ∈ F_{t+} for every t;
(b) {T < t} ∈ F_t for every t.

The property (b) is equivalent to the adaptation property of the left continuous process 1_{[0,T]} and, therefore, according to Theorem 3.3, to the predictability of 1_{[0,T]}. □

Corollary. Let (S_n) be a sequence of stopping times. Then S := inf_n S_n is a stopping time for (F_{t+}).

Proof: The process 1_{[0,S]} = inf_n 1_{[0,S_n]} is in fact predictable, and one applies 4.4. □
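The defining property {T ≤ t} ∈ F_t of a stopping time is easy to test by hand in discrete time, where F_t-measurability just means "decided by the path up to time t". A small illustration of our own (not from the text): a first hitting time is decided by the observed prefix of the path, while the last visit to a level requires knowledge of the whole future and so is not a stopping time.

```python
def first_hit(path, level):
    """First t with path[t] >= level; determined by path[:t+1] alone."""
    for t, s in enumerate(path):
        if s >= level:
            return t
    return float("inf")

def last_zero(path):
    """Last t with path[t] == 0; depends on the whole path (NOT a stopping time)."""
    return max((t for t, s in enumerate(path) if s == 0), default=float("inf"))

# Two paths that agree up to time 2 but differ afterwards:
a = [0, 1, 2, 3, 4]
b = [0, 1, 2, 1, 0]
# {T <= 2} is decided identically from the common prefix (stopping-time property):
assert (first_hit(a, 2) <= 2) == (first_hit(b, 2) <= 2)
# The last visit to 0 is not decided by the prefix: it differs between the paths.
assert last_zero(a) != last_zero(b)
```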
Stopping times
13
4.5 Definition. Let S and T be two stopping times with S ≤ T. We define the stochastic intervals ]S,T] (resp. [S,T], [S,T[, ]S,T[) as follows:

]S,T] := {(t,ω): S(ω) < t ≤ T(ω), t ∈ R_+, ω ∈ Ω},
[S,T] := {(t,ω): S(ω) ≤ t ≤ T(ω), t ∈ R_+, ω ∈ Ω},
[S,T[ := {(t,ω): S(ω) ≤ t < T(ω), t ∈ R_+, ω ∈ Ω},
]S,T[ := {(t,ω): S(ω) < t < T(ω), t ∈ R_+, ω ∈ Ω}.

We use also the following notation:

[T] := {(t,ω): t = T(ω) < ∞, t ∈ R_+, ω ∈ Ω}.

[T] is known as the graph of the stopping time, but it must be stressed that the stochastic intervals ]S,T], ]S,T[, etc., and [T] are by definition subsets of R_+ × Ω, while T and S are allowed to take their values in R̄_+.

4.6 Proposition. The σ-algebra 𝒫 of predictable sets is generated by the family of stochastic intervals of the following form: ]S,T] and {0} × F, F ∈ F_0. 𝒫 is also generated by the family of closed stochastic intervals {[0,T]: T stopping time}.

Proof: According to Proposition 3.3, the R.L.C. processes 1_{]S,T]}, 1_{[0,T]}, 1_{{0}×F} are predictable. Conversely, since, for every F ∈ F_s and s < t, the random variable S := s·1_F + t·1_{F^c} is a stopping time, the process 1_{]s,t]×F} can be written 1_{]S,t]}, where ]S,t] is a stochastic interval. Following Proposition 3.3, the family of processes

{1_{]S,T]}: S ≤ T, S and T stopping times} ∪ {1_{{0}×F}: F ∈ F_0}

generates 𝒫. This proves the first part of Proposition 4.6. Since ]S,T] = [0,T] − [0,S] and {0} × F = ∩_{n∈ℕ} [0,T_n] with T_n a suitable stopping time for F ∈ F_0, the second part of the proposition follows immediately from the first. □

We will prove later (Proposition 5.7) that 𝒫 is generated by the stochastic intervals [0,T[ when the "usual hypotheses" are fulfilled.

4.7 Proposition. The graph [T] of every stopping time T is an optional subset of R_+ × Ω.

Proof: It is readily checked that, for every n ∈ ℕ, the random variable

T_n := ([nT] + 1)/n

is a stopping time. Since

[T] = ∩_{n∈ℕ} [T, T_n[,

the optionality of [T] follows from Proposition 4.2. □
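The countably-valued approximations T_n used in this proof are easy to compute. A short sketch under our reading of the (partly garbled) formula T_n := ([nT] + 1)/n, with [x] the integer part; the function name and the sample value of T are ours:

```python
import math

def approx(T, n):
    """T_n := (floor(n*T) + 1)/n: a stopping time taking countably many values."""
    return (math.floor(n * T) + 1) / n

T = math.pi / 10          # an arbitrary (irrational) sample value of T(omega)
# T_n is strictly larger than T for every n, and T_n -> T as n -> infinity,
# which is what makes [T] the intersection of the intervals [T, T_n[.
assert all(approx(T, n) > T for n in range(1, 1000))
assert approx(T, 10**6) - T < 1e-5
```

Since each T_n takes only values of the form k/n, it is a countably-valued stopping time dominating T, and the intervals [T, T_n[ shrink down to the graph [T].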
4.8 Remark and example. The question arises from Proposition 4.7 whether [T] is not only optional but also predictable. We give the following example, supposedly due to Dellacherie, which shows that this is not always the case. We set Ω = [0,1]. For every t < 1, F_t stands for the σ-algebra of subsets of Ω which is generated by ]t,1] and the Borel subsets of [0,t]. For every t ≥ 1, F_t is the Borel σ-algebra of Ω. We define

T(ω) := ω.

It is easily seen that T is a stopping time. Because of the definition of F_s for s < 1, the following holds for every F ∈ F_s and s < t ≤ 1: F ∩ ]s,1] is either ∅ or ]s,1]. This says that the trace of 𝒫 on [0,T] is the same as the trace on [0,T] of the family {]s,t] × Ω: s < t ≤ 1} ∪ {{0} × Ω}. Hence, it follows immediately that the predictable subsets of [0,T] are the intersections [0,T] ∩ (B × Ω), where B is any Borel subset of [0,1]. This shows that [T] cannot be predictable.
Predictable stopping times

It should be noted that there exist stopping times with a predictable graph. Constant stopping times are the simplest examples. This example and Remark 4.8 give full meaning to the following definition.

4.9 Definition. A stopping time T is called predictable when [T] is a predictable subset of R_+ × Ω.
As an example of predictable stopping times which are not constant, let us consider the random variable T + α, where T is any stopping time and α a strictly positive number. It is readily checked that T + α is a stopping time, and from

[T + α] = ∩_{n > 1/α} ]T + α − 1/n, T + α]

follows the predictability of [T + α], and therefore of T + α. At this point it should be emphasized that a stopping time with finitely many values is not necessarily predictable (see Exercise 5).

4.10 Proposition. A stopping time T is predictable iff ]0,T[ is predictable. This is also equivalent to the predictability of [0,T[.

Proof: Since ]0,T] is predictable for every stopping time T (Proposition 4.4), ]0,T[ = ]0,T] − [T] is predictable if and only if [T] is predictable. Since {T > 0} ∈ F_0, we have {0} × {T > 0} ∈ 𝒫. Therefore, the stochastic intervals [0,T[ and ]0,T[ are predictable or not at the same time. □
4.11 Proposition. Let S and T be two predictable stopping times. Then S ∧ T and S ∨ T are also predictable stopping times. For every sequence (T_n) of predictable stopping times, sup_n T_n is a predictable stopping time. For every stopping time T, there exists a decreasing sequence (T_n) of predictable stopping times such that T = inf_n T_n.

Proof: The first two statements of this proposition are a straightforward consequence of Proposition 4.10 and of the following equalities:

[0, S ∧ T[ = [0,S[ ∩ [0,T[,
[0, S ∨ T[ = [0,S[ ∪ [0,T[,
[0, sup_n T_n[ = ∪_n [0,T_n[.

As already noticed, the random variables T_n := T + 1/n are predictable stopping times. This proves the third statement. □
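The interval identities in this proof can be sanity-checked pointwise, since for fixed ω the indicator of [0,T[ at time t is simply 1 if t < T(ω) and 0 otherwise. A throwaway numerical check of our own, with arbitrary values for S, T and a finite family (T_n):

```python
def ind(t, T):
    """Indicator of the interval [0, T[ evaluated at time t (omega fixed)."""
    return 1 if t < T else 0

S, T = 1.25, 2.5
Tn = [2.5 - 1.0 / n for n in range(1, 50)]    # an increasing family of times
grid = [k * 0.01 for k in range(400)]

for t in grid:
    # [0, S∧T[ = [0,S[ ∩ [0,T[  <->  pointwise minimum of indicators
    assert ind(t, min(S, T)) == min(ind(t, S), ind(t, T))
    # [0, S∨T[ = [0,S[ ∪ [0,T[  <->  pointwise maximum of indicators
    assert ind(t, max(S, T)) == max(ind(t, S), ind(t, T))
    # for a finite family, the sup of the indicators is the indicator of the max
    assert max(ind(t, x) for x in Tn) == ind(t, max(Tn))
```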
4.12 Corollary. The σ-algebra 𝒫 is generated by the family

{[0,T[: T predictable stopping time}

of stochastic intervals. The following family is also a system of generators: {]S,T[: S and T predictable stopping times, S ≤ T}.
Beginning of a set

4.13 Definition. Let A be a subset of R_+ × Ω. We call the beginning of A the R̄_+-valued random function D_A defined by

D_A(ω) := inf {t ∈ R_+: (t,ω) ∈ A}  if {t ∈ R_+: (t,ω) ∈ A} ≠ ∅;
D_A(ω) := +∞                        if {t ∈ R_+: (t,ω) ∈ A} = ∅.

When the "usual hypotheses" (see Definition 1.1) are fulfilled, the following proposition appears as a trivial consequence of Proposition 4.15 below and its Corollary 4.16. However, we think it is interesting to mention the following elementary property of beginnings, which holds without any special assumption about the filtration (F_t)_{t∈R_+}.
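For a trajectory observed on a discrete time grid, the beginning D_A can be computed literally from Definition 4.13. A minimal sketch (the grid discretization, the sample path, and all names are our simplification, not from the text):

```python
import math

def beginning(times, path, in_A):
    """D_A(omega): inf of grid times t with (t, omega) in A; +inf if none."""
    hits = [t for t, x in zip(times, path) if in_A(t, x)]
    return min(hits) if hits else math.inf

times = [k * 0.1 for k in range(50)]
path = [math.sin(t) - 0.5 for t in times]     # one fixed trajectory omega

# A = {(t, omega): X_t(omega) >= 0}: the beginning is the first entry time.
D = beginning(times, path, lambda t, x: x >= 0)
assert D == min(t for t, x in zip(times, path) if x >= 0)

# If A is never visited on this horizon, D_A = +infinity, as in the definition.
assert beginning(times, path, lambda t, x: x > 10) == math.inf
```

On this grid the path first reaches 0 near t ≈ π/6, so D lands on the first grid point past that value; the infimum convention of the definition is what the `min`/`math.inf` pair reproduces.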
4.14 Proposition. (1) Let X be a real, adapted, continuous process indexed by R_+ and A = {X = 0}. Then the beginning of A is a predictable stopping time.
(2) Let X be an R.R.C. process indexed by R_+ with values in R^d and let G be an open set. The beginning of the set A = {(t,ω): X(t,ω) ∉ G} is a stopping time for the filtration (F_{t+}).

Proof: (1) The continuity of X implies

{D_A > t} = ∪_{n∈ℕ} ∩_{r∈ℚ, r≤t} {|X_r| ≥ 1/n}.

4.15 Proposition. (1) Let S and T be two stopping times with S ≤ T. Then F_S ⊆ F_T and F_{S−} ⊆ F_{T−}.
(2) Let (T_n) be a monotone sequence of stopping times with limit T.
a) If (T_n) is decreasing and if F_t = F_{t+} for every t ∈ R_+, then F_T = ∩_{n∈ℕ} F_{T_n}.
b) If (T_n) is increasing, then F_{T−} = ⋁_n F_{T_n−}.

Proof: (1) Let S and T be two stopping times with S ≤ T. From the equality F ∩ {T ≤ t} = (F ∩ {S ≤ t}) ∩ {T ≤ t}, it immediately follows that F_S ⊆ F_T. Since F ∩ {t < S} = (F ∩ {t < S}) ∩ {t < T}, we immediately have that every generator of F_{S−} is also a generator of F_{T−}. Hence, F_{S−} ⊆ F_{T−}.