Empirical Processes in M-Estimation (Cambridge Series in Statistical and Probabilistic Mathematics, Series Number 6) [Illustrated]. ISBN 0521123259, 9780521123259

The theory of empirical processes provides valuable tools for the development of asymptotic theory in (nonparametric) statistics.
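For orientation, a minimal sketch in standard notation of the book's central object, the empirical process indexed by a class of functions (the symbols P_n, nu_n, and the class G are conventional choices, not quoted from the book's text):

\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Standard definition of the empirical process (conventional notation;
% the symbols below are assumptions, not copied from the book itself).
Given i.i.d.\ observations $X_1,\dots,X_n$ with common law $P$ and a class of
functions $\mathcal{G}$, the empirical measure and the empirical process are
\[
  P_n = \frac{1}{n}\sum_{i=1}^{n}\delta_{X_i},
  \qquad
  \nu_n(g) = \sqrt{n}\,\bigl(P_n - P\bigr)(g)
           = \frac{1}{\sqrt{n}}\sum_{i=1}^{n}\bigl(g(X_i) - \mathbb{E}\,g(X_i)\bigr),
  \qquad g \in \mathcal{G}.
\]
% Uniform laws of large numbers (Chapter 3) and Donsker-type central limit
% theorems (Chapter 6) concern the behaviour of $\nu_n$ uniformly over $\mathcal{G}$.
\end{document}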


Language: English · Pages: 300 [302] · Year: 2009


Table of contents:
Cover
Applications of Empirical Process Theory
Series
Title
Copyright
Contents
Preface
Guide to the Reader
1 Introduction
1.1. Some examples from statistics
1.2. Problems and complements
2 Notation and Definitions
2.1. Stochastic order symbols
2.2. The empirical process
2.3. Entropy
2.4. Examples
2.5. Notes
2.6. Problems and complements
3 Uniform Laws of Large Numbers
3.1. Uniform laws of large numbers under finite entropy with bracketing
3.2. The chaining technique
3.3. A maximal inequality for weighted sums
3.4. Symmetrization
3.5. Hoeffding's inequality
3.6. Uniform laws of large numbers under random entropy conditions
3.7. Examples
3.8. Notes
3.9. Problems and complements
4 First Applications: Consistency
4.1. Consistency of maximum likelihood estimators
4.2. Examples
4.3. Consistency of least squares estimators
4.4. Examples
4.5. Notes
4.6. Problems and complements
5 Increments of Empirical Processes
5.1. Random entropy numbers and asymptotic equicontinuity
5.2. Random entropy numbers and classes depending on n
5.3. Empirical entropy and empirical norms
5.4. A uniform inequality based on entropy with bracketing
5.5. Entropy with bracketing and asymptotic equicontinuity
5.6. Modulus of continuity
5.7. Entropy with bracketing and empirical norms
5.8. Notes
5.9. Problems and complements
6 Central Limit Theorems
6.1. Definitions
6.2. Sufficient conditions for 𝒢 to be P-Donsker
6.3. Useful theorems
6.4. Measurability
6.5. Notes
6.6. Problems and complements
7 Rates of Convergence for Maximum Likelihood Estimators
7.1. The main idea
7.2. An exponential inequality for the maximum likelihood estimator
7.3. Convex classes of densities
7.4. Examples
7.5. Notes
7.6. Problems and complements
8 The Non-I.I.D. Case
8.1. Independent non-identically distributed random variables
8.1.1. Maximal inequalities for weighted sums revisited
8.2. Martingales
8.3. Application to maximum likelihood
8.4. Examples
8.5. Notes
8.6. Problems and complements
9 Rates of Convergence for Least Squares Estimators
9.1. Sub-Gaussian errors
9.2. Errors with exponential tails
9.3. Examples
9.4. Notes
9.5. Problems and complements
10 Penalties and Sieves
10.1. Penalized least squares
10.2. Penalized maximum likelihood
10.2.1. Roughness penalty on the density
10.2.2. Roughness penalty on the log-density
10.3. Least squares on sieves
10.4. Maximum likelihood on sieves
10.5. Notes
10.6. Problems and complements
11 Some Applications to Semiparametric Models
11.1. Partial linear models
11.2. Mixture models
11.2.1. Introduction
11.3. A single-indexed model with binary explanatory variable
11.4. Notes
11.5. Problems and complements
12 M-Estimators
12.1. Introduction
12.2. Estimating a regression function using a general loss function
12.3. Classes of functions indexed by a finite-dimensional parameter
12.3.1. Least squares
12.3.2. Maximum likelihood
12.3.3. Asymptotic normality
12.4. Notes
12.5. Problems and complements
Appendix
References
Symbol Index
Author Index
Subject Index
