Definition (Convergence in Probability). The random variable $x_n$ converges in probability to a constant $c$ if
$$\lim_{n \to \infty} \operatorname{Prob}\bigl(|x_n - c| > \varepsilon\bigr) = 0 \quad \text{for any positive } \varepsilon.$$
If $x_n$ converges in probability to $c$, then we write
$$\operatorname{plim}\, x_n = c.$$
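As an illustrative sketch (not part of the definition), assuming NumPy is available: let $x_n$ be the mean of $n$ Uniform$(0,1)$ draws, so the relevant constant is $c = 0.5$, and estimate the tail probability by Monte Carlo. The value of $\varepsilon$ and the replication count are arbitrary illustrative choices.

```python
# Illustrative Monte Carlo check of convergence in probability (hypothetical example).
# x_n is the mean of n Uniform(0,1) draws; its probability limit is c = 0.5.
import numpy as np

rng = np.random.default_rng(0)
c, eps, reps = 0.5, 0.05, 5_000

for n in (10, 100, 1_000):
    # Each row is one realization of x_n; estimate Prob(|x_n - c| > eps) across replications.
    x_n = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(x_n - c) > eps)
    print(f"n = {n:>5}:  estimated Prob(|x_n - c| > {eps}) = {prob:.4f}")
```

The estimated probability falls toward zero as $n$ grows, which is exactly the displayed limit.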
Theorem (Convergence in Quadratic Mean). If $x_n$ has mean $\mu_n$ and variance $\sigma_n^2$ such that the ordinary limits of $\mu_n$ and $\sigma_n^2$ are $c$ and $0$, respectively, then $x_n$ converges in mean square to $c$, and $\operatorname{plim}\, x_n = c$.
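A sketch of the standard argument, via Markov's inequality applied to $(x_n - c)^2$:
$$\operatorname{Prob}\bigl(|x_n - c| > \varepsilon\bigr) \le \frac{E\bigl[(x_n - c)^2\bigr]}{\varepsilon^2} = \frac{\sigma_n^2 + (\mu_n - c)^2}{\varepsilon^2} \longrightarrow 0 \quad \text{as } \mu_n \to c \text{ and } \sigma_n^2 \to 0.$$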
Definition (Consistent Estimator). An estimator $\hat{\theta}_n$ of a parameter $\theta$ is a consistent estimator of $\theta$ if and only if $\operatorname{plim}\, \hat{\theta}_n = \theta$.
Theorem (Consistency of the Sample Mean). The mean $\bar{x}_n$ of a random sample from any population with finite mean $\mu$ and finite variance $\sigma^2$ is a consistent estimator of $\mu$.
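A sketch of why this follows from the convergence-in-quadratic-mean theorem: the sample mean has
$$E[\bar{x}_n] = \mu \quad \text{and} \quad \operatorname{Var}[\bar{x}_n] = \frac{\sigma^2}{n} \longrightarrow 0,$$
so $\bar{x}_n$ converges in quadratic mean to $\mu$, and therefore $\operatorname{plim}\, \bar{x}_n = \mu$.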
Theorem (Lindeberg-Levy Central Limit Theorem (Univariate)). If $x_1, \ldots, x_n$ are a random sample from a probability distribution with finite mean $\mu$ and finite variance $\sigma^2$, and $\bar{x}_n = \frac{1}{n}\sum_{i=1}^{n} x_i$, then
$$\sqrt{n}\,(\bar{x}_n - \mu) \xrightarrow{\;d\;} N\bigl[0, \sigma^2\bigr].$$
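As an illustrative simulation (a sketch under assumed inputs, not part of the theorem), assuming NumPy: draw samples from an Exponential$(1)$ population, for which $\mu = 1$ and $\sigma^2 = 1$, and check that $\sqrt{n}\,(\bar{x}_n - \mu)$ has mean near $0$, variance near $\sigma^2$, and vanishing skewness as $n$ grows, even though the parent distribution is strongly skewed.

```python
# Illustrative check of the Lindeberg-Levy CLT (hypothetical example).
# Population: Exponential(1), so mu = 1 and sigma^2 = 1, a clearly non-normal parent.
import numpy as np

rng = np.random.default_rng(0)
mu, reps = 1.0, 10_000

for n in (5, 50, 500):
    xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
    z = np.sqrt(n) * (xbar - mu)          # approximately N(0, sigma^2) for large n
    skew = np.mean(((z - z.mean()) / z.std()) ** 3)
    print(f"n = {n:>3}:  mean(z) = {z.mean():+.3f},  var(z) = {z.var():.3f},  skew = {skew:+.3f}")
```

The simulated mean and variance stay near $0$ and $\sigma^2 = 1$, while the skewness shrinks toward $0$ as $n$ grows, as the limiting normal distribution requires.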
Definition (Asymptotic Distribution). An asymptotic distribution is a distribution that is used to approximate the true finite sample distribution of a random variable.
If $\sqrt{n}\,(\bar{x}_n - \mu)/\sigma \xrightarrow{\;d\;} N[0, 1]$, then approximately, or asymptotically, $\bar{x}_n \sim N[\mu, \sigma^2/n]$, which we write as
$$\bar{x}_n \overset{a}{\sim} N\!\left[\mu, \frac{\sigma^2}{n}\right].$$
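A hypothetical numerical check (assuming NumPy; the Exponential$(1)$ population, $n = 100$, and the threshold $1.1$ are arbitrary illustrative choices): the asymptotic distribution $N[\mu, \sigma^2/n]$ gives $\operatorname{Prob}(\bar{x}_n > 1.1) \approx 1 - \Phi(1) \approx 0.159$, which can be compared with a Monte Carlo estimate of the exact finite-sample probability.

```python
# Illustrative check of the asymptotic approximation xbar ~a N(mu, sigma^2/n)
# for an Exponential(1) population (mu = 1, sigma = 1); hypothetical example.
import math
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 100, 1.0, 1.0

# Asymptotic approximation: Prob(xbar > 1.1) ~ 1 - Phi((1.1 - mu) / (sigma / sqrt(n)))
z = (1.1 - mu) / (sigma / math.sqrt(n))
approx = 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))   # 1 - Phi(z)

# Monte Carlo estimate of the exact finite-sample probability Prob(xbar > 1.1)
xbar = rng.exponential(1.0, size=(100_000, n)).mean(axis=1)
exact = np.mean(xbar > 1.1)

print(f"asymptotic approximation: {approx:.4f}   Monte Carlo estimate: {exact:.4f}")
```

The two numbers are close for $n = 100$, which is the sense in which the asymptotic distribution approximates the true finite sample distribution.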