convergence in probability


Quick Reference

A sequence of random variables x1, …, xn, … converges in probability to a random variable x if, for every positive number ε, the probability that the (Euclidean) distance between xn and x exceeds ε converges to zero as n tends to infinity, so P[|xn − x| ≥ ε] → 0 as n → ∞ for every ε > 0. This means that if we consider the sequence of probabilities Pn = P[|xn − x| ≥ ε], then from some n0 onwards each probability in this sequence is arbitrarily small. In particular, x can be a constant. Convergence in probability implies convergence in distribution (the converse does not hold, in general).
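The shrinking sequence of probabilities Pn can be illustrated by simulation. A minimal sketch, assuming a Monte Carlo setup not from the entry itself: here xn is the sample mean of n fair coin flips, which by the law of large numbers converges in probability to the constant x = 0.5, and `exceed_prob` (a hypothetical helper name) estimates Pn = P[|xn − 0.5| ≥ ε] for increasing n.

```python
import random

def exceed_prob(n, eps=0.1, trials=2000, rng=random.Random(0)):
    """Monte Carlo estimate of P[|sample mean of n flips - 0.5| >= eps]."""
    count = 0
    for _ in range(trials):
        # Sample mean of n Bernoulli(0.5) draws.
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            count += 1
    return count / trials

# Each estimated P_n is smaller than the last as n grows,
# consistent with convergence in probability to 0.5.
probs = [exceed_prob(n) for n in (10, 100, 1000)]
print(probs)
```

For ε = 0.1 the estimate drops from roughly 0.75 at n = 10 to essentially zero at n = 1000, matching the requirement that Pn becomes arbitrarily small beyond some n0.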

Subjects: Economics.
