# Expected value

long-run average value of a random variable

In probability theory and statistics, the expected value of a random variable $X$ (represented by the symbol $E(X)$ ) is the average value the variable takes over many repetitions of an experiment: if the experiment were repeated an infinite number of times, and the mean (or weighted average) of all the observed values were calculated along the way, that mean would approach $E(X)$.

By definition, the expected value of a discrete random variable $X$ is calculated by the formula $\sum _{x}xP(X=x)$ , where $P(X=x)$ is the probability that $X$ equals $x$ , and $x$ ranges over all possible values of $X$ .
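The formula can be sketched directly in Python. The function name and the fair-die example below are illustrative, not part of any standard library; `fractions.Fraction` is used so the probabilities stay exact.

```python
from fractions import Fraction

def expected_value(pmf):
    """Expected value of a discrete random variable.

    pmf maps each possible value x to P(X = x);
    the result is sum over x of x * P(X = x).
    """
    return sum(x * p for x, p in pmf.items())

# Example: a fair six-sided die, each face with probability 1/6.
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expected_value(die))  # 7/2, i.e. 3.5
```

Note that the expected value need not be a possible outcome: a die never shows 3.5, yet 3.5 is the long-run average of its rolls.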

The law of large numbers describes how this happens: as the number of repetitions grows, the running mean of the observed values converges to $E(X)$.
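This convergence can be seen in a small simulation. The sketch below (a hypothetical helper, using only the standard `random` module) averages simulated fair-die rolls; as the number of rolls increases, the average settles near $E(X)=3.5$.

```python
import random

def running_mean(n_rolls, seed=0):
    """Average of n_rolls simulated fair-die rolls.

    By the law of large numbers, this approaches 3.5
    as n_rolls grows.
    """
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(n_rolls)) / n_rolls

for n in (100, 10_000, 100_000):
    print(n, running_mean(n))
```

With few rolls the average can be noticeably off; with many rolls it lands very close to 3.5.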

## Relationship to weighted average

It is possible to think of the expected value as a weighted average, where the weights $w_{j}$ are equal to $P(X=x_{j})$. The sum over all probabilities, $\sum _{j}P(X=x_{j})$, equals one. This allows us to write the weighted average as:

${\frac {\sum _{j}w_{j}x_{j}}{\sum _{j}w_{j}}}=\sum _{x}xP(X=x)=E(X)$
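The identity can be checked numerically. In this sketch (function name illustrative), the same answer comes out whether the weights are probabilities summing to one or merely proportional counts, because the division by $\sum_{j}w_{j}$ normalizes them.

```python
def weighted_average(values, weights):
    # sum(w_j * x_j) / sum(w_j); when the weights are probabilities,
    # the denominator is 1 and this reduces to E(X).
    return sum(w * x for x, w in zip(values, weights)) / sum(weights)

values = [0, 1]
probs = [0.75, 0.25]   # probabilities, already summing to 1
counts = [3, 1]        # proportional weights with the same ratios

print(weighted_average(values, probs))   # 0.25
print(weighted_average(values, counts))  # 0.25
```

Both calls give 0.25, matching $E(X)$ for a variable that is 1 with probability 0.25 and 0 otherwise.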