# Expected value

long-run average value of a random variable

In probability theory and statistics, the expected value of a random variable ${\displaystyle X}$ (written ${\displaystyle E(X)}$[1]) is the long-run average value the variable takes: if the experiment is repeated many times and the mean (or weighted average) of the observed values is calculated, that mean approaches the expected value.[2]

When rolling a die repeatedly, the running mean converges to the expected value of 3.5

By definition, the expected value of a discrete random variable ${\displaystyle X}$ is calculated by the formula ${\displaystyle \sum _{x}xP(X=x)}$, where ${\displaystyle P(X=x)}$ is the probability that ${\displaystyle X}$ equals ${\displaystyle x}$, and ${\displaystyle x}$ ranges over all possible values of ${\displaystyle X}$.[3]
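The formula above can be applied directly to a fair six-sided die. A minimal sketch in Python, using exact fractions to avoid rounding:

```python
from fractions import Fraction

# Probability mass function of a fair six-sided die:
# each outcome 1..6 has probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E(X) = sum over x of x * P(X = x)
expected_value = sum(x * p for x, p in pmf.items())

print(expected_value)  # 7/2, i.e. 3.5
```

The same loop works for any discrete distribution: replace the `pmf` dictionary with the outcomes and probabilities of interest.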

The law of large numbers describes how this convergence happens: as the number of trials grows, the sample mean gets closer and closer to the expected value.
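This convergence can be observed in a short simulation. The sketch below rolls a simulated die many times and computes the sample mean, which lands near 3.5 (the seed value is arbitrary, chosen only for reproducibility):

```python
import random

random.seed(0)  # arbitrary fixed seed, for reproducibility

n_rolls = 100_000
total = 0
for _ in range(n_rolls):
    total += random.randint(1, 6)  # one fair die roll

sample_mean = total / n_rolls
print(sample_mean)  # close to the expected value 3.5
```

Increasing `n_rolls` makes the sample mean cluster ever more tightly around 3.5, which is exactly what the law of large numbers predicts.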

## Relationship to weighted average

It is possible to think of the expected value as a weighted average, where the weights ${\displaystyle w_{j}}$ are equal to ${\displaystyle P(X=x_{j})}$. The sum over all probabilities, ${\displaystyle \sum _{j}P(X=x_{j})}$, equals one. This allows us to write the weighted average as:

${\displaystyle {\frac {\sum _{j}w_{j}x_{j}}{\sum _{j}w_{j}}}=\sum _{x}xP(X=x)=E(X)}$
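The identity above can be checked numerically. A small sketch for the fair-die example, where every weight is ${\displaystyle 1/6}$:

```python
values = [1, 2, 3, 4, 5, 6]
weights = [1 / 6] * 6  # w_j = P(X = x_j); the weights sum to 1

# Weighted average: sum of w_j * x_j divided by sum of w_j
weighted_average = sum(w * x for w, x in zip(weights, values)) / sum(weights)

# Expected value: sum of x * P(X = x); no division needed
expectation = sum(w * x for w, x in zip(weights, values))

# Because the weights sum to 1, the two formulas give the same number.
print(weighted_average, expectation)
```

Dividing by ${\displaystyle \sum _{j}w_{j}=1}$ changes nothing, which is why the expected value is a special case of the weighted average.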

## References

1. "List of Probability and Statistics Symbols". Math Vault. 2020-04-26. Retrieved 2020-08-21.
2. "Expected Value - easily explained! | Data Basecamp". 2021-11-26. Retrieved 2022-07-15.
3. "Expected Value | Brilliant Math & Science Wiki". brilliant.org. Retrieved 2020-08-21.