Expected value

long-run average value of a random variable

In probability theory and statistics, the expected value of a random variable (often written E(X))[1] is the long-run average value of the variable: if the experiment is repeated an infinite number of times, the mean (or weighted average) of all the observed values approaches it.[2]

Rolling a die, the mean converges to the expected value of 3.5

By definition, the expected value of a discrete random variable X is calculated by the formula E(X) = Σᵢ xᵢpᵢ, where pᵢ is the probability that X equals xᵢ, and i ranges over all possible values of X.[3]
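The formula above can be sketched in a few lines of Python. This is a minimal illustration using a fair six-sided die, where every face has probability 1/6:

```python
# Expected value of a discrete random variable: E(X) = sum of x_i * p_i.
# Fair six-sided die: each face 1..6 has probability 1/6.
values = [1, 2, 3, 4, 5, 6]
probabilities = [1 / 6] * 6

expected_value = sum(x * p for x, p in zip(values, probabilities))
print(expected_value)  # 3.5, matching the die example above
```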

The law of large numbers describes how this happens: as the number of trials grows, the mean of the observed values converges to the expected value.

Relationship to weighted average

It is possible to think of the expected value as a weighted average, where the weights wᵢ are equal to the probabilities pᵢ. The sum over all probabilities equals one (Σᵢ pᵢ = 1). This allows us to write the weighted average as:

E(X) = (Σᵢ xᵢwᵢ) / (Σᵢ wᵢ) = Σᵢ xᵢpᵢ

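The weighted-average view also works for unfair distributions. As an illustration, here is a hypothetical biased die that lands on 6 half the time; the weights are the probabilities, and since they sum to one the weighted average reduces to the expected-value formula:

```python
# Expected value as a weighted average with weights w_i = p_i.
# Hypothetical biased die: face 6 comes up with probability 0.5.
values = [1, 2, 3, 4, 5, 6]
weights = [0.1, 0.1, 0.1, 0.1, 0.1, 0.5]

assert abs(sum(weights) - 1) < 1e-12  # probabilities sum to one
weighted_average = sum(x * w for x, w in zip(values, weights))
print(weighted_average)  # higher than 3.5, since the die favors 6
```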
References

  1. "List of Probability and Statistics Symbols". Math Vault. 2020-04-26. Retrieved 2020-08-21.
  2. "Expected Value - easily explained! | Data Basecamp". 2021-11-26. Retrieved 2022-07-15.
  3. "Expected Value | Brilliant Math & Science Wiki". brilliant.org. Retrieved 2020-08-21.