📖 Glossary 📖
Here is the requested excerpt from our glossary:
Expected value: The expected value $E[X]$ of a discrete random variable $X$ is defined by $E[X]=\sum_i P[X=x_i]\,x_i.$ For example, the expected value of a random variable $X$ describing a single throw of a fair die is $\frac{1}{6}(1+2+3+4+5+6)=3.5.$ For a continuous random variable $X$ with density $f$, the sum in the definition is replaced by the integral $E[X]=\int\limits_{-\infty}^{\infty} xf(x)\,dx.$ By the law of large numbers, the average of a large number of independent repetitions of the random experiment described by $X$ converges to $E[X].$
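As a minimal sketch of both ideas (the helper function name `expected_value` is our own, not part of any library), the snippet below computes $E[X]$ for a fair die directly from the definition and then checks, via simulation, that the sample mean of many throws approaches it, as the law of large numbers predicts:

```python
import random

def expected_value(outcomes, probs):
    """Expected value of a discrete random variable: sum of P[X = x_i] * x_i."""
    return sum(p * x for x, p in zip(outcomes, probs))

# A fair six-sided die: each face has probability 1/6.
faces = [1, 2, 3, 4, 5, 6]
exact = expected_value(faces, [1 / 6] * 6)  # 3.5

# Law of large numbers: the average of many independent throws
# converges to E[X] as the number of throws grows.
random.seed(0)
n = 100_000
sample_mean = sum(random.choice(faces) for _ in range(n)) / n

print(exact)        # 3.5
print(sample_mean)  # close to 3.5
```

With $n = 100{,}000$ throws the sample mean typically lands within a few hundredths of 3.5; increasing $n$ tightens the agreement.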