Statistics Expectation



Introduction

The expectation E(X) of a random variable is defined as the sum of the products of each possible value and its probability: E(X) = Σ xi.pi for a discrete distribution, with the corresponding integral of x.f(x) for a continuous distribution.  The expectation is valid for discrete and continuous distributions, and is synonymous with the mean value.
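The discrete definition above can be sketched in a few lines of code. This is an illustrative sketch, not from the original page; the function name and the coin example are assumptions.

```python
def expectation(values, probs):
    """Return E(X) = sum of value * probability for a discrete distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(v * p for v, p in zip(values, probs))

# Example (assumed): a fair coin scored 0 for tails, 1 for heads.
print(expectation([0, 1], [0.5, 0.5]))  # 0.5
```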



Symbols

In the notes below the probability distributions relate to the probability of successes.  In practice they could equally relate to failures, or to outcomes with desired values (e.g. a dice throw of 6).

X = random variable
μ = population/distribution mean
E(X) = expected mean of random variable
Var(X) = variance of random variable
σ² = distribution variance
σ = distribution standard deviation
f(x) = probability density function
P(k) = probability that k successes occur in n trials
F(x) = probability distribution function
xm = arithmetic mean of sample
sx² = variance of sample
sx = standard deviation of sample
N = number of items in distribution
n = number of trials/sample items
M = number of success items in options (= pN)
k = number of successes
p = probability of success, 0 < p < 1
pi = probability that x = xi
xi = discrete value of random sample




Notes

Consider a typical probability distribution.  The expectation is effectively the horizontal position of the centroid of the area under the density curve.
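The centroid interpretation can be checked numerically. This is an assumed example (not from the original page) using the density f(x) = x/2 on [0, 2], whose expectation is 4/3; a midpoint Riemann sum recovers the same value as the centroid of the area.

```python
# Numerical check: E(X) equals the horizontal centroid of the area under f(x).
# Density f(x) = x/2 on [0, 2] is an assumed example (total area = 1).
n = 100_000
dx = 2.0 / n
xs = [(i + 0.5) * dx for i in range(n)]          # midpoints of the strips
area = sum((x / 2) * dx for x in xs)             # total probability, ~1.0
centroid = sum(x * (x / 2) * dx for x in xs) / area
print(round(centroid, 4))  # ~1.3333, i.e. 4/3
```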

Example 1:
It is clear that if four coins are tossed the expected number of heads is 2.  This is confirmed as shown below.

The probability of 0 heads = (1/2)^4 = 1/16
The probability of 1 head  = 4.(1/2)^4 = 1/4
The probability of 2 heads = 6.(1/2)^4 = 3/8
The probability of 3 heads = 4.(1/2)^4 = 1/4
The probability of 4 heads = (1/2)^4 = 1/16

E(X) = 0.(1/16) + 1.(1/4) + 2.(3/8) + 3.(1/4) + 4.(1/16) = 2
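Example 1 can be confirmed by brute force: enumerate all 16 equally likely outcomes of four coin tosses and average the head counts. This is an illustrative sketch, not part of the original page.

```python
from itertools import product

# All 2^4 = 16 equally likely outcomes; 1 = heads, 0 = tails.
outcomes = list(product([0, 1], repeat=4))
expected_heads = sum(sum(o) for o in outcomes) / len(outcomes)
print(expected_heads)  # 2.0
```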

Example 2:
Expectation on rolling a fair die.
E(X) = 1.(1/6) + 2.(1/6) + 3.(1/6) + 4.(1/6) + 5.(1/6) + 6.(1/6) = 21/6 = 3.5 (halfway between 3 and 4)

Properties of Expected values

1) E(aX) = a.E(X)

2) E(aX + b) = aE(X) + b

3) Var(X) = E[(X - μ)²] = E(X²) - μ² = E(X²) - [E(X)]²

4) E(X+Y) = E(X) + E(Y) ... X and Y are two random variables.

5) E(X.Y) = E(X).E(Y) ...X and Y are two independent variables

6) Var(cX) = c² Var(X)

7) Var(X +Y) = Var(X) + Var(Y).... X and Y are two independent variables

8) Var(X +c) = Var(X)
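Properties 2), 3), 6) and 8) can be verified numerically on a small discrete distribution. The values, probabilities and constants below are assumed for illustration; they are not from the original page.

```python
values = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]

def E(f):
    """Expectation of f(X) over the discrete distribution above."""
    return sum(f(x) * p for x, p in zip(values, probs))

mu = E(lambda x: x)                # E(X) = 3.0 for this distribution
var = E(lambda x: (x - mu) ** 2)   # Var(X) = 1.0

a, b, c = 2.0, 5.0, 3.0
assert abs(E(lambda x: a * x + b) - (a * mu + b)) < 1e-9          # property 2
assert abs(var - (E(lambda x: x * x) - mu ** 2)) < 1e-9           # property 3
assert abs(E(lambda x: (c * x - c * mu) ** 2) - c**2 * var) < 1e-9  # property 6
# Property 8: shifting X by a constant leaves the variance unchanged.
assert abs(E(lambda x: ((x + b) - (mu + b)) ** 2) - var) < 1e-9
print(mu, var)  # 3.0 1.0
```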

Proof of 2) E(aX + b) = aE(X) + b
E(aX + b) = Σ(axi + b).pi = aΣxi.pi + bΣpi = aE(X) + b, since Σpi = 1.

Proof of 3) Var(X) = E(X²) - μ²
Var(X) = E[(X - μ)²] = E(X² - 2μX + μ²) = E(X²) - 2μE(X) + μ² = E(X²) - 2μ² + μ² = E(X²) - μ²

Proof of 6) Var(cX) = c² Var(X)
Var(cX) = E[(cX)²] - [E(cX)]² = c²E(X²) - c²[E(X)]² = c² Var(X)

Useful Related Links
  1. MathsRevision - Expectation ... Clear tutorial notes
  2. Stanford University Tutorial Notes ... Quite detailed notes
  3. Concepts in probability theory - Mathematical Expectation ... A very good description of the concept of expectation



Send Comments to Roy Beardmore

Last Updated 11/10/2007