variance of a sum of iid random variables

>-)
Posts: 508
Joined: Tue Apr 24, 2012 1:10 am UTC

variance of a sum of iid random variables

Postby >-) » Tue Jan 31, 2017 4:04 am UTC

consider y_1, ..., y_n to be i.i.d. random variables with mean u, and let X = sum from i = 1 to n of y_i
then E[X]^2 = (nu)^2
E[X^2] = E[(sum i = 1 to n of y_i)^2]
= E[sum i, j from 1,1 to n,n of y_i y_j]
= sum i,j from 1,1 to n,n of E[y_i y_j] (by linearity of expectation)
= sum i,j from 1,1 to n,n of E[y_i] E[y_j] (by independence of y_i)
= n^2 u^2 (there are n^2 terms, each with value u^2)

so Var[X] = E[X^2] - E[X]^2 = n^2 u^2 - (nu)^2 = 0

but nowhere have we assumed that the y_i have zero variance. if the y_i have nonzero variance, the conclusion Var[X] = 0 is clearly wrong, but i cannot find the error!

i devised this conundrum while solving a homework exercise. i discarded this answer and came up with the correct one, but cannot figure out what is wrong with this "proof".
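
as a sanity check, here is a minimal simulation sketch (the normal distribution and the particular n, mean, and standard deviation are arbitrary choices just for illustration): the empirical Var[X] comes out near n * sigma^2 rather than 0.

import numpy as np

# arbitrary illustrative choices: n iid terms, each normal with mean mu and std sigma
rng = np.random.default_rng(0)
n, trials = 10, 100_000
mu, sigma = 2.0, 3.0

# each row is one sample of (y_1, ..., y_n); summing across a row gives one sample of X
X = rng.normal(mu, sigma, size=(trials, n)).sum(axis=1)

print("empirical Var[X]:", X.var())       # roughly 90, clearly nonzero
print("n * sigma^2:    ", n * sigma**2)   # 90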

Cauchy
Posts: 599
Joined: Wed Mar 28, 2007 1:43 pm UTC

Re: variance of a sum of iid random variables

Postby Cauchy » Tue Jan 31, 2017 7:03 am UTC

>-) wrote: sum i,j from 1,1 to n,n of E[y_i y_j]
= sum i,j from 1,1 to n,n of E[y_i] E[y_j] (by independence of y_i)


This is not true, because, for example, y_1 and y_1 are not independent. You end up with n(n-1) terms of the form E[y_i] E[y_j] for i != j, and n terms of the form E[y_i^2].
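
Writing σ^2 for Var[y_i] (so E[y_i^2] = σ^2 + u^2), the corrected sum works out as

E[X^2] = sum over i != j of E[y_i] E[y_j] + sum over i of E[y_i^2]
= n(n-1) u^2 + n(σ^2 + u^2)
= n^2 u^2 + n σ^2

so Var[X] = E[X^2] - E[X]^2 = n σ^2, the usual formula for the variance of a sum of n independent variables.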
(∫|p|²)(∫|q|²) ≥ (∫|pq|)²
Thanks, skeptical scientist, for knowing symbols and giving them to me.

>-)
Posts: 508
Joined: Tue Apr 24, 2012 1:10 am UTC

Re: variance of a sum of iid random variables

Postby >-) » Tue Jan 31, 2017 1:15 pm UTC

ah, nice spot, I'd somehow forgotten about those terms

