consider y_1, ..., y_n as i.i.d. random variables with mean u, and let X = sum from i=1 to n of y_i

then E[X] = nu by linearity, so E[X]^2 = (nu)^2

E[X^2] = E[(sum i = 1 to n of y_i)^2]

= E[sum i, j from 1,1 to n,n of y_i y_j]

= sum i,j from 1,1 to n,n of E[y_i y_j] (by linearity of expectation)

= sum i,j from 1,1 to n,n of E[y_i] E[y_j] (by independence of y_i)

= n^2 u^2 (there are n^2 terms, each with value u^2)

so Var[X] = E[X^2] - E[X]^2 = n^2 u^2 - (nu)^2 = 0

but nowhere did we assume the y_i have zero variance. if the y_i have nonzero variance, something is clearly wrong here, but i cannot find it!

i devised this conundrum while solving a homework exercise. i discarded this answer and came up with the correct one, but cannot figure out what is wrong with this "proof".
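(to convince myself the conclusion really is wrong, here's a quick simulation i ran — my own sketch, using uniform y_i, not part of the homework — showing Var[X] comes out near n*Var[y], nowhere near 0:)

```python
import random

# Estimate Var[X] for X = y_1 + ... + y_n with i.i.d. y_i.
# Here y_i is uniform on [0, 1], so u = 0.5 and Var[y_i] = 1/12.
random.seed(0)
n = 10
trials = 100_000

samples = [sum(random.random() for _ in range(n)) for _ in range(trials)]
mean_x = sum(samples) / trials
var_x = sum((x - mean_x) ** 2 for x in samples) / trials

print(mean_x)  # close to n*u = 5.0
print(var_x)   # close to n/12 ~ 0.833, clearly not 0
```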

## variance of a sum of iid random variables


### Re: variance of a sum of iid random variables

> **>-) wrote:**
> sum i,j from 1,1 to n,n of E[y_i y_j]
>
> = sum i,j from 1,1 to n,n of E[y_i] E[y_j] (by independence of y_i)

This is not true, because, for example, y_1 and y_1 are not independent. You end up with n(n-1) terms of the form E[y_i] E[y_j] for i != j, and n terms of the form E[y_i^2].
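To make the diagonal/off-diagonal split concrete, here is a small numeric check (a sketch of my own, assuming uniform-on-[0,1] y_i, so u = 1/2 and E[y^2] = 1/3):

```python
# Corrected count: E[X^2] = n*(n-1)*E[y_i]E[y_j] + n*E[y_i^2]
#                         = n*(n-1)*u^2 + n*E[y^2],
# so Var[X] = E[X^2] - (n*u)^2 = n*(E[y^2] - u^2) = n*Var[y].
n = 10
u = 0.5        # mean of uniform [0, 1]
ey2 = 1 / 3    # E[y^2] for uniform [0, 1]

e_x2 = n * (n - 1) * u**2 + n * ey2   # off-diagonal + diagonal terms
var_x = e_x2 - (n * u) ** 2           # Var[X] = E[X^2] - E[X]^2

print(var_x)  # n/12, i.e. n * Var[y], not 0
```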

(∫|p|^2)(∫|q|^2) ≥ (∫|pq|)^2

Thanks, skeptical scientist, for knowing symbols and giving them to me.

### Re: variance of a sum of iid random variables

ah, nice spot, I'd somehow forgotten about those terms
