# 2.11.1 The sum of i.i.d. random variables

Suppose that $\mathbf z_1, \mathbf z_2, \dots, \mathbf z_N$ are i.i.d. (independently and identically distributed) random variables, discrete or continuous, each having a probability distribution with mean $\mu$ and variance $\sigma^2$. Let us consider two derived r.v.s, that is the sum

$$\mathbf S_N=\mathbf z_1+\mathbf z_2+\dots +\mathbf z_N$$

and the average

$$\label{eq-averagez} \tag{2.9.57} \bar{\mathbf z}=\frac{\mathbf z_1+\mathbf z_2+\dots +\mathbf z_ N}{N}$$

The following relations hold:

\begin{align} E[\mathbf S_ N] &= N \mu , \qquad \text{Var}\left[\mathbf S_ N \right]=N \sigma ^2 \label{eq-averagez2} \tag{2.9.58} \\ E[\bar{\mathbf z}] &= \mu , \qquad \text {Var}\left[\bar{\mathbf z} \right]=\frac{\sigma ^2}{N} \end{align}
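These relations follow from the linearity of expectation and, for the variance, from the independence of the $\mathbf z_i$ (the variance of a sum of independent r.v.s is the sum of the variances). For the average $\bar{\mathbf z}=\mathbf S_N/N$ the derivation is a one-liner:

$$E[\bar{\mathbf z}]=\frac{1}{N}\sum_{i=1}^{N}E[\mathbf z_i]=\frac{N\mu}{N}=\mu, \qquad \text{Var}\left[\bar{\mathbf z}\right]=\frac{1}{N^2}\sum_{i=1}^{N}\text{Var}\left[\mathbf z_i\right]=\frac{N\sigma^2}{N^2}=\frac{\sigma^2}{N}$$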

An illustration of these relations by simulation can be obtained by running the following R script.

```r
# sum_rv.R
# Script: shows the relation between the variance of a single r.v. z
# and the variances of z.bar (sum of N i.i.d. random variables)
# and of their average

rm(list=ls())
R <- 10000   # number of realizations of each variable
N <- 1000    # number of summed variables
sdev <- 10
mu <- 1

z <- rnorm(R, mean=mu, sd=sdev)  # z is normally distributed
print(var(z))
hist(z, main=paste("single r.v.: mean=", mu, " variance=", sdev^2))  # see the shape of z

z.bar <- rep(0, R)  # z.bar accumulates the sum of N i.i.d. r.v.s
for (n in 1:N) {
  z.bar <- z.bar + rnorm(R, mean=mu, sd=sdev)
}

print(var(z.bar)/var(z))  # should be close to N
hist(z.bar, main=paste("Sum of ", N, " r.v.s (mu=", mu, ", var=", sdev^2,
                       "): variance ~", round(var(z.bar))))  # see the shape of z.bar

aver <- z.bar/N
hist(aver, main=paste("Average of ", N, " r.v.s (mu=", mu, ", var=", sdev^2,
                      "): variance ~", round(var(aver), digits=1)))  # see the shape of aver
```