Theorem 1 (Wasserman (2004, p. 52, Theorem 3.17)) Let \(X_1, \ldots, X_n\) be random variables. Define \[\overline{X}_n = \frac{1}{n}\sum_{i=1}^n X_i,\ S_n^2=\frac{1}{n-1}\sum_{i=1}^n \left(X_i-\overline{X}_n\right)^2\] as the sample mean and sample variance, respectively. If \(X_1, \ldots, X_n\) are independent and identically distributed with \(\mu=\mathbb{E}\left(X_i\right)\) and \(\sigma^2=\mathsf{Var}\left(X_i\right)\), then \[\mathbb{E}\left(\overline{X}_n\right)=\mu,\ \mathsf{Var}\left(\overline{X}_n\right)=\frac{\sigma^2}{n}, \ \mathbb{E}\left(S_n^2\right)=\sigma^2.\]
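
To make the three identities concrete, here is a minimal Monte Carlo sketch in Python, assuming NumPy is available; the normal distribution and the values of \(\mu\), \(\sigma\), \(n\), and the number of replications are illustrative choices, not part of the theorem.

```python
import numpy as np

# Illustrative check of the theorem, assuming X_i ~ N(mu, sigma^2).
# mu, sigma, n, and n_reps are arbitrary choices for the simulation.
rng = np.random.default_rng(0)
mu, sigma, n, n_reps = 2.0, 3.0, 10, 200_000

samples = rng.normal(mu, sigma, size=(n_reps, n))  # each row is one sample X_1, ..., X_n
xbar = samples.mean(axis=1)                        # sample mean of each replication
s2 = samples.var(axis=1, ddof=1)                   # sample variance with the 1/(n-1) divisor

print(xbar.mean(), mu)            # E(X̄_n) ≈ mu
print(xbar.var(), sigma**2 / n)   # Var(X̄_n) ≈ sigma^2 / n
print(s2.mean(), sigma**2)        # E(S_n^2) ≈ sigma^2 (unbiasedness of S_n^2)
```

With a large number of replications, each printed pair should agree to roughly two decimal places, consistent with the three expectations stated in the theorem.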