Information Capacity Theorem
The document shows that the sum of two independent Gaussian random variables is itself Gaussian, and that its variance equals the sum of the individual variances. In other words, if X ~ N(0, σ₁²) and Y ~ N(0, σ₂²) are independent, then X + Y ~ N(0, σ₁² + σ₂²).
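This property is easy to verify empirically. The sketch below (an illustration not taken from the document; the chosen variances 4 and 9 are arbitrary) draws samples from two independent zero-mean Gaussians and checks that the sample variance of their sum is close to the sum of the individual variances.

```python
import random
import statistics

# Monte Carlo check: if X ~ N(0, sigma1^2) and Y ~ N(0, sigma2^2)
# are independent, then Var(X + Y) = sigma1^2 + sigma2^2.
random.seed(0)
sigma1, sigma2 = 2.0, 3.0      # X ~ N(0, 4), Y ~ N(0, 9)
n = 200_000

# Draw independent samples and form the sum Z = X + Y.
z = [random.gauss(0.0, sigma1) + random.gauss(0.0, sigma2)
     for _ in range(n)]

var_z = statistics.pvariance(z)
print(var_z)  # close to sigma1**2 + sigma2**2 = 13
```

With 200,000 samples the estimate typically lands within a few hundredths of the theoretical value of 13; Gaussianity of the sum could likewise be checked with a normality test or a histogram.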