
Information Capacity Theorem

The document shows that when two independent Gaussian random variables are added, the result is also Gaussian, with variance equal to the sum of the individual variances.


We know that if two independent Gaussian random variables are added, the variance of the resulting Gaussian random variable is the sum of the variances. Therefore,
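The equation that followed "Therefore" did not survive extraction. As a sketch only, assuming the standard additive white Gaussian noise setup used in the information capacity theorem (transmitted sample X with power P, independent Gaussian noise N with variance \sigma^2; these symbols are not from the original), the step likely reads:

% Sketch, not the original slide's equation. Y = X + N with X, N
% independent Gaussians, so Var(Y) = Var(X) + Var(N) = P + sigma^2,
% and Y is Gaussian with that variance. Its differential entropy is:
\[
  \operatorname{Var}(Y) = \operatorname{Var}(X) + \operatorname{Var}(N) = P + \sigma^2,
  \qquad
  h(Y) = \tfrac{1}{2}\log_2\!\bigl(2\pi e\,(P + \sigma^2)\bigr).
\]
% Subtracting the noise entropy h(N) = (1/2) log_2(2 pi e sigma^2)
% gives the familiar per-sample capacity of the Gaussian channel:
\[
  C = h(Y) - h(N) = \tfrac{1}{2}\log_2\!\Bigl(1 + \frac{P}{\sigma^2}\Bigr)
  \ \text{bits per sample}.
\]

Here h(.) denotes differential entropy; the final expression is the per-sample form of the Shannon–Hartley capacity, which is where the variance-addition step is normally used.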
