# Shannon's Law

The Shannon-Hartley capacity theorem, more commonly known as Shannon's Law, relates the capacity of a channel to the average received signal power, the average noise power, and the bandwidth.

This capacity relationship can be stated as:

${\displaystyle C=W\log_{2}\left(1+{S \over N}\right)}$

where:

- C is the capacity of the channel (bits/s)
- S is the average received signal power (watts)
- N is the average noise power (watts)
- W is the bandwidth (hertz)

Shannon's work showed that the values of S, N, and W set a limit on the transmission rate. [1]
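The capacity relationship above can be sketched directly in Python. The channel parameters below (a 3.1 kHz voice-band channel at 30 dB SNR) are illustrative values, not figures from the source:

```python
import math

def shannon_capacity(bandwidth_hz: float, signal_w: float, noise_w: float) -> float:
    """Channel capacity C = W * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_w / noise_w)

# Hypothetical example: a 3.1 kHz channel with a 30 dB signal-to-noise ratio.
snr_linear = 10 ** (30.0 / 10)                 # 30 dB -> a power ratio of 1000
capacity = shannon_capacity(3100.0, snr_linear, 1.0)
print(f"{capacity:.0f} bit/s")                 # roughly 31 kbit/s
```

Note that only the ratio S/N matters in the formula, so the signal and noise powers can be given in any consistent units.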

It is important to note that doubling the bandwidth will NOT double the available capacity. The parameter N, often defined as the average noise power in an AWGN (Additive White Gaussian Noise) system, itself depends on the bandwidth. To double the capacity, the argument of the logarithm, 1 + S/N, must instead be squared:

${\displaystyle C=W\log_{2}\left(\left(1+{S \over N}\right)^{2}\right)}$
${\displaystyle C=2W\log_{2}\left(1+{S \over N}\right)}$

## References

[1] Shannon, C. E., "Communication in the Presence of Noise," Proc. IRE, vol. 37, no. 1, pp. 10-21, January 1949.