Bitrate limits and the Shannon-Hartley theorem
In both long-range and short-range communication, the goal is to maximize bitrate and distance within the constraints of spectrum and noise. The Shannon-Hartley theorem combines work by Claude Shannon of MIT in the 1940s (C. E. Shannon (1949/1998). The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press) and Ralph Hartley of Bell Labs in the 1920s (R. V. L. Hartley (July 1928). "Transmission of Information". Bell System Technical Journal). Foundational work was developed by Harry Nyquist, also of Bell Labs, who determined the maximum number of pulses (or bits) that could be transmitted over a telegraph channel in a unit of time (H. Nyquist, "Certain Topics in Telegraph Transmission Theory", Transactions of the American Institute of Electrical Engineers, vol. 47, no. 2, pp. 617-644, April 1928).
Essentially, Nyquist developed the sampling limit that determines how much theoretical bandwidth one has at a given sample rate. This is called the Nyquist rate, and is shown in the following equation:

fp ≤ 2B
Here, fp is the pulse frequency (pulses per second) and B is the bandwidth in hertz. This states that the maximum pulse rate is limited to twice the bandwidth. Looking at it another way, the equation identifies the minimum rate at which a finite-bandwidth signal needs to be sampled to retain all of its information. Undersampling leads to aliasing and distortion.
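As a quick numerical sketch of the two readings of the Nyquist relation (the function names here are illustrative, not from the text):

```python
def max_pulse_rate(bandwidth_hz: float) -> float:
    """Nyquist limit: a channel of bandwidth B supports at most
    2*B pulses (symbols) per second."""
    return 2.0 * bandwidth_hz

def min_sample_rate(max_signal_freq_hz: float) -> float:
    """The dual statement: a signal whose highest frequency component
    is f must be sampled at a rate of at least 2*f to retain all of
    its information; sampling below this causes aliasing."""
    return 2.0 * max_signal_freq_hz

# A 4 kHz voice channel carries at most 8,000 pulses per second,
# and conversely must be sampled at 8 kHz or above.
print(max_pulse_rate(4_000))   # 8000.0
print(min_sample_rate(4_000))  # 8000.0
```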
Hartley then devised a way to quantify information in what is called the line rate. The line rate can be thought of as bits per second (for example, Mbps). This is known as Hartley's law and is the precursor to Shannon's theorem. Hartley's law simply states that the maximum number of distinguishable pulse amplitudes that can be transmitted reliably is limited by the dynamic range of the signal and the precision with which the receiver can interpret each individual level. Shown is Hartley's law in terms of M (the number of unique pulse amplitudes), which is equivalent to one plus the ratio of the signal's dynamic range A to the receiver's voltage precision ΔV:

M = 1 + A/ΔV
Converting the equation to a base-2 logarithm gives us the line rate R in bits per second:

R = fp log2(M)
If we combine this with the preceding Nyquist rate, we get the maximum number of bits that can be transmitted over a single channel of bandwidth B:

R ≤ 2B log2(M)

Hartley, however, did not work out precisely how the value of M (the number of distinct pulses) could be affected by noise.
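The combined Nyquist-Hartley line rate can be sketched as follows (function name is my own):

```python
import math

def hartley_line_rate(bandwidth_hz: float, m_levels: int) -> float:
    """Hartley's line rate combined with the Nyquist pulse limit:
    R = 2 * B * log2(M), in bits per second, where M is the number
    of distinguishable pulse amplitudes."""
    return 2.0 * bandwidth_hz * math.log2(m_levels)

# A 3 kHz channel with 4 distinguishable amplitude levels carries
# 2 bits per pulse, for a maximum of 12 kbps.
print(hartley_line_rate(3_000, 4))  # 12000.0
```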
Shannon reinforced Hartley's equation by considering the effects of Gaussian noise, completing it with a signal-to-noise ratio. Shannon also introduced the concept of error-correction coding instead of using individually distinguishable pulse amplitudes. The resulting equation is affectionately known as the Shannon-Hartley theorem:

C = B log2(1 + S/N)
Here, C is the channel capacity in bits per second, B is the channel bandwidth in hertz, S is the average received signal power in watts, and N is the average noise power on the channel in watts. The effect of this equation is subtle but important. Every decibel of noise added relative to the signal reduces capacity precipitously. Likewise, improving the signal-to-noise ratio increases capacity. Without any noise, the capacity would be infinite.
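A minimal sketch of the capacity calculation (helper names are illustrative; note the formula takes a linear power ratio, so decibel values must be converted first):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N) in bits/second.
    snr_linear is the power ratio S/N (watts/watts), not decibels."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def db_to_linear(db: float) -> float:
    """Convert a decibel power ratio to a linear ratio."""
    return 10.0 ** (db / 10.0)

# A 1 MHz channel at 20 dB SNR (S/N = 100):
print(shannon_capacity(1_000_000, db_to_linear(20.0)))
```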
It is also possible to extend the Shannon-Hartley theorem by adding a multiplier n to the equation, giving C = nB log2(1 + S/N). Here, n represents additional antennas or pipes. We have reviewed this previously as multiple-input, multiple-output (MIMO) technology.
To understand how Shannon's rule applies to the limits of the wireless systems mentioned in this book, we need to express the equation in terms of energy per bit rather than the signal-to-noise ratio (SNR). A useful exercise in practice is to determine the minimum SNR needed to achieve a certain bitrate. For example, if we want to transmit C = 200 kbps over a channel with a bandwidth of B = 5,000 kHz, then the minimum SNR is given by:

S/N = 2^(C/B) − 1 = 2^(200/5000) − 1 ≈ 0.028, or about −15.5 dB
This shows that it is possible to transmit data using a signal that is weaker than the background noise.
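The worked example can be reproduced by inverting the capacity formula for S/N (the bandwidth here is taken as 5,000 kHz; the function name is my own):

```python
import math

def min_snr_for_rate(rate_bps: float, bandwidth_hz: float) -> float:
    """Invert C = B * log2(1 + S/N) to get the minimum linear SNR
    needed to sustain the target rate on the given bandwidth."""
    return 2.0 ** (rate_bps / bandwidth_hz) - 1.0

# 200 kbps over a 5,000 kHz channel:
snr = min_snr_for_rate(200e3, 5_000e3)
snr_db = 10.0 * math.log10(snr)
print(round(snr, 4), round(snr_db, 1))  # ~0.0281 linear, ~-15.5 dB
```

Because the linear SNR comes out below 1 (a negative value in dB), the signal can indeed be weaker than the noise floor.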
There is a limit to the data rate, however. To show the effect, let Eb represent the energy of a single bit of data in joules, and let N0 represent the noise spectral density in watts/hertz. Eb/N0 is a dimensionless ratio (though usually expressed in dB) that represents the SNR per bit, commonly known as the power efficiency. Power efficiency removes the biases of modulation technique, error coding, and signal bandwidth from the equation. We assume the system is perfect and ideal, such that R = C, where R is the throughput. Since S = Eb × R and N = N0 × B, the Shannon-Hartley theorem can be rewritten as:

C/B = log2(1 + (Eb/N0)(C/B)), which gives Eb/N0 ≥ (2^(C/B) − 1) / (C/B)
This is known as the Shannon limit for an Additive White Gaussian Noise (AWGN) channel. AWGN is a basic channel model used commonly in information theory to express the effects of the random processes found in nature. These noise sources are always present and include things such as thermal vibrations, black-body radiation, and the residual effects of the Big Bang. The "white" aspect of the noise implies that equal amounts of noise are added at every frequency.
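A sketch of the limit as a function of spectral efficiency; as C/B approaches zero, the bound approaches ln(2), the well-known −1.59 dB floor (the function name is my own):

```python
import math

def min_ebno(spectral_efficiency: float) -> float:
    """Shannon limit on energy-per-bit over noise density for an
    AWGN channel: Eb/N0 >= (2**eta - 1) / eta, where eta = C/B
    is the spectral efficiency in bits/s/Hz."""
    return (2.0 ** spectral_efficiency - 1.0) / spectral_efficiency

# As eta -> 0, the bound tends to ln(2) ~ 0.693, i.e. about -1.59 dB:
eta = 1e-6
print(10.0 * math.log10(min_ebno(eta)))  # ~-1.59
```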
The limit can be drawn on a graph showing spectral efficiency versus SNR per bit:
Regions of interest in the figure include the R > C "Impossible Region", which lies above the Shannon limit curve; no reliable form of information exchange can operate above the limit line. The region below the Shannon limit, where R < C, is called the "Realizable Region". Every protocol and modulation technique in any form of communication attempts to approach the Shannon limit as closely as possible. We can also see where typical 4G LTE, using various modulation forms, sits.
There are two other regions of interest. A "Bandwidth Limited" region toward the upper right allows for high spectral efficiency at good Eb/N0 values. The only constraint in this space is trading a fixed or mandated spectral efficiency against the unconstrained transmission power P, meaning the capacity has grown significantly relative to the available bandwidth. The opposite is the "Power Limited" region toward the bottom left of the chart, where the Eb/N0 SNR is very low and Shannon's limit therefore forces us down to low values of spectral efficiency. Here, we sacrifice spectral efficiency to achieve a given transmission quality at power P.
The chart also shows some typical modulation schemes used today, such as phase-shift keying, QAM, and others. The Shannon limit also shows that arbitrarily improving a modulation technique, such as moving from 4-QAM to 64-QAM, doesn't scale linearly. The benefit of higher orders of modulation (for example, 64-QAM versus 4-QAM) is that you can transmit more bits per symbol (six versus two). The main disadvantages of higher orders of modulation are:
- Higher-order modulations require ever greater SNR to operate.
- Higher orders of modulation require much more sophisticated circuitry and DSP algorithms, contributing to complexity.
- Increasing the number of bits transferred per symbol increases the error rate.
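The non-linear scaling can be illustrated with the Shannon limit itself. Assuming ideal Nyquist signaling, where spectral efficiency equals bits per symbol, the minimum SNR to support each QAM order grows roughly exponentially with the bits carried (a sketch; the helper names are my own):

```python
import math

def bits_per_symbol(m: int) -> int:
    """Bits carried by one symbol of an M-ary constellation."""
    return int(math.log2(m))

def min_snr_db(spectral_eff: float) -> float:
    """Shannon floor on SNR for a spectral efficiency of
    spectral_eff bits/s/Hz: S/N >= 2**eta - 1, returned in dB."""
    return 10.0 * math.log10(2.0 ** spectral_eff - 1.0)

for m in (4, 16, 64, 256):
    eta = bits_per_symbol(m)
    print(f"{m}-QAM: {eta} bits/symbol, Shannon SNR floor ~{min_snr_db(eta):.1f} dB")
```

Tripling the bits per symbol (4-QAM at 2 bits versus 64-QAM at 6 bits) raises the theoretical SNR floor from about 4.8 dB to about 18 dB, which is why higher-order schemes demand ever greater SNR in practice.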