# Is there a minimum amount of power required to sustain a given bandwidth?
November 19, 2007 8:38 AM

What, if any, is the relationship between bandwidth and power?

Is there any fundamental physics relationship between the bandwidth of data transfer (i.e., number of data per time) and the minimum power (i.e., watts) required to sustain such a rate over a given amount of time?

Obviously there are macro-scale power requirements for something like the internet, but that's not what my question is about. I'm wondering about the theoretical limit, if there is one. It seems like there would have to be some minimum amount of power required to move a bit of information from one place to another. Are there papers or research I can read on this topic?
posted by odinsdream to science & nature (14 answers total)

Hmm. Been a long time since I had enough physics to be confident of this, but I can't imagine there being a real minimum. Any action that can be detected is information; a single photon can be treated as a single bit... though the answer might also be "Planck's constant." Hopefully someone with better (read: any) knowledge of quanta can provide a more substantial response.
posted by Tomorrowful at 8:48 AM on November 19, 2007

This is the place to start reading while you're waiting for someone a lot smarter than me to come along and explain this stuff in layman's terms. I think channel capacity might be what you want. The more noise in the channel, the more power required.
posted by Leon at 8:50 AM on November 19, 2007

The minimum power needed is related to the maximum ability to discern the signal from noise.

Meaning, the better your signal detection, the less power you need to use.

Practically, our ability to detect information signals is quite good if we put enough resources toward it. Our radio antennas can receive signals from the Voyager spacecraft over 15 billion kilometers away as it broadcasts with basically a night light as its transmitter.
posted by Argyle at 8:53 AM on November 19, 2007

That's a question that Claude Shannon asked, and answered, when he formulated information theory. Information has a physical cost in energy, and to the extent that information is ordered, transmission losses are a form of entropy. Thus all communications are subject to the Second Law of Thermodynamics.

It turns out that there is indeed an inherent minimum amount of energy per payload bit, but it's a hell of a lot smaller than you might think. It's well below the amount of energy we actually use per payload bit in any existing communications medium.
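For scale, here's a rough back-of-envelope sketch of that floor. It assumes the thermal limit of kT·ln 2 joules per bit at room temperature (the Landauer/Shannon noise floor; the exact figure is my own illustration, not something stated in the thread):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Minimum energy to handle one bit at temperature T (kT * ln 2)
e_bit = k_B * T * math.log(2)   # joules per bit
print(f"~{e_bit:.2e} J per bit")

# Power needed to sustain 1 Gbit/s at this theoretical floor
rate = 1e9  # bits per second
print(f"~{e_bit * rate:.2e} W at 1 Gbit/s")
```

That works out to roughly 3 × 10⁻²¹ joules per bit, i.e. sustaining a gigabit per second at the floor would take only picowatts — which is why real systems are nowhere near it.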

To prevent communication losses, you either have to use more energy, or you have to use more bandwidth. The latter is the basis for DSS, Direct-Sequence Spread Spectrum. It introduces the idea of a "chip," a transmitted symbol which carries just a piece of a payload data bit. If you use lots of chips per bit, and handle everything else correctly, you get what's called "coding gain": an improvement in signal-to-noise ratio that comes without requiring increased transmission power.

That's because DSS, handled properly, can sustain huge transmission losses in chip terms and still reconstruct the payload bits. Roughly speaking, if 49% of the chips for a given bit are trashed but 51% get through properly, a majority vote still recovers that bit correctly.
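A toy illustration of that majority-vote idea (this is a plain repetition code, which is much cruder than real DSS despreading, and the chip counts and error rate are my own example values):

```python
import random

def send_bit(bit, chips=101, chip_error_rate=0.4):
    """Repeat the bit across many chips, corrupt each chip independently,
    then recover the bit by majority vote at the receiver."""
    received = [bit ^ (random.random() < chip_error_rate) for _ in range(chips)]
    return int(sum(received) > chips / 2)

random.seed(0)
trials = 1000
errors = sum(send_bit(1) != 1 for _ in range(trials))
print(f"bit error rate after voting: {errors / trials:.3f}")
```

Even with 40% of chips flipped, the voted-on bit comes through almost every time — that improvement over the raw 40% chip error rate is the coding gain in miniature.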
posted by Steven C. Den Beste at 8:59 AM on November 19, 2007

The transmitter on Voyager 2 was about 30 watts, but it used a dish and aimed the dish at the inner solar system. When Voyager 2 reached Neptune, it turned out there was only one radio receiver on Earth sensitive enough to receive the signal: the VLA. Voyager 2 was reprogrammed to make it transmit so that its signal would arrive at Earth during periods when the VLA could see it. They also reprogrammed it to use something similar to DSS which would increase its coding gain.
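For scale, a back-of-envelope flux estimate: the 30 W figure is from the comment above, but the antenna gain and distance below are my rough assumptions (a few dB either way changes little), not thread data:

```python
import math

P_tx = 30.0        # transmitter power, W (from the comment)
gain_db = 48.0     # assumed gain of Voyager's 3.7 m dish at X-band
d = 4.5e12         # roughly the Earth-Neptune distance, m

gain = 10 ** (gain_db / 10)
flux = P_tx * gain / (4 * math.pi * d**2)   # W per square meter at Earth

dish_area = math.pi * (70 / 2) ** 2         # a 70 m ground dish
print(f"flux ~{flux:.1e} W/m^2, collected ~{flux * dish_area:.1e} W")
```

On these assumptions the signal arriving at Earth is on the order of 10⁻²⁰ W per square meter, and even a 70 m dish collects only around 10⁻¹⁷ W — which is why so few receivers could hear it.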
posted by Steven C. Den Beste at 9:03 AM on November 19, 2007

I vaguely remember reading that when living cells copy DNA during cell division, they are within an order of magnitude of the theoretical limit of energy expenditure per bit.

No human-designed system is remotely that good, however.
posted by Steven C. Den Beste at 9:10 AM on November 19, 2007

SCDB: I thought Voyager 2 was still received by the Deep Space Network. It looks like they used the VLA experimentally in the 80s, though.
posted by zsazsa at 9:19 AM on November 19, 2007

ZsaZsa, now that it's away from Neptune and the amount of data it's sending is much lower per unit time, they can increase the chip-per-bit rate even further. But during the Neptune encounter, they wanted a faster data stream than that, which meant a lower chip/bit ratio and less coding gain. The Deep Space Network couldn't hack it; they had to use the VLA.
posted by Steven C. Den Beste at 9:51 AM on November 19, 2007

You're looking for the Shannon-Hartley theorem, which gives the maximum channel capacity in bits per second given the bandwidth and signal-to-noise ratio. The "S" in this equation is the signal power.
posted by Wet Spot at 10:38 AM on November 19, 2007

Note that you have to distinguish between the capacity of the channel, in bits per second, and the bandwidth of the channel, which is conventionally defined as the range of frequencies carried with no more than 3 dB of attenuation (the half-power point).
posted by Wet Spot at 10:41 AM on November 19, 2007

There are formulations of Shannon's bound on a channel's capacity (C) which are stated explicitly in terms of the bandwidth (B) and the signal-to-noise power ratio (S/N).

C = B log_2 (1 + S/N)

The capacity of a channel (the maximum rate at which it is possible to send information reliably) depends on the bandwidth of the channel, as well as the signal-to-noise power ratio at the receiver.

This bandwidth is a physical bandwidth, so to speak. It refers to the size of the range of frequencies of signals sent by the transmitter that can arrive at the receiver of the channel. In some cases, like wireless communications, this ends up being limited by licensing (the range of frequencies that one is allowed to transmit in).

The "bandwidth" that is usually quoted for modems/cell phones/etc. actually refers to the channel capacity, the amount of data that can be transferred reliably per unit of time.

Note that Shannon's theorem gives a non-constructive bound. The capacity that is actually achievable for a given bandwidth and SNR depends on what error-correction code you can construct, and how well it works. These days, however, there are some very good codes that get very close to the Shannon bound.
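Plugging numbers into that formula is straightforward (the 20 MHz / 25 dB figures below are my own example values, not from the thread):

```python
import math

def capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley bound: C = B * log2(1 + S/N), with S/N as a power ratio."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# e.g. a 20 MHz channel at 25 dB SNR
print(f"{capacity(20e6, 25) / 1e6:.1f} Mbit/s")
```

Note the capacity grows only logarithmically with signal power but linearly with bandwidth, which is exactly the energy-versus-bandwidth trade-off discussed above.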
posted by TheyCallItPeace at 10:43 AM on November 19, 2007

(Yes, when reading information theory stuff it's important to remember that the IT/tech industry misuses a number of terms from other fields, and one of those terms is "bandwidth", used in IT to mean data rate, used elsewhere to mean the width of the frequency band containing the signal. On the other hand, for a really rough estimate, you can do worse than to estimate 1 bit-per-second per Hz of bandwidth.)
posted by hattifattener at 3:12 PM on November 19, 2007

On the other hand, for a really rough estimate, you can do worse than to estimate 1 bit-per-second per Hz of bandwidth.

Depends on what you mean by rough. From Shannon's theorem that TheyCallItPeace cites above, an important factor is the signal-to-noise ratio: it determines how many discrete voltage amplitude levels can be reliably distinguished on the same wave, which is
√(1 + S/N).

For example, the voice band of plain old telephone lines is only about 3100 Hz, but the signal-to-noise ratio might be up to about 50 dB, giving a theoretical limit of about 51 kb/s. This was exploited by the V.34 33.6 kb/s modems, a bit rate that is more than 10 times the bandwidth.
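Those figures can be checked directly with the two formulas above, levels = √(1 + S/N) and C = B·log₂(1 + S/N):

```python
import math

B = 3100                # voice-band width, Hz
snr_db = 50             # quoted upper-end SNR
snr = 10 ** (snr_db / 10)

levels = math.sqrt(1 + snr)     # distinguishable amplitude levels
C = B * math.log2(1 + snr)      # channel capacity, bit/s

print(f"{levels:.0f} levels, capacity ~{C / 1000:.0f} kb/s")
```

At 50 dB that gives a few hundred distinguishable levels and a capacity in the low 50s of kb/s, so a 33.6 kb/s modem sits comfortably inside the bound.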
posted by JackFlash at 6:30 PM on November 19, 2007

I think all the information theoretic limits are based on SNR, so you also need to quantify noise to answer the question. Circuit elements (resistors, transistors, whatever) have fundamental minimum noise limits.

From Johns and Martin:
Thermal noise is due to the thermal excitation of charge carriers in a conductor. This noise has a white spectral density and is proportional to absolute temperature. It is not dependent on bias conditions (dc bias current) and it occurs in all resistors (including conductors) above absolute zero. Thus, thermal noise places fundamental limits on the dynamic range achievable in electronic circuits. [emphasis added, thermal noise aka Johnson or Nyquist noise].
It goes on to outline shot noise and flicker noise. Then, noise in resistors:
The major source of noise in resistors is thermal noise. As just discussed, it appears as white noise and can be modeled as a voltage source, V_R(f), in series with a noiseless resistor. With such an approach, the spectral density function, V²_R(f), is found to be given by
V²_R(f) = 4kTR
where k is Boltzmann's constant, T is the temperature in kelvins, and R is the resistance value.
So: that isn't a hard limit, because it varies with temperature and resistance value, but it adds some perspective. There are actually circuits that are kept very cold just to reduce thermal noise. Those are pretty extraordinary circuits, though.
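To get a concrete feel for the size of that noise, the RMS voltage from a resistor over a measurement bandwidth Δf follows from integrating that density: v = √(4kTR·Δf). The component values below are my own example, not from the quote:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def thermal_noise_vrms(R_ohms, T_kelvin, bandwidth_hz):
    """RMS Johnson-Nyquist noise voltage: sqrt(4 k T R * bandwidth)."""
    return math.sqrt(4 * k_B * T_kelvin * R_ohms * bandwidth_hz)

# a 1 kOhm resistor at room temperature over a 10 kHz bandwidth
v = thermal_noise_vrms(1e3, 300, 10e3)
print(f"~{v * 1e6:.2f} microvolts RMS")
```

Sub-microvolt for ordinary values — tiny, but it's an irreducible floor that any signal in the circuit has to stay above.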

Another interesting way to look at noise is the background electromagnetic noise in the atmosphere.
posted by Chuckles at 9:19 PM on November 19, 2007
