We’ve been talking at the office about wireless devices receiving video. I’ve been wondering: will cell phone data services keep getting faster? Or is 4G actually running up against the physical limits of the spectrum?
As it turns out, the Shannon–Hartley capacity limit for these kinds of channels works out to roughly 6 bits/second per Hz at realistic signal-to-noise ratios. 4G radios are already approaching 5 bps/Hz, so technologically, at least, we’re almost out of room. Given the frequency space allotted for wireless data, that caps a 4G link at about 50 Mbps. The only real improvement we can make is to repartition the spectrum.
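To make the ceiling concrete, here’s a minimal Python sketch of the Shannon–Hartley theorem, C = B·log2(1 + S/N). The 10 MHz channel width is only an illustrative assumption; actual allotments vary by carrier and band.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# At 20 dB SNR the ceiling is log2(1 + 100), about 6.66 bps/Hz:
# the "roughly 6 bits/second per Hz" figure above.
print(shannon_capacity_bps(1, 20))

# A 4G radio realizing ~5 bps/Hz over an assumed 10 MHz channel gives
# the ~50 Mbps link speed quoted above.
print(5 * 10e6 / 1e6, "Mbps")
```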
If we allocated all the frequency space currently used for broadcast television to wireless data and got, say, 20 dB signal-to-noise, we could push about 1.2 Gbps over the air, shared between all devices talking to that tower. That’s only a tad more than my desktop’s wired gigabit connection to the home network. On top of that, we can expect to lose a lot more to multipath, interference, and zone overlap.
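Plugging the TV band into the same formula, and assuming roughly 180 MHz of reclaimable broadcast spectrum (the exact figure depends on which bands you count):

```python
import math

tv_bandwidth_hz = 180e6  # assumed reclaimable TV spectrum; varies by region
snr_db = 20
capacity_bps = tv_bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))
print(capacity_bps / 1e9, "Gbps")  # ~1.2 Gbps, shared across the whole tower
```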
Terrestrial TV usually operates on the scale of tens of miles, but returning a transmission from a handheld device at that range would be basically infeasible. If we shrank the zones to, say, one square mile, and one in a hundred people were using a phone at any given time, then at San Francisco’s density of roughly 16,000 people per square mile that’s about 160 active users per zone, or about 7.4 Mbps per user.
The best solution I can think of is to shrink the zones further. At 250×250 meters, you can push about 300 Mbps per user. The other possibility, at the cost of latency, is relying on extremely low-power mesh networks.
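Here’s the per-user arithmetic behind both the one-square-mile and 250-meter figures. The 16,200 people per square mile and 1% concurrency are the rough San Francisco assumptions from above, not measured values.

```python
def per_user_mbps(cell_area_sq_mi,
                  tower_bps=1.2e9,           # whole-tower capacity from above
                  density_per_sq_mi=16_200,  # rough SF population density
                  concurrency=0.01):         # 1 in 100 people active at once
    """Split one zone's capacity evenly among its active users."""
    active_users = cell_area_sq_mi * density_per_sq_mi * concurrency
    return tower_bps / active_users / 1e6

print(per_user_mbps(1.0))                  # ~7.4 Mbps in one-square-mile zones
print(per_user_mbps((0.25 / 1.609) ** 2))  # ~300 Mbps in 250x250 m zones
```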
The funny thing is that we’re used to transmission technology always getting better: radio, black-and-white TV, color TV, HDTV. For the first time, we’re hitting fundamental physical limits.
But don’t worry. 300 Mbps is enough for anyone. :)
Do you think television waves would be too big to operate in such small zones, e.g. would some data be lost?