Managing channel quality and individual
user throughput expectations are the biggest hurdles.
While enterprises in almost every other business sector are tightening their belts, major wireless carriers are planning huge expenditures on 4G network deployments. It is hard not to be excited that our industry is providing much-needed corporate enthusiasm and capital investment when both are in such short supply. But it is important that we not let optimism for a brilliant future fueled by 4G distract us from recognizing that getting there will involve significant technical challenges.
In my opinion, the two biggest hurdles that will be faced in delivering on the promise of 4G are managing channel quality and individual user throughput expectations. A brief look at these issues may provide some indication of the difficulties likely to be faced as 4G networks evolve.
There is a lot of talk about how OFDM will provide very high broadband speeds on 4G wireless networks, but the truth is that the data throughput rate on a channel of given RF bandwidth is limited by channel quality, regardless of channel structure and coding. In urban areas where most of us will be using 4G services, channel quality is generally determined by levels of interference from other users of the same RF channel. As the channel is used more intensively within a given geographic area, interference levels rise. Indeed, managing mutual interference among users within a wireless network is the fundamental task in network design and optimization.
For some perspective on the interference situation, we can look at the case for wireless voice services in dense urban areas. In such environments, today’s GSM and CDMA networks typically operate at effective carrier-to-interference ratios (“C/I”) of perhaps 7 to 10 dB. That means that at the receiver, the desired signal is 5 to 10 times stronger than the aggregate strength of all interfering signals within the radio channel being received (taking into account processing gain in the case of CDMA).
At C/I = 10 dB, the maximum theoretical error-free data throughput on an RF channel is about 3.5 bits per second per Hz. That means that a 5 MHz RF channel could theoretically deliver a maximum throughput of about 17.5 Mbps. Unfortunately, in the real world, radio systems never come close to achieving the theoretical maximum throughput performance. Imperfections in transmitter and receiver hardware, and in channel coding, typically extract a penalty equivalent to about a 5 dB rise in interference. At a C/I of 5 dB (rather than 10 dB), the theoretical maximum throughput rate on a 5 MHz channel would be about 10 Mbps. Take away perhaps 30 percent from that number to account for data overheads such as channel management, and the actual traffic throughput on the channel may look more like about 7 Mbps.
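To make that arithmetic concrete, the short Python sketch below reproduces these back-of-the-envelope figures using the Shannon capacity formula C = B x log2(1 + C/I). The 5 dB implementation penalty and the 30 percent overhead factor are the rough assumptions described above, not measured values, so treat the output as an order-of-magnitude estimate.

```python
from math import log2

def shannon_capacity_mbps(bandwidth_mhz, c_over_i_db):
    """Shannon limit on error-free throughput for a channel of the given
    RF bandwidth at the given carrier-to-interference ratio (C/I)."""
    c_over_i_linear = 10 ** (c_over_i_db / 10)      # convert dB to a linear ratio
    bits_per_sec_per_hz = log2(1 + c_over_i_linear) # spectral efficiency
    return bandwidth_mhz * bits_per_sec_per_hz      # MHz x b/s/Hz = Mbps

ideal = shannon_capacity_mbps(5, 10)   # ~17.3 Mbps at C/I = 10 dB (about 3.5 b/s/Hz)
degraded = shannon_capacity_mbps(5, 5) # ~10.3 Mbps after a ~5 dB implementation penalty
traffic = degraded * 0.7               # ~7.2 Mbps after ~30% channel-management overhead

print(f"Theoretical limit at 10 dB C/I: {ideal:.1f} Mbps")
print(f"After ~5 dB implementation penalty: {degraded:.1f} Mbps")
print(f"After ~30% protocol overhead: {traffic:.1f} Mbps")
```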
Expectations for 4G throughput are much higher, and they rest on technologies such as multiple-input/multiple-output (MIMO) antennas, which effectively allow more simultaneous use of the same RF channel in close proximity without a corresponding increase in mutual interference. The problem is that nobody really knows how well 4G enabling technologies such as MIMO will work in the real world of extremely intensive, high-density wireless mobile networks. Most likely, achieving anticipated 4G channel performance will take years of refinement in interference management techniques.
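The sketch below illustrates why spatial multiplexing raises expectations so dramatically. It uses a deliberately idealized rule of thumb in which capacity scales with the smaller of the transmit and receive antenna counts; the antenna configurations are illustrative, and real networks fall well short of this scaling once mutual interference rises.

```python
from math import log2

def ideal_mimo_capacity_mbps(bandwidth_mhz, c_over_i_db, n_tx, n_rx):
    """Idealized spatial-multiplexing rule of thumb: multiply the single-antenna
    Shannon limit by the number of parallel streams, min(n_tx, n_rx).
    Assumes rich scattering and no added interference -- a best case only."""
    streams = min(n_tx, n_rx)
    c_over_i_linear = 10 ** (c_over_i_db / 10)
    return streams * bandwidth_mhz * log2(1 + c_over_i_linear)

# 5 MHz channel at C/I = 10 dB
print(ideal_mimo_capacity_mbps(5, 10, 1, 1))  # ~17 Mbps with a single antenna
print(ideal_mimo_capacity_mbps(5, 10, 2, 2))  # ~35 Mbps in the ideal 2x2 case
print(ideal_mimo_capacity_mbps(5, 10, 4, 4))  # ~69 Mbps in the ideal 4x4 case
```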
The second key challenge for 4G is related to the fact that a wireless data channel is a shared resource. Whatever throughput it delivers has to be shared by all simultaneous users of that channel. This fact is often glossed over in discussions of spectacular 4G bandwidths, but in my opinion it is really the elephant in the room when it comes to long-term prospects for 4G.
A major problem in distinguishing between channel and individual throughput rates is that typical usage patterns for Internet access have changed dramatically in the past few years and are still evolving rapidly. Not long ago, the most popular Internet applications (in terms of total demand) were Web surfing and e-mail. High bandwidth certainly enhances the user experience for these sorts of activities, but on average, the throughput they demand is quite modest. This characteristic of high peak but moderate average user throughput demand is ideal for shared channels because it allows substantial numbers of simultaneous users to be served with satisfactory perceived speeds.
Unfortunately, the recent trend in "typical" Internet usage is toward applications for which high average throughput is either essential (e.g., streaming video) or directly impacts perception of performance (e.g., peer-to-peer file sharing). Indeed, by some accounts, such high-average-bandwidth applications now account for well over half of all Internet traffic and could easily reach 80 percent in the next few years. If 4G network usage follows the same trend, the only way to deliver satisfactory performance will be to dramatically lower the number of users per channel, which will send costs skyrocketing and create a voracious appetite for spectrum in urban areas. Unless average per-user throughput is somehow constrained, any business strategy that positions 4G simply as an alternative to wire- and fiber-borne broadband services very likely will fail.
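To see how quickly average per-user demand eats into a shared channel, consider the roughly 7 Mbps of usable traffic estimated earlier for a 5 MHz channel. The per-user figures in the sketch below are illustrative assumptions, not measurements, but they show the order-of-magnitude gap between bursty browsing and sustained streaming.

```python
def simultaneous_users(channel_mbps, avg_per_user_mbps):
    """Rough count of users a shared channel can carry at a given average demand."""
    return int(channel_mbps // avg_per_user_mbps)

usable = 7.0  # Mbps of usable traffic on the 5 MHz channel estimated above

# Illustrative average demands (assumptions, not measurements):
web_email_avg = 0.05    # bursty browsing and e-mail averaging ~50 kbps per user
video_stream_avg = 2.0  # one sustained standard-definition video stream

print(simultaneous_users(usable, web_email_avg))     # ~140 browsing/e-mail users
print(simultaneous_users(usable, video_stream_avg))  # only ~3 video viewers
```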
Perfecting interference management technologies and implementing user-friendly ways to limit average throughput demand won’t be sufficient to assure success for 4G, but they will almost certainly be necessary. If network operators aren’t already addressing these challenges, they should be.
Drucker is president of Drucker Associates.
He may be contacted at edrucker@drucker-associates.com.