Integration of small cells into the “macro” network could be more effective than even Wi-Fi offload in solving the problem of data traffic outstripping increases in available spectrum.
Just about everyone in the wireless industry is predicting continued explosive growth in traffic on wireless data networks. It is reasonable to expect that in the future the struggle to keep up will play out mainly in 4G LTE networks, the first of which are just now beginning to see significant loading.
I have been using a 4G smartphone for a couple of months now, and for the most part performance on the Verizon LTE network has been great, with download speeds reliably testing (mostly in and around Seattle) at 6 Mbps or higher. However, in New Orleans during the recent CTIA Wireless convention my speeds were often much lower—sometimes dipping below 1 Mbps. Of course, New Orleans was for those few days inundated with industry types toting and using the latest 4G gadgets, so it is reasonable to expect that the local Verizon network would be strained. On the other hand, it’s also reasonable to assume that Verizon anticipated this and went to some trouble to tune the network for peak performance.
I suspect that my New Orleans experience presages the future of LTE development and performance. Network refinement, in terms of stuff like tweaking antenna orientations and power levels, will likely not keep up with growth in peak traffic. Anticipated increases in available spectrum will help, but not enough over the long run. Instead, we are going to have to rely on evolutionary advances in technology, particularly in the LTE air interface, to allow more efficient use of the spectrum we have. Indeed, the evolution of basic LTE radio access network (RAN) standards has been paralleled by hopeful development of a number of ideas that, at least in computer simulations, appear to hold a good deal of promise. Unfortunately, some of these, like MIMO antenna systems and adaptive antenna beamforming, will probably be much less effective in the real world. But one relatively new idea, so-called “heterogeneous networks” or “HetNets,” appears to offer substantial hope for real increases in spectrum efficiency. And it’s not just me who thinks so: most of the major infrastructure vendors at the CTIA show were positively gushing about their HetNet developments. Also, 3GPP, the international body responsible for LTE standards, has an active working group looking into HetNet development.
So what exactly is a HetNet? Various industry insiders might provide somewhat different answers, but essentially the idea is a wireless network composed of both large “macro” cells that provide generally ubiquitous regional coverage, and much smaller cells that provide focused capacity in confined areas of higher traffic density. By some definitions, the small cells could be served by a different air interface technology, most commonly Wi-Fi, but most current HetNet development activity seems to be focused on the use of LTE microcells, picocells, and femtocells.
On the surface, the idea of HetNets may not seem all that innovative. After all, most 4G smartphones provide Internet connectivity via Wi-Fi networks as well as 4G, and microcells, picocells, and even femtocells have been used for years in 2G and 3G networks. What is new in the latest HetNet developments is true integration of the much smaller cells into the “macro” network. This integration serves two important purposes. First, from the user’s perspective it provides seamless operation, including transparent delivery of features and, in the case of VoIP and VoLTE calls, interruption-free handoffs between large and small cells. More critically, optimized integration will ideally allow the small cells to absorb substantial traffic loads without diminishing the capacity of the macro network.
For some time, network operators have assumed, or at least hoped, that Wi-Fi “hotspots” would provide capacity relief in areas of high traffic density like airport departure lounges, hotels, and shopping malls. Unfortunately, 802.11-based Wi-Fi systems, which operate on limited bands of unlicensed spectrum, are often more severely loaded than the 3G and 4G networks they are meant to relieve. I have generally found that the public Wi-Fi service available in hotels and airport terminals provides much lower speeds than available 4G and even 3G networks. Adding more Wi-Fi access points in the same area doesn’t help, because the limiting factor is usually mutual interference. AT&T might put a hotspot for exclusive use of its 3G customers in Terminal B at O’Hare, but that access point and its users will have to compete for spectrum with the other ten or so operating in the same general area.
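To make the shared-spectrum point concrete, here is a deliberately crude airtime-sharing sketch in Python. It is not a real 802.11 model; the channel rate and the contention penalty are made-up numbers chosen only to illustrate why stacking more co-channel access points splits the same pie into smaller pieces rather than enlarging it.

```python
# Crude, illustrative model of co-channel Wi-Fi contention (not a real
# 802.11 simulation): every access point on the channel shares one airtime
# "pie," and contention overhead grows with the number of contending nodes.
# All numbers are assumptions chosen only to make the qualitative point.

def aggregate_throughput_mbps(num_aps, users_per_ap,
                              raw_channel_mbps=50.0,
                              contention_loss_per_node=0.01):
    """Very rough airtime-sharing estimate for one shared Wi-Fi channel."""
    nodes = num_aps * users_per_ap
    # Channel efficiency falls as more stations contend for the same airtime.
    efficiency = max(0.2, 1.0 - contention_loss_per_node * nodes)
    return raw_channel_mbps * efficiency

for aps in (1, 4, 10):
    total = aggregate_throughput_mbps(aps, users_per_ap=20)
    per_user = total / (aps * 20)
    print(f"{aps:2d} co-channel APs: ~{total:4.1f} Mbps aggregate, "
          f"~{per_user:4.2f} Mbps per active user")
```

The exact numbers are meaningless; the shape of the result is the point: aggregate throughput on the shared channel stays roughly flat while per-user throughput collapses.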
With Wi-Fi proving not to be a totally reliable answer to congestion relief, 3GPP and network operators are putting more focus on HetNets that incorporate LTE microcells, picocells, and femtocells. Microcells typically provide capacity similar to that of a macrocell but transmit at much lower power levels. They may serve outdoor areas like a sports stadium, or they may cover an office building or hotel through the use of a distributed antenna system (DAS). Picocells and femtocells generally provide limited capacity with very low transmit power, and are intended to serve much smaller, usually indoor, areas.
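The power difference is what makes the coverage difference. The following back-of-the-envelope sketch uses a simple log-distance path-loss model with assumed transmit powers and an assumed link budget (none of these figures come from a standard or from any particular vendor) just to show how a 20-plus dB drop in transmit power translates into a far smaller footprint.

```python
# A minimal sketch of why lower transmit power means a much smaller cell.
# It uses a simple log-distance path-loss model; the transmit powers,
# path-loss exponent, and receiver sensitivity are illustrative assumptions,
# not values taken from the article or from any standard.

def cell_radius_m(tx_power_dbm, rx_sensitivity_dbm=-100.0,
                  pl_at_1m_db=38.0, path_loss_exponent=3.5):
    """Distance at which received power falls to the receiver sensitivity."""
    max_path_loss_db = tx_power_dbm - rx_sensitivity_dbm
    return 10 ** ((max_path_loss_db - pl_at_1m_db) / (10 * path_loss_exponent))

for name, dbm in [("macrocell", 43), ("microcell", 33),
                  ("picocell", 24), ("femtocell", 20)]:
    print(f"{name:10s} at {dbm} dBm -> rough outdoor radius ~{cell_radius_m(dbm):5.0f} m")
```

In practice, building penetration losses and low antenna heights shrink pico and femto coverage much further than this simple outdoor-style model suggests.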
At the service level, HetNets present some interesting challenges such as macrocell-to-picocell handoffs. But what will most likely determine the ultimate effectiveness of HetNets is their impact on overall network spectrum efficiency. If RF interference from small cells significantly reduces the capacity of overlying macrocells, very little will be gained. In fact, it’s generally recognized that development of interference management tools is at the heart of LTE HetNet evolution. So much so that the current leading contender for HetNet interference management, Enhanced Inter-Cell Interference Coordination (EICIC), is the subject of numerous scholarly studies and papers.
The basic concept behind EICIC is actually pretty simple, but to understand how it works (or perhaps why it may not work all that well), it’s useful to trace the development of interference management strategies for LTE as a whole. LTE’s downlink channel, where capacity is most constrained, is structured mainly as some number of OFDM subcarrier groups. Each group contains 12 subcarriers, and the number of groups in a network depends on the available spectrum bandwidth. Way back when LTE was first being developed, it was thought that each cell in a network could transmit on all available subcarriers without coordination; in other words, a frequency reuse factor of one. I have no idea why anybody really believed that this would work, since co-frequency interference would obviously be horrendous. More recently came the strategy of Inter-Cell Interference Coordination (ICIC), in which more-or-less conventional frequency planning would be used, but only for user devices operating at the “cell edge.” With HetNets, it is recognized that a small cell could be located anywhere within the coverage of the overlying macrocell. To avoid mutual interference, a system of time-domain allocations is added. (That’s the “enhanced” part of EICIC.) The macrocell and small cell may transmit on the same subcarrier groups, but not simultaneously.
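Here is a minimal sketch of that time-sharing idea, in the spirit of the “almost blank subframe” (ABS) approach 3GPP has attached to EICIC: the macrocell keeps quiet in a configured subset of the 1-ms subframes that make up each 10-ms LTE radio frame, and the small cell uses exactly those subframes to serve the users that would otherwise be drowned out. The pattern and the scheduling rule below are illustrative, not lifted from any specification.

```python
# A minimal sketch of the time-domain carve-out behind EICIC, in the spirit
# of "almost blank subframes" (ABS). The macro skips user data in a configured
# subset of subframes, and the pico serves its most interference-prone
# ("victim") users in exactly those subframes. The pattern and scheduling
# rule are illustrative; real ABS patterns are coordinated between base
# stations, not hard-coded like this.

SUBFRAMES_PER_FRAME = 10                       # one LTE radio frame = ten 1-ms subframes
ABS_PATTERN = [0, 0, 1, 0, 0, 0, 0, 1, 0, 0]   # 1 = macro stays (almost) silent

def macro_may_transmit(subframe):
    """Macro sends user data only in non-ABS subframes."""
    return ABS_PATTERN[subframe % SUBFRAMES_PER_FRAME] == 0

def pico_schedule(subframe, victim_users, other_users):
    """Serve victim users while the macro is muted; otherwise serve the rest."""
    if not macro_may_transmit(subframe):
        # Macro is quiet: protect the users who suffer most from macro interference.
        return victim_users if victim_users else other_users
    return other_users

for sf in range(SUBFRAMES_PER_FRAME):
    served = pico_schedule(sf, victim_users=["UE-7"], other_users=["UE-2", "UE-3"])
    state = "TX   " if macro_may_transmit(sf) else "muted"
    print(f"subframe {sf}: macro {state}; pico serves {served}")
```

Even in the real scheme the macro still transmits reference signals and some control information during its “blank” subframes, so the protection is imperfect.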
Obviously, if you carve out time-domain portions of the subcarrier groups used by a macrocell, its capacity will be reduced. The hope is that the carved-out time can be used by multiple isolated small cells within the macrocell’s coverage, thus increasing overall network capacity.
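The trade is easy to put into numbers. The sketch below uses illustrative capacity figures (placeholders, not measurements) to compute what the network gains or loses when the macrocell mutes a given fraction of its subframes and some number of mutually isolated small cells reuse that time.

```python
# A back-of-the-envelope version of the capacity trade described above.
# Assumptions: the macrocell gives up `abs_fraction` of its subframes, and
# each of `n_small_cells` small cells is isolated enough from the others to
# reuse that muted time independently. All capacity figures are placeholders.

def net_capacity_mbps(macro_mbps, small_mbps, n_small_cells, abs_fraction):
    macro_part = macro_mbps * (1.0 - abs_fraction)           # macro loses the muted time
    small_part = small_mbps * abs_fraction * n_small_cells   # small cells reuse it
    return macro_part + small_part

baseline = net_capacity_mbps(75, 0, 0, 0.0)   # macro operating alone
for n in (1, 3, 6):
    total = net_capacity_mbps(75, 50, n, abs_fraction=0.3)
    print(f"{n} isolated small cells: ~{total:5.1f} Mbps vs. {baseline:.1f} Mbps macro-only")
```

With these particular placeholder numbers, a single small cell actually costs the network capacity; the arithmetic only turns positive when several well-isolated small cells can reuse the muted time, which is exactly why interference management is the crux.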
I have substantial doubts about the effectiveness of using path-loss partitioning as the basis for interference management as called for in ICIC. (See “The Myth of the ‘Cell Edge’,” Wireless Week, Dec. 3, 2011.) I suspect that the time-domain component added in EICIC for HetNets will work better, but overall the interference management solutions heretofore developed for LTE are, in my opinion, surprisingly weak. Unfortunately, that weakness could substantially limit the capacity benefits that might otherwise derive from HetNet architectures.
So, will HetNets be the answer to the problem of growth in traffic outstripping increases in available spectrum? Right now I’d say they represent the best hope among available technologies. But a lot will depend upon how much effort is put into developing enabling interference management techniques. In that regard, we’ve got a ways to go.