LTE Unlicensed – Operator Spectrum Grab?

I recently attended a FierceLive! webinar presented by Qualcomm on LTE Unlicensed (LTE-U), also known as License-Assisted Access (LAA) LTE.  For me, this presentation raised some troubling questions.

First, some background. There is no 3GPP standard for LAA as of today. Maximum TX power of 30 dBm in the U-NII-1 band (5.15 to 5.25 GHz), the U-NII-2 band (5.25 to 5.35 GHz) and the U-NII-2e band (5.470 to 5.725 GHz), and of 36 dBm in the U-NII-3 / ISM band (5.725 to 5.825 GHz), implies that LTE-U is best suited for eNodeB small cells, not for higher-TX-power eNB macrocell base stations. Industry proposals for LTE-U involve carrier aggregation schemes in which one or more licensed carriers are aggregated with one or more unlicensed (5 GHz) carriers (or a Supplemental Downlink), with all control channels on the licensed LTE carrier (hence the term “license assisted”). Under these proposals, operators would deploy their own dual-connectivity LTE-U “hot spots” (small cell eNB access points), with the equivalent of RAN infrastructure sharing prohibited. Wi-Fi proponents globally are concerned about the impact of LAA, primarily because Wi-Fi uses a polite “Listen-Before-Talk” (LBT) clear channel assessment (CCA) scheme, while LTE as currently specified transmits on a schedule without first checking whether the channel is busy.
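To make the “license assisted” architecture concrete, here is a minimal Python sketch of the proposed carrier-aggregation split; the band names and center frequencies are hypothetical examples, not from the webinar:

```python
# Toy model of “license-assisted” carrier aggregation: all control
# channels ride on the licensed primary carrier (PCell), while the
# unlicensed 5 GHz secondary carrier (SCell) carries user data only.
# Band names and frequencies below are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Carrier:
    band: str
    center_mhz: float
    licensed: bool
    carries_control: bool

pcell = Carrier("AWS (Band 4)", 2115.0, licensed=True, carries_control=True)
scell = Carrier("U-NII-1", 5200.0, licensed=False, carries_control=False)

def validate_laa(pcell: Carrier, scell: Carrier) -> None:
    """Check the defining LAA constraint: control stays on licensed spectrum."""
    assert pcell.licensed and pcell.carries_control, "PCell must be the licensed anchor"
    assert not scell.licensed and not scell.carries_control, "SCell is data-only"

validate_laa(pcell, scell)
print(f"PCell on {pcell.band} anchors control; "
      f"SCell on {scell.band} adds unlicensed downlink capacity")
```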

Qualcomm spent much of their presentation time on how their proposal for LTE-U will be a good neighbor to Wi-Fi services, using a coexistence protocol called Carrier Sensing Adaptive Transmission (CSAT). In this scheme, LTE eNodeB base stations (typically small cells) share 5 GHz spectrum with Wi-Fi on a time-division-multiplexed schedule, based on the eNB’s dynamic assessment of available bandwidth slices and Wi-Fi traffic density. They claimed that CSAT makes LTE-U “a better neighbor to Wi-Fi than Wi-Fi”, a claim that is hard to swallow. Qualcomm showed video of an elaborate streaming demo to underscore their Wi-Fi coexistence claims. In fairness, the presenters acknowledged that the demo was downlink only, implying a highly asymmetric network architecture that is not very practical in the real world.

In response to a question, the Qualcomm presenters also admitted that in countries (like the US) where Dynamic Frequency Selection (DFS) is required in the U-NII-2 and U-NII-2e bands (5.25 to 5.35 GHz and 5.470 to 5.725 GHz), LTE-U would be constrained to operate in the U-NII-1 and U-NII-3 bands only. This huge constraint flies in the face of Qualcomm’s claim (found in the footnotes of several of the slides) that 500 MHz is available for LTE-U in the 5 GHz band. U-NII-1 and U-NII-3 together provide only 200 MHz of spectrum, guys!
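The spectrum arithmetic is easy to check. The short Python tally below uses the band edges and power limits quoted earlier in this post:

```python
# Tally of the 5 GHz U-NII bands, using the band edges and power
# limits quoted above. DFS flags per the US rules discussed here.
UNII_BANDS = {
    # name: (low GHz, high GHz, max TX dBm, DFS required?)
    "U-NII-1":  (5.150, 5.250, 30, False),
    "U-NII-2":  (5.250, 5.350, 30, True),
    "U-NII-2e": (5.470, 5.725, 30, True),
    "U-NII-3":  (5.725, 5.825, 36, False),
}

total_mhz = 0.0
no_dfs_mhz = 0.0
for name, (lo, hi, pwr, dfs) in UNII_BANDS.items():
    bw = (hi - lo) * 1000.0  # band span in MHz
    total_mhz += bw
    if not dfs:
        no_dfs_mhz += bw
    print(f"{name}: {bw:.0f} MHz, max TX {pwr} dBm, "
          f"DFS {'required' if dfs else 'not required'}")

print(f"Total 5 GHz unlicensed spectrum: {total_mhz:.0f} MHz")
print(f"Usable without DFS support:      {no_dfs_mhz:.0f} MHz")
```

Roughly 555 MHz in total, but only 200 MHz once the DFS bands are excluded.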

From a technical perspective, how can LTE-U, using 20 MHz channels and 64-QAM, provide higher-bandwidth services than 802.11ac, which uses 160 MHz channels and 256-QAM (Broadcom recently introduced 1024-QAM)? Please explain, Qualcomm. Also, LTE-U appears to complicate the already extremely complicated LTE RF front-end design on the UE side, which would have to discriminate between 5 GHz LTE RX signals (routed to the LTE radio) and 5 GHz 802.11ac RX signals (routed to the Wi-Fi radio). Tell us this ain’t so, Qualcomm.
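For the bandwidth question, a back-of-the-envelope peak-rate comparison is instructive. The sketch below compares one 802.11ac spatial stream at VHT MCS 9 against one LTE spatial layer, ignoring MAC overhead, MIMO, and carrier aggregation on both sides; the effective LTE coding rate of ~0.75 at peak is an approximation:

```python
# Rough peak PHY-rate comparison, one spatial stream/layer each,
# no MAC overheads, no MIMO, no carrier aggregation.

# 802.11ac, 160 MHz, VHT MCS 9: 468 data subcarriers, 256-QAM
# (8 bits/subcarrier), rate-5/6 coding, 3.6 us symbols (short GI).
wifi_bps = 468 * 8 * (5 / 6) / 3.6e-6        # ~866.7 Mb/s per stream

# LTE, 20 MHz: 100 RBs x 12 subcarriers x 14 symbols per 1 ms subframe,
# 64-QAM (6 bits/RE), ~0.75 effective coding rate at peak (approximation).
lte_bps = 100 * 12 * 14 * 1000 * 6 * 0.75    # ~75.6 Mb/s per layer

print(f"802.11ac (160 MHz, 256-QAM): {wifi_bps / 1e6:.0f} Mb/s per stream")
print(f"LTE-U    (20 MHz,  64-QAM):  {lte_bps / 1e6:.0f} Mb/s per layer")
print(f"Ratio: ~{wifi_bps / lte_bps:.0f}x in favor of 802.11ac")
```

On these assumptions, a single 802.11ac stream carries roughly 11 times the peak rate of a single LTE layer, which is the point of the question above.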

What I find most concerning is Qualcomm’s claim of significant operator support for LTE-U using CSAT. On the one hand, operators who have spent tens of billions of dollars on licensed spectrum can be expected to strongly support a new standard that protects their investment in LTE spectrum and infrastructure. On the other hand, what party represents the public’s interest in ubiquitous Wi-Fi services? If LTE-U is adopted by 3GPP in Release 13, who will protect consumers if LTE-U has unintended consequences for legacy Wi-Fi services? I am concerned that the operator profit motive could lead to a classic “tragedy of the commons” outcome for Wi-Fi in the future.

We will be publishing a research report in March that will address these and other important issues of LTE and Wi-Fi coexistence. More information at Wi-Fi Research.