Radio Rights and Wireless Technical Innovation
Michael J. Marcus, Sc.D., F-IEEE
For better or worse, wireless technology is one of the most regulated technologies in our modern world. In classic wireless regulation, such as was in place in the US prior to the mid 1970s, almost all innovative technologies needed positive regulatory action before they could be used in operational systems. The ultimate goal of a radio rights regime is to stimulate capital formation for wireless technical innovation while at the same time providing enough certainty for the capital formation of incumbent spectrum users to build out and operate their systems. While these goals are not contradictory at their root, they certainly partially conflict. This conflict explains the difficulty regulators have had in devising such a regime in a way that pleases all parties involved. However, this goal is so important in our information age society and economy that it is vital to move forward towards it no matter how difficult.
A radio rights regime must consider the following technical issues:
I/S Protection at Receiver
Fortunately, the issue of how much protection a receiver needs has gotten much simpler in today's digital age. NTIA's survey study found that "(o)ne common feature was that for continuous, long-term interfering signal levels, nearly all established IPC [interference protection criteria] were based on an interference-to-noise power ratio of -6 to -10 dB".[1] So there is relatively little uncertainty for most digital signals about how much protection they need at the receiver.
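As a back-of-the-envelope illustration, the Python sketch below converts such an I/N criterion into an absolute interference power at the receiver. The 10 MHz bandwidth and 5 dB noise figure are assumed values for illustration; the -174 dBm/Hz thermal noise density is the standard figure at 290 K.

```python
import math

def noise_floor_dbm(bandwidth_hz, noise_figure_db=5.0):
    """Receiver noise floor in dBm: kTB (-174 dBm/Hz at 290 K) plus noise figure."""
    return -174.0 + 10.0 * math.log10(bandwidth_hz) + noise_figure_db

def max_interference_dbm(bandwidth_hz, i_over_n_db=-6.0, noise_figure_db=5.0):
    """Largest interference power consistent with a given I/N protection criterion."""
    return noise_floor_dbm(bandwidth_hz, noise_figure_db) + i_over_n_db

bw = 10e6  # assumed 10 MHz digital receiver
print(f"Noise floor: {noise_floor_dbm(bw):.1f} dBm")                           # -99.0 dBm
print(f"Max interference at I/N = -6 dB: {max_interference_dbm(bw):.1f} dBm")  # -105.0 dBm
```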
However, for CDMA systems such as 2G and 3G cellular, I/S translates directly into cell site capacity, as the cellular industry repeatedly reminded the FCC during the UWB[2] and Interference Temperature[3] rulemakings. This is because of the nature of CDMA, where the receiver sees multiple signals overlapping in frequency and sorts them out by processing gain.
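A single-cell textbook sketch shows why: solving Eb/N0 = (W/R)·S / ((N-1)·S + N_th + I_ext) for the number of users N, external interference I_ext subtracts directly from capacity. All parameter values below are illustrative, not taken from the rulemakings.

```python
import math

def cdma_uplink_capacity(chip_rate, bit_rate, ebno_req_db,
                         per_user_rx_w, thermal_w, ext_interf_w):
    """Users per CDMA carrier (single cell, perfect power control), from
    Eb/N0 = (W/R)*S / ((N-1)*S + N_th + I_ext) solved for N."""
    pg = chip_rate / bit_rate                 # processing gain W/R
    ebno = 10 ** (ebno_req_db / 10.0)
    n = 1 + pg / ebno - (thermal_w + ext_interf_w) / per_user_rx_w
    return max(0, math.floor(n))

# Illustrative IS-95-like numbers: 1.2288 Mcps, 9.6 kbps, 7 dB required Eb/N0
s = 10 ** (-105.0 / 10) / 1000     # -105 dBm per-user received power, in watts
n_th = 10 ** (-108.0 / 10) / 1000  # ~ -108 dBm thermal noise in 1.25 MHz
print(cdma_uplink_capacity(1.2288e6, 9600.0, 7.0, s, n_th, 0.0))   # 26 users, no interference
print(cdma_uplink_capacity(1.2288e6, 9600.0, 7.0, s, n_th, n_th))  # 25 users at I/N = 0 dB
```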
I/S Field Strength at the Antenna vice I/S Power at the Receiver
In the past, mobile antennas were omnidirectional and other antennas had fixed patterns. In such a scenario, one could reasonably treat the worst-case I/S field strength ratio at the antenna and the I/S power ratio at the receiver as the same. However, MIMO antenna technology is now well established in the commercial world and will be even more important in the future. MIMO and other adaptive antenna technologies can readily change the I/S ratio at the receiver by preferentially passing the desired signal and not the interference. Thus, for systems that either use or can reasonably be expected to use such technology, any radio rights regime will have to consider how much to budget for I/S reduction attributable to the antenna system.
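A minimal sketch of the effect, assuming an 8-element uniform linear array (all values illustrative): the array's gain difference between the desired direction and the interferer's direction shifts the I/S ratio between the antenna and the receiver.

```python
import numpy as np

def ula_gain_db(arrival_deg, steer_deg, n_elem=8, spacing_wl=0.5):
    """Array-factor gain (dB, relative to boresight) of a uniform linear
    array steered to steer_deg, for a signal arriving from arrival_deg."""
    phase = 2 * np.pi * spacing_wl * np.arange(n_elem)
    weights = np.exp(-1j * phase * np.sin(np.radians(steer_deg)))
    signal = np.exp(1j * phase * np.sin(np.radians(arrival_deg)))
    af = abs(np.dot(weights, signal)) / n_elem
    return 20 * np.log10(max(af, 1e-6))

is_at_antenna_db = 0.0                 # assume equal I and S field strengths
g_s = ula_gain_db(0.0, 0.0)            # beam steered at the desired signal
g_i = ula_gain_db(40.0, 0.0)           # interferer arrives 40 deg off boresight
print(f"I/S at receiver: {is_at_antenna_db + g_i - g_s:.1f} dB")   # about -17 dB
```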
Propagation Models
Assuming one knows what I/S ratio at the receiver antenna would be acceptable, how would one translate that into acceptable geometries and transmitter power for the new entrant? This involves projecting geometry and power with both deployment scenarios and propagation models.
Propagation would be simple if all radio waves behaved like light in a vacuum, with a monotonic, predictable decrease in field strength as path length increases. They don't always. Thus agreeing on a propagation model is a major issue. This was shown in the TV White Space proceeding,[4] where the broadcast interests kept insisting on the R-6602 model[5] that is the basis for the FCC's Grade B contours. The "66" in "R-6602" comes from 1966, the year the model was first published, before the computer age.
So realistic radio rights will depend on realistic propagation modeling - something that segments of the wireless community would like to avoid if traditional models give them a better position.
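To see how much is at stake in the choice of model, the sketch below compares free-space loss with a simple log-distance model (path-loss exponent n = 3.5 beyond a 100 m breakpoint, both assumed values) at UHF TV frequencies. Neither is R-6602, but the tens-of-dB spread between even these two simple models shows why parties fight over the model.

```python
import math

def fspl_db(d_m, f_hz):
    """Free-space path loss: 20*log10(d) + 20*log10(f) - 147.55 dB."""
    return 20 * math.log10(d_m) + 20 * math.log10(f_hz) - 147.55

def log_distance_db(d_m, f_hz, d0_m=100.0, n=3.5):
    """Free space out to breakpoint d0, then 10*n dB of loss per decade."""
    return fspl_db(d0_m, f_hz) + 10 * n * math.log10(d_m / d0_m)

f = 600e6  # UHF TV band
for d_km in (1, 10, 50):
    d = d_km * 1e3
    print(f"{d_km:3d} km: free space {fspl_db(d, f):6.1f} dB, "
          f"log-distance {log_distance_db(d, f):6.1f} dB")
```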
MCL vice Stochastic Modeling
The issue of minimum coupling loss (MCL) is also key in any radio rights formulation. In the AWS-3 proceeding, incumbent licensees argued that protection from proposed AWS-3 band TDD emitters to incumbent lower adjacent band FDD downlink mobile receivers must be based on MCL, the worst case scenario. In stochastic modeling, geometries are considered along with their probabilities, yielding a probability estimate for interference. Generally incumbents prefer MCL analysis, as it precludes any probability of interference independent of any public interest factors. New entrants, on the other hand, would like to show that interference is minimal and does not meet the "seriously degrades, obstructs, or repeatedly interrupts" part of the harmful interference definition.
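A toy comparison of the two approaches, with all numbers assumed purely for illustration (a -10 dBm EIRP emitter, a -105 dBm protection threshold, and a log-distance propagation model): MCL at the worst-case geometry predicts a large threshold exceedance, while a Monte Carlo draw over random geometries shows exceedance is rare.

```python
import math, random

def path_loss_db(d_m, f_hz=1.9e9, d0_m=10.0, n=3.5):
    """Free space to d0, then path-loss exponent n (illustrative model)."""
    fspl_d0 = 20 * math.log10(d0_m) + 20 * math.log10(f_hz) - 147.55
    return fspl_d0 + 10 * n * math.log10(max(d_m, d0_m) / d0_m)

EIRP_DBM = -10.0     # assumed low-power new-entrant emitter
THRESH_DBM = -105.0  # assumed protection level (I/N = -6 dB, 10 MHz, 5 dB NF)

# MCL: assume the worst-case 10 m separation
mcl = EIRP_DBM - path_loss_db(10.0)
print(f"MCL at 10 m: {mcl:.1f} dBm ({mcl - THRESH_DBM:+.1f} dB vs. threshold)")

# Stochastic: emitter placed uniformly over a 1 km radius disc around the receiver
random.seed(1)
trials = 100_000
exceed = sum(EIRP_DBM - path_loss_db(1000.0 * math.sqrt(random.random())) > THRESH_DBM
             for _ in range(trials))
print(f"Monte Carlo: threshold exceeded in {100 * exceed / trials:.1f}% of geometries")
```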
At present the FCC does not have a clear policy on when MCL or stochastic models should be used. This policy vacuum was a definite factor in prolonging the AWS-3 deliberations.
It appears that NTIA insists on MCL for all "safety services", although it is unclear whether NTIA considers all federal government spectrum use to be a safety service.
Minimum Protection Distance
A close relative of the MCL issue is the question of minimum protection distance: how physically close a new entrant might be to an incumbent's receiver. Since propagated radio signal strength is often proportional to 1/d^n, where d is the distance from transmitter to receiver and 2 < n < 4, simple math shows that as d → 0 the received power becomes infinite! In the real world there are either minimum physically possible distances or minimum distances beneath which a user is causing interference only to himself.
In its landmark 1979 decision on regulation of unintentional emissions from PCs and other "digital devices", the Commission stated that "we are assuming that the home computing device is at least 10 meters from the receiver. The separation distance is a basic parameter in computing tolerable levels of signal that may be radiated by a computer." It then picked an emission level that would not cause interference to TV receivers at a 10m distance, even though industry recommended a 30m minimum protection distance.[6] In the UWB case, FCC limits were based on an assumption of a 2m minimum separation distance between portable UWB transmitters and GPS receivers[7] and 1.8m between such transmitters and PCS receivers.[8]
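Under a free-space assumption the field strength falls off as 1/d, so a field measured at a reference distance extrapolates directly to a minimum protection distance. The numbers below are hypothetical, chosen only to show the arithmetic, not taken from the rules.

```python
def min_protection_distance_m(e0_uvpm, d0_m, protect_uvpm):
    """Distance at which a 1/d free-space field falls to the protection level:
    E(d) = E0 * d0 / d, solved for d."""
    return d0_m * e0_uvpm / protect_uvpm

# Hypothetical: emitter measured at 500 uV/m at 3 m; victim receiver assumed
# to need fields below 150 uV/m
print(f"{min_protection_distance_m(500.0, 3.0, 150.0):.0f} m")  # 10 m
```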
Acceptable Interference Statistics
What does the harmful interference definition mean with respect to interference that "seriously degrades, obstructs, or repeatedly interrupts"? This has rarely come up in FCC deliberations, but was a key issue in the MVDDS/Northpoint proceeding. DBS satellite systems have natural outages that result from excessive desired-path loss during heavy rain. For typical home antennas in the Washington DC area, this comes to about 120 minutes/year. The DBS operators argued that any increase in outages over this naturally occurring level would be harmful interference. The Commission ultimately decided that de minimis increases would not be harmful and based its technical rules on an objective of increasing rain-related outages by no more than 10%[9] over the naturally occurring outages, or more precisely the outages predicted by a standard ITU-R model.[10]
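The arithmetic of the 10% criterion is straightforward; the sketch below uses the 120 minutes/year figure cited above.

```python
MIN_PER_YEAR = 365.25 * 24 * 60           # about 525,960 minutes

baseline_min = 120.0                      # typical DC-area DBS rain outage
with_mvdds_min = baseline_min * 1.10      # rule: at most a 10% increase

for label, outage in (("baseline", baseline_min), ("with MVDDS", with_mvdds_min)):
    availability = 100.0 * (1.0 - outage / MIN_PER_YEAR)
    print(f"{label:12s}: {outage:5.1f} min/yr, availability {availability:.4f}%")
```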
While the Commission tried hard to limit this outage increase precedent to only the MVDDS issue at hand, this is a key point in radio rights. Is interference caused by a new entrant that is de minimis compared with naturally occurring outages in space or time really "harmful"? While most incumbents want to believe that they have perfect coverage within their nominal service area, in most cases they don't, due to factors such as multipath propagation, terrain shielding, and the technical limits of real receivers.
[1] NTIA Report 05-432, p. ii
[2] ET Docket 98-153
[3] ET Docket No. 03-237
[4] ET Docket 04-186
[5] J. Damelin, W. Daniel, H. Fine and G. Waldo, Development of VHF and UHF Propagation Curves for TV and FM Broadcasting, FCC, Office of Chief Engineer, Research Div., Report No. R-6602, September 1966, http://www.fcc.gov/oet/info/documents/reports/R-6602.pdf
[6] Report and Order, Docket 20780, 79 F.C.C.2d 28 (Sept. 18, 1979), at para. 53
[7] UWB R&O at para. 107
[8] ibid., at para. 162
[9] Memorandum Opinion and Order and 2nd Report and Order, Docket 98-206, April 11, 2002 at p. 29
[10] In reality, heavy rain statistics vary greatly from year to year. Also, the ITU-R model only has data for grid points about 60 miles apart and uses linear interpolation between data points. As a result, actual rain rates and the resulting satellite outages during a given year at a given place may vary widely from the predicted data, and the real impact of a 10% increase would be impossible to differentiate from the base case.