
SECTION I

INTRODUCTION

As economies and societies have become more mobile and information centric, there has been an explosion in mobile radio use, and wireless infrastructure has become a key national infrastructure along with the more traditional wired communications infrastructure and the power, transportation, and water grids. Spectrum availability, along with infrastructure investment and efficient technology, is key to providing the capacity needed for this wireless infrastructure. However, measurements of spectrum use consistently show that actual spectrum utilization is low when averaged over space and time. There are many reasons for this, including the impact of terrain, spatial nonhomogeneity of demand, and the need to set aside spectrum for military and safety-related uses based on peak demand rather than average demand.

Wireless technology is much more heavily regulated at both the national and international levels than other technologies in IEEE's fields of interest. The detailed regulation of wireless technology and radio spectrum access (RSA) techniques means that wireless technical innovators should view national and international regulatory requirements as being just as limiting in the near term as Maxwell's equations, in that new technologies will not achieve practical use unless they are consistent with regulatory requirements. However, in contrast to the laws of physics, regulatory requirements do evolve over time, although with a much longer time constant than that associated with technical evolution, e.g., “Internet speed.” For example, major regulatory changes have implementation times in the 3–10 year range depending on their complexity, if they can be implemented at all.

When Marconi built his first outdoor transmitter in 1895 there was neither spectrum regulation nor interference. However, with the construction of the second transmitter–receiver pair, radio interference became an issue for the first time and has remained with us ever since. Realizable transmitters and receivers cannot be strictly band limited with brick-wall filters, so interference can never be eliminated; rather, it can only be controlled to an acceptable level. Once tuned circuits were developed in the early days of radio, it became possible to implement frequency-division multiple access (FDMA). FDMA was the only way of sharing spectrum among users for most of the history of radio technology and implicitly became the focus of radio regulation.

Radio regulation began at the international level with the First International Radio Telegraphic Conference in Berlin, Germany, in 1903, which was attended by nine countries [1]. At that time, ship radio telegraphy was the main focus of practical radio use and of potential regulation, and the key nontechnical policy issue was the market power of Marconi's company. However, actual regulation at the international level was not achieved until the Berlin Convention of 1906, which became a treaty among its signatories. In the United States, formal regulation of radio began with a law in 1910 that dealt only with ship radio issues.

Much has happened in radio technology and policy since these early days, but FDMA has remained the dominant mode of spectrum use. Even in cases where cellular carriers use time-division multiple access [TDMA; e.g., Global System for Mobile Communications (GSM)], code-division multiple access (CDMA; e.g., IS-95), or orthogonal frequency-division multiple access [OFDMA; e.g., long-term evolution (LTE)] to share a band among multiple users, FDMA with long-term frequency assignments has been the dominant method of keeping operators from causing interference to each other.

Regardless of the root causes of low actual spectrum utilization, it is important to increase spectrum utilization in order to build the wireless infrastructure essential to today's societies and economies. This has resulted in a surge of interest in new radio spectrum access technologies and policies.

SECTION II

FCC SPECTRUM POLICY TASK FORCE

A major milestone in the regulatory consideration of new approaches to RSA was the 2002 Spectrum Policy Task Force (SPTF) of the Federal Communications Commission (FCC), the U.S. national regulator for private-sector and local government spectrum use. This was a comprehensive review of U.S. spectrum policy by FCC staffers with broad public input [2], including multiple public fora and 435 written comments.

In its November 2002 report, SPTF concluded that [3]:

  • advances in technology create the potential for systems to use spectrum more intensively and to be much more tolerant of interference than in the past;
  • in many bands, spectrum access is a more significant problem than physical scarcity of spectrum, in large part due to legacy command-and-control regulation that limits the ability of potential spectrum users to obtain such access;
  • to increase opportunities for technologically innovative and economically efficient spectrum use, spectrum policy must evolve toward more flexible and market-oriented regulatory models.

In particular, the report urged FCC to consider more flexible forms of RSA, such as cognitive radio systems that passively detect the presence of unoccupied spectrum that could be used, as well as interruptible spectrum, in which spectrum intended for primary public safety uses with high peak-to-average usage ratios could be utilized by other users while idle, subject to preemption whenever it is needed for the primary public safety application.

SECTION III

5-GHz DYNAMIC FREQUENCY SELECTION

A. International Action

The first International Telecommunication Union (ITU) action on RSA came at the 2003 World Radiocommunication Conference (WRC-03) and built upon a resolution adopted at the 2000 conference to study RSA approaches for band sharing at 5 GHz. WRC actions result in modifications of the ITU Radio Regulations, which have the status of a treaty when they are ratified by the ITU's member nations. Note, however, that under Article 4.4 of the ITU Radio Regulations member states have the option of taking other spectrum actions provided they do not cause harmful interference to other members who are in compliance with the Regulations [4]. ITU members are thus enabled by these WRC actions but are not generally required to implement them domestically. Some ITU member nations have chosen to interpret the ITU Radio Regulations as the top layer of a spectrum regulatory hierarchy that must be complied with, while others take a more literal reading of Article 4.4 that permits actions beyond the text of the Regulations as long as they do not adversely affect other countries.

Resolution 229 of WRC-03 dealt with the “(u)se of the bands 5150–5250 MHz, 5250–5350 MHz, and 5470–5725 MHz by the mobile service for the implementation of wireless access systems including radio local area networks” [5]. These three bands are allocated to a variety of radio services on a primary basis, and there was great interest among industry in several countries in expanding upon the success of WiFi by permitting unlicensed devices to access these bands. While low-power wireless local area network sharing of the bands was feasible with respect to some of the incumbent users, it posed major technical problems for radar systems, called “radiodetermination” in spectrum policy nomenclature. Resolution 229 found “that studies have shown that sharing between the radiodetermination and mobile services in the bands 5250–5350 MHz and 5470–5725 MHz is only possible with the application of mitigation techniques such as dynamic frequency selection (DFS)”—an RSA technique.

Resolution 229 describes the DFS needed for interference-free sharing of these bands in terms of certain technical parameters while building upon a framework established earlier in ITU Radiocommunication Assembly Recommendation ITU-R M.1652 [6]. The ITU-R recommendation describes DFS as a listen-before-talk (LBT) system that checks whether a frequency is in use before an unlicensed device can access it. The recommendation incorporates, from another recommendation [7], the nominal signal characteristics of the various radar systems that must be detected.

The basic technical details include two levels of LBT detector sensitivity that are tied to the maximum transmitter power of a device. Recommendation M.1652 states, “(t)he DFS mechanism should be able to detect interference signals above a minimum DFS detection threshold of −62 dBm for devices with a maximum e.i.r.p. of < 200 mW and −64 dBm for devices with a maximum e.i.r.p. of 200 mW to 1 W averaged over 1 μs.” An unlicensed device must check a frequency for 60 s prior to using it and must avoid using a channel for 30 min if it detects a signal greater than the applicable threshold for more than 1 μs. This combination of parameters appears to have been designed to err on the side of caution, as any 1-μs noise burst on a frequency places that frequency out of bounds for the next 30 min.
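
The parameters above amount to a simple listen-before-talk procedure. The following Python sketch restates them; it is illustrative only, and the function names and the measure_dbm() hardware hook are assumptions rather than part of any regulation or standard.

```python
# Illustrative sketch of the DFS parameters described above; not a compliant
# implementation.  measure_dbm() is an assumed hardware hook that returns the
# received power in dBm averaged over 1 us.
import time

CHECK_TIME_S = 60            # channel availability check before first use
NON_OCCUPANCY_S = 30 * 60    # stay off the channel this long after a detection

def dfs_threshold_dbm(max_eirp_mw: float) -> float:
    """Detection threshold tied to the device's maximum e.i.r.p. class."""
    return -62.0 if max_eirp_mw < 200.0 else -64.0   # 200 mW to 1 W class

def channel_available(measure_dbm, max_eirp_mw: float) -> bool:
    """Listen for CHECK_TIME_S seconds; abort on any above-threshold measurement.

    After a detection the caller is expected to keep the channel unused for
    NON_OCCUPANCY_S seconds before checking it again.
    """
    threshold = dfs_threshold_dbm(max_eirp_mw)
    deadline = time.monotonic() + CHECK_TIME_S
    while time.monotonic() < deadline:
        if measure_dbm() > threshold:
            return False
    return True
```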

B. Implementation by National Regulators

FCC implemented the 5-GHz DFS rules domestically in January 2004, calling the units implementing these requirements “Unlicensed National Information Infrastructure (U-NII) devices.” In doing so FCC stated, “We anticipate that the additional spectrum we are making available for U-NII devices will allow the continued growth in marketing, deployment and use of unlicensed devices. It will help meet the needs of businesses and consumers for fixed and mobile high-speed digital communications. We believe it will also stimulate the availability of broadband service to those who do not yet have it, and will increase competitive choices for those who do” [8].

The Electronic Communications Committee of the European Conference of Postal and Telecommunications Administrations (CEPT), the confederation of European national spectrum regulators, implemented the 5-GHz DFS system for RSA in November 2004 [9].

C. Operational Experience

Despite the apparently conservative nature of the DFS rules adopted by the ITU and national regulators, there have been multiple incidents in the United States of interference to radars from 5-GHz U-NII devices that are required to use DFS. Two reports on these incidents have been written by the National Telecommunications and Information Administration's Institute for Telecommunication Sciences [10], [11], but the root cause of the interference remains uncertain. All the incidents involve the Terminal Doppler Weather Radar (TDWR) system used to detect severe storms near airports that could endanger aircraft takeoffs and landings.

These interference cases all involved high fixed antennas operated by wireless network operators. As a temporary solution to these problems, FCC has made a nonregulatory voluntary agreement with the Wireless Internet Service Providers Association (WISPA) to urge network operators using U-NII devices subject to the DFS requirement to manually check a database of TDWR locations and adjust their equipment to stay at least 30 MHz away from the TDWR frequency if they are within 35 km of an operating TDWR [12].
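
The coordination rule reduces to two checks: distance to the nearest TDWR site and frequency separation from its operating channel. The sketch below shows how such a check might look; the site-list format is invented for illustration and the distance helper is a standard haversine calculation, so this is not the actual WISPA/FCC procedure.

```python
# Illustrative check of the voluntary TDWR coordination practice described
# above (at least 30 MHz of separation when within 35 km of a TDWR site).
# The (lat, lon, center_freq_mhz) site-list format is an assumption.
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def channel_ok(dev_lat, dev_lon, freq_mhz, tdwr_sites):
    """Return True if freq_mhz respects the distance/separation guideline."""
    for lat, lon, tdwr_freq in tdwr_sites:
        if distance_km(dev_lat, dev_lon, lat, lon) <= 35.0 and abs(freq_mhz - tdwr_freq) < 30.0:
            return False
    return True
```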

Subsequent testing has shown that a properly designed DFS system should have detected the TDWR signal and avoided cochannel transmissions [11]. While there are no official statements as to the cause of these interference incidents, a likely explanation is software security weaknesses in the software-defined radio implementation of the DFS algorithm that may have allowed the unit operator to bypass the DFS algorithm and maximize frequency availability at the risk of causing radar interference.

The repeated cases of harmful interference to a safety-related radar system are both a major embarrassment to the proponents of RSA and a clear signal that RSA systems must be designed with sufficient robustness to avoid causing such interference over the lifetime of the systems even if the unit operators try to bypass built-in safeguards or if the equipment degrades with time.

SECTION IV

TELEVISION WHITE SPACE

The regulatory issue of TV white space was first raised during the FCC SPTF deliberations in 2002. Noting the need for more intense spectrum use, the inevitable existence of spectrum in TV broadcast bands that contained no usable TV signals, and the advances in cognitive radio technology, the SPTF recommended that FCC “(c)onsider methods for additional spectrum access for unlicensed devices, which include: …Opportunistic or dynamic use of existing bands—through either cognitive radio techniques to find ‘white space’ in existing bands or use protocols to get out of the way of primary users” [3]. This began one of the most contentious technical policy deliberations in recent FCC history.

The original FCC proposals [13] involved low-power use of TV spectrum that was identified as being idle in a given location through any one of three possible techniques:

  1. an LBT detector significantly more sensitive than normal TV receivers;
  2. geolocation of the transmitter through a means such as the Global Positioning System (GPS), followed by communication with a database that indicated what frequencies were available for low-power use at that location;
  3. use of low-power short-range local beacons that indicated what TV channels were available for use in areas the beacon could be received.

The third option attracted virtually no interest in public comments. The LBT option generated much greater interest, and several prototypes were submitted to FCC for two rounds of testing [14], [15]. The testing showed that the best prototypes had 90% reliability detection thresholds better than −120 dBm for input signals of the U.S. ATSC digital television standard. This is 35 dB more sensitive than the typical consumer DTV sensitivity of −85 dBm and represents the processing gain of the detector. Such high processing gain makes it likely that the LBT system would achieve high-reliability interference avoidance in places where received TV signals have high location variability. (Note that if there were no location variability and the signal strength of TV signals decreased monotonically with distance, a small processing gain would be adequate and there would be little technical controversy on interference issues.)
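
The 35-dB figure is simply the gap between a typical consumer DTV receiver's sensitivity and the prototypes' 90%-reliability detection threshold; the lines below restate that arithmetic.

```python
# Back-of-the-envelope restatement of the processing gain quoted above,
# using the two figures given in the text.
DTV_SENSITIVITY_DBM = -85.0       # typical consumer DTV receiver
DETECTOR_THRESHOLD_DBM = -120.0   # best prototype detectors, 90% reliability

processing_gain_db = DTV_SENSITIVITY_DBM - DETECTOR_THRESHOLD_DBM
print(f"Detection margin over a consumer DTV receiver: {processing_gain_db:.0f} dB")
# This 35-dB margin is what absorbs shadowing and other location variability.
```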

A major complication in determining whether LBT would be feasible in the U.S. context was the existence of wireless microphone systems in interstitial TV channels. While wireless microphones on vacant TV channels had been explicitly authorized by FCC for use by broadcasters and film production units [16], in practice wireless microphones were also used, de facto and without formal FCC authorization, for live entertainment in theaters and concert venues and for conference room applications. Indeed, such wireless microphones had become almost an integral part of live theatrical productions.

LBT detection of wireless microphones in typical environments was impractical because the lack of modulation standards for such systems and the lack of tight frequency tolerances preclude the design of high-processing-gain detectors. Furthermore, the possibility of an adverse near/far situation, with audience members using an LBT-equipped device near a wireless microphone receiver operating at maximum sensitivity, further complicates LBT detection of wireless microphone systems. Together these factors precluded the development of practical LBT detectors that could ensure reliable interference avoidance with respect to this widely used wireless technology.

Recognizing this difficulty in LBT detection of wireless microphone signals, along with other issues, FCC declined to directly authorize LBT detectors and chose instead to permit white space devices using geolocation and database lookup. (FCC did adopt an unprecedented provision allowing it to consider specific LBT-based systems for approval if the developer “demonstrate(s) with an extremely high degree of confidence that they will not cause harmful interference to incumbent radio services” [17].) It is unclear whether this ill-defined “demonstration” requirement is practical under the procedural terms given by FCC in its decision.
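
Conceptually, a geolocation/database white space device operates as sketched below: it determines its own position, queries an authorized database, and transmits only on the channels the database returns. The query interface shown is invented for illustration; the actual database administrator interfaces had not been finalized at the time of writing.

```python
# Hedged sketch of the geolocation/database-lookup access model; the
# query_database interface is hypothetical.
from typing import Callable, List

def usable_tv_channels(lat: float, lon: float,
                       query_database: Callable[[float, float], List[int]]) -> List[int]:
    """Return the TV channels the database reports as available at (lat, lon).

    A well-behaved device transmits only on channels the database
    explicitly allows at its current location.
    """
    return sorted(set(query_database(lat, lon)))

# Usage with a stub database standing in for a real administrator:
stub = lambda lat, lon: [21, 24, 27, 41]
print(usable_tv_channels(38.9, -77.0, stub))
```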

As of the writing of this paper, FCC has accepted applications from ten entities to operate the database required for geolocation-based operation, but has not approved any yet. Similarly, it has not approved any hardware to operate in this band. However, both are expected in early 2012.

The U.K. spectrum regulator Ofcom announced a consultation in November 2010 on TV white space use based on geolocation. Ofcom had been considering the general issue of white space use, “interleaved channels” in its nomenclature, since 2007. In September 2011, Ofcom announced the results of its consultation and additional issues for public comment [18]. Ofcom is proceeding with consideration of geolocation for white space use in the United Kingdom and has authorized two field trials that are underway to gather operational data.

A major difference between the U.K. and U.S. approaches is the type of radio propagation model used to compute the information in the database on permitted channels at specific locations. The FCC decision uses its R-6602 propagation model [19], which was traditionally used for licensing full-power TV stations, although the Commission has used alternative models of TV coverage in other contexts. The R-6602 model was developed in 1966 and, for ease of computation in an era before ubiquitous computing capability, uses macroscopic measures of terrain roughness and simple terrain correction factors. Together these tend to give unrealistic results in many cases of rough terrain [20]. Ofcom has not explicitly stated what propagation model it plans to use, but has indicated that it will use approximations to actual coverage based on models maintained by Arqiva, the operator of TV transmitter systems in the United Kingdom.
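
Why the model choice matters can be seen from a toy comparison that has nothing to do with the actual R-6602 curves: under a simple log-distance model, moving from a free-space-like loss exponent to a higher exponent sometimes used to approximate rough or cluttered terrain changes the predicted loss at TV-coverage distances by roughly 20 dB, easily enough to move a location into or out of a protected contour. The exponent values below are illustrative assumptions only.

```python
# Toy log-distance comparison (not the R-6602 model) showing how sensitive
# predicted coverage is to the assumed propagation environment.
from math import log10

def path_loss_db(d_km: float, f_mhz: float, exponent: float) -> float:
    """Log-distance path loss anchored to free-space loss at 1 km."""
    fspl_1km = 32.45 + 20 * log10(f_mhz)   # free-space loss at the 1-km reference
    return fspl_1km + 10 * exponent * log10(d_km)

d_km, f_mhz = 30.0, 600.0                  # 30 km from a TV transmitter at 600 MHz
print(path_loss_db(d_km, f_mhz, 2.0))      # free-space-like exponent
print(path_loss_db(d_km, f_mhz, 3.5))      # rough-terrain-like exponent
# The difference (about 22 dB here) can flip a location between "protected"
# and "white space" depending on which model the database uses.
```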

In August 2011, Industry Canada (IC), the Canadian spectrum regulator, also issued a consultation on TV white space issues [21]. The IC approach is similar to the approaches that FCC and Ofcom are pursuing in using databases to determine what frequencies can be used at a given location. However, IC gives no indication as to what type of propagation model it is intending to use.

The Japanese spectrum regulator, the Ministry of Internal Affairs and Communications (MIC), has been supporting research on cognitive radio since 1997 [22] and considering related policy issues since 2003 [23]. MIC has also been deliberating on TV white space for several years and released a detailed report in 2010. The report identifies several possible applications of white space, ranging from special broadcasts for relocation areas after a disaster to providing tourist information to visitors. It states that “a ‘White Space Promotion Conference’ …(needs) to be set up that consists of concerned parties such as manufacturers, broadcast business operators, and telecommunication business operators, etc. in continuing to aim at nationwide deployment of white space utilization” [24]. The report also calls for the creation of “specific white space districts” where different experiments on white space utilization, tailored to local circumstances such as terrain and incumbent signal use, could be carried out. Activity in these areas is expected in 2012.

One author attributes the difference between the U.S. and Japanese approaches to RSA to a preference in MIC for developing consensus among traditional spectrum users before making major spectrum policy changes [25]. By contrast, FCC has taken more of a leadership role in addressing the “chicken and egg” aspects of RSA policy.

SECTION V

ITU AND THE GENERAL CASE OF RSA

ITU WRC-07 adopted Resolution 956 dealing with “Regulatory measures and their relevance to enable the introduction of software-defined radio and cognitive radio systems” [25]. This resolution invited the ITU-R to study “the application of cognitive radio system technologies…(and) whether there is a need for regulatory measures related to the application of software-defined radio.” In addition, this resolution became Agenda Item 1.19 for WRC-12.

In 2007, ITU-R adopted a report on software-defined radio that also considers RSA issues [26]. This report concluded that cognitive radios should be deterministic and must follow a set of rules that are “regulatory in nature.”

While proponents of RSA may have seen Resolution 956 as an entrée for RSA into the international forum, the results have not necessarily been productive. Traditional frequency allocations and regulations have focused on “services,” not “technologies,” and the wording of Resolution 956 was thus in conflict with long traditions. Preliminary views of both the United States [27] and the United Kingdom [28] for WRC-12 conclude that no ITU action is needed at this time. The WRC-12 Conference Preparatory Meeting concluded:

“A common concern within the ITU-R is the protection of existing services from potential interference from the services implementing CRS technology, especially from the dynamic spectrum access capability of CRS.

In addition, a service using SDR and/or CRS should not adversely affect other services in the same band with the same or higher status. Thus, the introduction and operation of stations using SDR and/or CRS technologies in systems of any radiocommunication service should not impose any additional constraints to other services sharing the band” [29].

Thus, the attempt to bring RSA into the ITU forum has had limited success other than recognizing RSA as a legitimate technology. For the foreseeable future, regulation of RSA will be at the national level and will go faster in those countries whose national spectrum regulators are more sympathetic to this technology. However, if a major non-ITU international standards group adopts RSA for a specific new standard then the pace of introduction could speed significantly.

SECTION VI

PASSIVE SENSING VERSUS COOPERATIVE SHARING

In any RSA policy deliberation, key issues are the efficiency of utilization of idle spectrum and the risk of interference from RSA-based users to incumbent traditional FDMA users of the same spectrum. A fundamental difficulty is that these two goals are basically in conflict. As in classic detection theory, where there is a tradeoff between probability of detection and probability of false alarm, there is a tradeoff in RSA use between intensity of spectrum use and risk of interference to incumbent FDMA users. In an LBT system, this follows directly from detection theory: the LBT threshold must be set low enough to detect any cochannel primary signal with high confidence, yet high enough to avoid excessive false alarms due to noise or to signal artifacts arising from imperfect receiver design, e.g., intermodulation products.
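
The tradeoff can be made concrete with the simplest possible sensor, a plain energy detector in additive white Gaussian noise. The sketch below uses the standard Gaussian approximation to the detector's normalized test statistic under a complex-signal model; the sample count, SNR, and thresholds are arbitrary values chosen only to show that lowering the threshold raises the detection probability at the cost of more false alarms.

```python
# Detection-vs-false-alarm tradeoff for a simple energy detector in AWGN.
# Test statistic: sum over n complex samples of |x_i|^2 / noise power.
# Gaussian approximation: mean n, std sqrt(n) under noise only;
# mean n*(1+snr), std sqrt(n)*(1+snr) with a complex Gaussian signal present.
from math import erfc, sqrt

def q(x: float) -> float:
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / sqrt(2.0))

def pfa(threshold: float, n: int) -> float:
    return q((threshold - n) / sqrt(n))

def pd(threshold: float, n: int, snr: float) -> float:
    return q((threshold - n * (1.0 + snr)) / (sqrt(n) * (1.0 + snr)))

n, snr = 1000, 0.05            # samples per sensing interval, linear SNR (-13 dB)
for thr in (1030, 1060, 1090):
    print(f"threshold={thr}: Pfa={pfa(thr, n):.3f}  Pd={pd(thr, n, snr):.3f}")
# A lower threshold protects incumbents (higher Pd) but wastes idle spectrum
# (higher Pfa); a higher threshold does the opposite.
```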

Modern receivers based on cyclostationary feature detection [30] can improve this tradeoff by using information on the details of the primary signal. Thus, they look not just for the presence of power in a given channel, but for power with the specified modulation structure. However, even in such feature detection receivers, one must ultimately pick a detection threshold that involves the tradeoff between the Scylla and Charybdis of leaving available spectrum idle or risking interference to primary incumbents.
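
A minimal illustration of the idea is sketched below: estimate the cyclic autocorrelation of the received samples at a cyclic frequency known a priori for the primary signal (e.g., its symbol rate) and compare the resulting peak with the level at an arbitrary off-feature cyclic frequency. This is a textbook-style sketch under those assumptions, not a production detector design.

```python
# Minimal cyclostationary feature test: a strong cyclic autocorrelation peak
# at the primary signal's known cyclic frequency suggests the signal is
# present even at low SNR.  The off-feature reference frequency and the
# 6-dB margin are arbitrary choices for this sketch.
import numpy as np

def cyclic_autocorr(x: np.ndarray, alpha_hz: float, lag: int, fs_hz: float) -> complex:
    """Estimate the cyclic autocorrelation R_x^alpha(lag) of samples x at rate fs_hz."""
    n = np.arange(len(x) - lag)
    prod = x[lag:] * np.conj(x[:len(x) - lag])
    return np.mean(prod * np.exp(-2j * np.pi * alpha_hz * n / fs_hz))

def feature_present(x: np.ndarray, alpha_hz: float, lag: int, fs_hz: float,
                    margin_db: float = 6.0) -> bool:
    """Compare the cyclic peak against an off-feature cyclic frequency."""
    peak = abs(cyclic_autocorr(x, alpha_hz, lag, fs_hz))
    floor = abs(cyclic_autocorr(x, 1.1 * alpha_hz + 1.0, lag, fs_hz))  # no feature expected here
    return 20.0 * np.log10(peak / max(floor, 1e-12)) > margin_db
```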

An alternative to this dilemma is to engage the incumbents directly in deciding when and where spectrum access is possible in a policy environment where they benefit from allowing others access to idle spectrum. Until recently in the history of radio technology all radio licenses were awarded without an explicit cost and in many cases without an annual fee. Generally, spectrum licensees had no incentive to make their idle spectrum available to other users. However, new regulatory provisions generally called “secondary markets” or “spectrum leasing” [31] in certain countries allow some spectrum users to benefit financially from giving consent to other users for use of their spectrum either on a static long-term basis or a dynamic RSA basis. In the pragmatic sense, the availability of this option decreases the incentive of incumbents to wage regulatory battles that are lengthy compared to evolution cycles of wireless technology.

However, wireless systems designed for exclusive spectrum use may not be optimal for spectrum sharing, even on a consensual basis. Better sharing performance is possible with wireless systems in which sharing is included in the basic design and in which any marginal cost for such sharing provisions is paid by the RSA users who benefit from them. Thus, new trunked and cellular wireless mobile systems might make information available on the location and time dynamics of idle spectrum that could be leased in real time. Similarly, new radar systems might be optimized for sharing, with modulation and antenna designs that are less susceptible to cochannel interference, and might make antenna location and beam azimuth information available to potential spectrum sharers in real time.
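
No such message format exists in any current standard, but the kind of real-time record an incumbent might publish under this cooperative model could look something like the purely hypothetical sketch below; every field name and unit is invented for illustration.

```python
# Purely hypothetical example of a real-time "sharing advisory" an incumbent
# (here, a radar operator) might publish to cooperating RSA users.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SharingAdvisory:
    incumbent_id: str                  # who is offering access to idle spectrum
    freq_low_mhz: float                # lower edge of the offered block
    freq_high_mhz: float               # upper edge of the offered block
    lat: float                         # reference location of the incumbent system
    lon: float
    beam_azimuth_deg: Optional[float]  # current radar beam pointing, if applicable
    valid_until_utc: str               # when the offer expires or may be preempted
    max_eirp_dbm: float                # power cap the incumbent will tolerate

advisory = SharingAdvisory("radar-042", 5600.0, 5610.0, 38.9, -77.0,
                           beam_azimuth_deg=135.0,
                           valid_until_utc="2012-01-01T12:05:00Z",
                           max_eirp_dbm=30.0)
```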

The most rapidly growing sector of wireless use is not voice telephony, with its strict latency requirements and constant throughput, but asymmetric data flows with widely varying rates. Packetized communications systems in many cases could use RSA because their spectrum access requirements vary with time rather than being fixed as in classical FDMA operation.

SECTION VII

LOOKING AHEAD

Much of the interest in RSA comes from the need to meet increasing wireless demands in today's economies and societies and the recognition that traditional spectrum access techniques usually result in low utilization when averaged over space and time. Thus, RSA is a promising tool to increase real spectrum utilization to support economic growth and evolving societal needs.

The major obstacle to RSA use has been incumbent users who fear interference and, in some cases, new competition. At times, RSA advocates have not been sufficiently sensitive to these concerns or pragmatic in dealing with them. Incumbent spectrum users have both a significant stake in spectrum use and major influence in national and international policy fora. Spectrum policies that reward incumbents for supporting more intense use of spectrum may be one way of better aligning the interests of incumbents with the goal of more intense spectrum use. While passive monitoring systems for determining spectrum availability are possible in some applications, active engagement of incumbents in real-time determination of spectrum availability will result in cooperative systems that both address incumbents' legitimate concerns and maximize the amount of spectrum available for RSA for a given interference risk probability.

It is difficult to build cooperative systems as modifications to incumbent systems designed for exclusive spectrum use. Thus, standards and regulatory policies that encourage spectrum users to make data available on instantaneous spectrum use and expected changes in use would facilitate cooperative RSA systems and also help protect incumbent use.

If incumbent opposition can be moderated through cooperative RSA systems or if national and international regulators take a bolder approach in maximizing public interest in spectrum by insisting on spectrum access techniques that use the resource more intensively, then RSA will serve a key role in facilitating more intense spectrum use and its economic and societal benefits.

SECTION VIII

CONCLUSION

RSA is an example of an innovative wireless technology whose implementation depends greatly on spectrum policy deliberations at the national and international level. Increased participation by wireless researchers in national and international spectrum policy fora can improve the quality of these deliberations and expedite the search for compromise solutions that balance the benefits of new technology with the rights of incumbent spectrum users.

Acknowledgment

The author would like to thank his many former colleagues at the Federal Communications Commission for their help over the years in explaining the practical and regulatory aspects of wireless systems as well as the new policy options under consideration.

Footnotes

The author is with Marcus Spectrum Solutions, LLC, Cabin John, MD 20818 USA and also with the Department of Electrical and Computer Engineering, Virginia Tech, Blacksburg, VA 24061 USA (e-mail: mjmarcus@marcus-spectrum.com).


Authors

Michael J. Marcus

Michael J. Marcus (Life Fellow, IEEE) is a native of Boston, MA. He received the S.B. and Sc.D. degrees in electrical engineering from the Massachusetts Institute of Technology (MIT), Cambridge.

While at MIT, he participated in the cooperative education program with assignments at Bell Labs. After service in the U.S. Air Force working on underground nuclear test detection research, he worked at the Institute for Defense Analyses on electronic warfare analysis. He then spent nearly 25 years at the Federal Communications Commission, where he specialized in spectrum policy issues for innovative technology. He proposed and directed what became the 1985 FCC decision that set the foundation for WiFi and Bluetooth and later worked on the frameworks for the 60-, 70-, 80-, and 90-GHz commercial bands. As a Mike Mansfield Fellow, he spent a year at the FCC's Japanese counterpart and has also been a consultant on spectrum policy to the European Commission. He has taught at George Washington University and MIT, and is now an Adjunct Professor at Virginia Tech, Blacksburg. He is the Director of Marcus Spectrum Solutions LLC, Cabin John, MD.

Dr. Marcus is Vice Chair of the IEEE-USA Committee on Communications Policy and Cochair of the Federal Communications Bar Association's Engineering and Technology Practice Committee. He received IEEE-USA's first Electrotechnology Transfer Award and was elected an IEEE Fellow in 2004 “for leadership in the development of spectrum management policies.”
