Maritime IoT: An Architectural and Radio Spectrum Perspective



I. INTRODUCTION
The oceans cover more than 70 percent of the earth's surface. This expansive marine area represents a prime international domain for activities such as maritime transportation. For instance, the international shipping industry transports about 90 percent of world trade. The concept of maritime Internet of Things (IoT) was originally developed by the United Nations-chartered International Maritime Organization (IMO), under the name e-Navigation, to modernize the maritime industries [1]. At the heart of maritime IoT lies machine-type communication (MTC), owing to the need to establish communication between the shore and vessels or maritime equipment (i.e., the ''things''), as well as among vessels, to support various types of maritime IoT applications and services.
Although the infancy and youth of radio communication were mainly linked to maritime applications, advancements in maritime communication technology severely lag behind their land/terrestrial counterparts (e.g., 3GPP LTE). While wireless communication technologies have undergone revolutionary developments and breakthroughs, and terrestrial communications are evolving into the 5th generation (5G) [2], maritime communication has still mostly relied on voice communications for the transfer of information for the safety of navigation. More efficient communication solutions are sorely needed to transfer more maritime information, allow timely decision making, and effectively mitigate the remoteness of maritime activities and operations, leading to safer and more efficient voyages. Only in recent years has maritime communication slowly gained momentum in modernizing maritime mobile services [3], [4]. In particular, the International Telecommunication Union (ITU) introduced the Automatic Identification System (AIS) to provide vessel identification, position reporting, and tracking [5]. During the last decade, the IMO has popularized AIS for the exchange of navigational data between ships and between ships and shore stations to improve situation awareness over voice, sight, and radar [6], [7]. Despite its limited data communication capacity and its lack of a comprehensive architectural framework for addressing all aspects of maritime IoT applications and services, AIS can be deemed a primitive maritime MTC system.
Progress in the maritime domain toward the modernization and mobilization of maritime-related businesses and research continues to challenge the legacy maritime communication systems, which have inevitably shown their inability to meet the ever-increasing demand from the world's maritime services sectors in terms of ubiquity, continuity, heterogeneity, and scalability [8]-[10].
VOLUME 8, 2020. This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/
As we will see in the following section, maritime MTC has unique requirements that differ from other MTC technologies (e.g., NB-IoT [11]), which have focused solely on terrestrial IoT applications. So far, as mentioned earlier, very limited literature is available on maritime MTC technologies, although there are some excellent papers on maritime communication systems. For example, an improved distributed scheduling scheme for AIS is evaluated in [12]. References [13]-[15] study the feasibility of AIS for long-range maritime traffic monitoring via satellite. An implementation of real-time long-range AIS to cope with the message collision challenge is presented in [16], and an AIS satellite constellation with low communication latency is introduced in [17]. The maritime communication link waveforms are studied in [18], and the channel models and statistical results are presented in [19]. The measurement and performance evaluations of a maritime communication network are discussed in [20]-[22]. The use cases of maritime communication and research directions are touched upon in [23]. In [24], the potential interference of satellite downlink to terrestrial communication systems is examined. However, none of these papers provides an in-depth analysis of the maritime MTC architecture and the radio spectrum challenge.
Indeed, the radio spectrum is undoubtedly the most critical element for any practical wireless communication systems, and is more so for maritime MTC whose worldwide existence depends on the internationalization of the radio spectrum. Without it, maritime IoT will remain a mirage.
In this paper, we first introduce the maritime MTC concept and describe the underlying architecture. We then examine in detail the radio spectrum that has been officially approved by ITU most recently for maritime IoT. We explain the rationale behind the allocation and the implications on the regulatory constraints and air interfaces. Finally, we propose the spectrum sharing and interference management mechanisms for such unique spectrum, which provide insight into the design and standardization of the maritime MTC system operating on this spectrum.

II. THE MARITIME MTC CONCEPT
MTC is a form of data communication involving one or more entities that do not necessarily require human interaction or intervention. Maritime MTC is simply a type of MTC applied to maritime IoT, in which most use cases or services require little or no human intervention and must work even in the absence of human operators [24].
Examples of such maritime IoT services include search and rescue (SAR), in which a maritime MTC transponder embedded in SAR equipment enables communication between the equipment and a maritime rescue coordination center or ships in the vicinity, providing precise location, weather conditions, and other information that aids the SAR operation. Evidently, this type of device should require minimal human intervention, due to physical incapacitation, for instance.
An aids-to-navigation device (e.g., a buoy or lighthouse) may use a maritime MTC device to provide precision piloting to passing vessels in areas such as dangerous coastlines and channels, and hazardous shoals and reefs. Similarly, via the maritime MTC network, maritime safety information services provide vessels with navigational warnings, meteorological forecasts, and hydrographic services, among other safety-related information.
In ship reporting, a ship periodically broadcasts its static and voyage related information such as the ship's identification, draught, vessel type, its intended destination, and estimated time of arrival, or dynamic information such as position, speed over ground, course over ground, and navigational status. This allows tracking and monitoring of vessels and maritime devices worldwide.
Container tracking allows for geo-locating a specific container aboard a cargo vessel, and even remotely monitoring the internal conditions so that all parties involved in the shipping process (e.g., liners and logistic providers) can reap the benefits that come with knowing the whereabouts and the conditions of the assets when making an oceanic voyage where there is no shortage of uncertainty. Real-time cargo tracking and tracing relying on maritime MTC have thus become vital for today's maritime service operators.
Autonomous shipping may be the ultimate way forward; route exchange, however, is the coveted and viable solution to vessel collision, in which ships in close proximity coordinate and optimize their routes autonomously so that close-quarters situations can be predicted and avoided at an early stage.
Smart navigation triggers the growing need for better ''visibility'' of operations at sea. Shipowners and fleet operators may monitor fuel consumption and machinery performance to improve maintenance and vessel operations, prevent equipment failures and poor performance, and reduce downtime. With IoT, vast amounts of data coming from heterogeneous sources are made available for technologies like artificial intelligence, big data, and blockchain to harness and leverage the full potential of data, turning them into trusted and actionable intelligence for making data-driven decisions and optimizing operational efficiency at all levels to reduce the ecological footprint.
Indeed, ocean activities and processes have direct environmental impacts and human implications. Yet the understanding of the ocean system is still far from complete, and ocean research infrastructure is much needed to support both fundamental research and societal priorities. Maritime IoT is expected to play an essential role in meteorological and oceanographic information collection via maritime sensor networks for monitoring, studying, and protecting the marine environment.
Clearly, maritime IoT possesses several key distinguishing characteristics from its land counterparts, which have a direct impact on its communication system architecture.

A. UBIQUITOUS CONNECTIVITY AND SERVICE CONTINUITY
First, a maritime MTC system is required to provide ubiquitous connectivity between vessels and shore-based maritime networks on a global scale, especially over open oceans, including the most remote areas of the world, like the Polar Regions, to ensure unbroken and consistent availability of maritime services. Currently, the presence of services in offshore settings is limited by a lack of information and communication infrastructure.

B. TRAFFIC NON-UNIFORMITY
Despite its global nature, marine traffic is highly unevenly distributed. Heavy traffic concentration is typical in ports, near-shore, and waterways. For instance, coastal shipping accounts for more than 50 percent of the total maritime transport of goods to and from the main ports of the coastal nations [25], where the cargo ships primarily follow the routes that are set close to the shore wherever possible while moving between trade ports, thereby causing severe traffic densification in nearshore areas. Quite the contrary, maritime traffic on the high seas is mainly from intercontinental transportation or deep-sea shipping, and is sparse in density. The maritime MTC system thus must have an efficient solution to cope with this extreme disparity in traffic density.

C. RADIO SPECTRUM INTERNATIONALITY
Last but foremost, the fundamental element of a maritime MTC system is the radio frequency spectrum, which is the most critical component of any practical wireless communication system, and even more so for maritime MTC because of its global coverage nature. Currently, maritime services have always been deployed in a campus style; cross-region continuity of maritime service remains inconsistent and even absent. Ultimately, a global cooperative maritime IoT network is essential for undisrupted services across organizational, regional, and national boundaries, especially in times of crisis. To meet this goal, it is imperative to have appropriate international standards and regulations to enable proper operation of the system. Technical and regulatory challenges must be addressed by the international standards and regulatory bodies, as well as by the world's maritime community.

III. THE SYSTEM ARCHITECTURE
From the previous section, we see that the maritime MTC system has unique characteristics pertinent to maritime IoT, e.g., colossal network scope, dynamic topological structures, trans-regional mobility, and highly non-uniform traffic distribution. They signify a fundamental transformation of the traditional maritime communication concept and call for an architecture distinct from the legacy systems and their land counterparts. Figure 1 graphically illustrates such a conceptual architecture. We have the following observations: first, only wireless solutions are applicable; second, both satellite and terrestrial communication networks exist; and third, both infrastructure and ad hoc communication topologies are present.

A. COMMUNICATION TOPOLOGIES
The infrastructure-based wireless communication relies on a pre-existing managed infrastructure consisting of ''control stations'' or access points connected to a core network or cloud. All communications are through the control stations in a ''star network'' topology. The control station provides a common connection point for mobile stations. A mobile station is an MTC terminal aboard a vessel or embedded in maritime equipment. The network-originated message is communicated from a control station over the downlink to the mobile station(s), and the mobile-originated message is communicated over the uplink to the control station. The uplink is thus a many-to-one communication link (i.e., multiple transmitters with one receiver). Multiple mobile stations share the link through a centralized resource allocation mechanism under which the control station distributes the radio resources (i.e., time and radio frequency) among competing mobile stations.
Centralized resource allocation is characterized by the principle that any communication taking place has to go through the network via control stations. The mobile station requests uplink transmission resources from the network over the uplink signaling channel. The medium access control in the control station includes a dynamic resource scheduler that allocates radio resources for both downlink and uplink data traffic, and signals to the mobile stations over the downlink signaling/control channels. In addition to maximization of spectral efficiency, the scheduler takes into account the traffic volume, the quality of service (QoS) requirement, the radio channel conditions, and fairness when sharing resources among mobile stations.
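As an illustration of how such a scheduler might trade spectral efficiency against fairness, the following minimal Python sketch applies a proportional-fair rule, a common scheduling metric, though not one mandated by any maritime MTC standard. The station identifiers, rates, and averaging constant are hypothetical.

```python
# Minimal proportional-fair scheduling sketch (hypothetical stations/rates;
# not part of any VDES standard). Each slot, the control station serves the
# mobile station maximizing instantaneous rate over its long-term average
# throughput, then updates the exponentially weighted averages.

def pf_schedule(inst_rate, avg_tput, alpha=0.1):
    """Return the station to serve this slot and update avg_tput in place."""
    winner = max(inst_rate, key=lambda s: inst_rate[s] / max(avg_tput[s], 1e-9))
    for s in avg_tput:
        served = inst_rate[s] if s == winner else 0.0
        avg_tput[s] = (1 - alpha) * avg_tput[s] + alpha * served
    return winner

# Station "A" has twice the channel rate of "B"; over many slots both are
# served (fairness) while "A" accrues the higher average throughput.
avg = {"A": 1.0, "B": 1.0}
counts = {"A": 0, "B": 0}
for _ in range(1000):
    counts[pf_schedule({"A": 2.0, "B": 1.0}, avg)] += 1
```

With constant channel rates, the rule converges to near-equal time shares while average throughputs remain proportional to the channel rates, which is the essence of the efficiency/fairness trade-off mentioned above.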
In contrast, ad hoc communication follows a ''point-topoint network'' topology. It is a decentralized type of wireless network that is composed of individual mobile stations communicating with each other directly, bypassing the central control station. The network is ad hoc because it does not rely on a pre-existing infrastructure. The communication resource or radio spectrum is shared among mobile stations through a distributed or autonomous resource allocation mechanism. Such allocation scheme typically involves each mobile station monitoring the transmission activities of neighboring mobile stations to avoid collisions at the destined receiver.
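The neighbor-monitoring behavior described above can be sketched as follows, in the spirit of the self-organized TDMA used by AIS. The frame length and the heard-slot set are hypothetical, and real implementations add slot reservation and timeout rules omitted here.

```python
import random

# Sketch of sensing-based distributed medium access: a station marks the
# slots it hears neighbors using and picks a random slot from the
# remaining free ones. FRAME_SLOTS and the heard-slot set are hypothetical.

FRAME_SLOTS = 10

def choose_slot(occupied, rng):
    """Pick a random transmission slot not heard in use by any neighbor."""
    free = [s for s in range(FRAME_SLOTS) if s not in occupied]
    if not free:
        # All slots appear busy; a real system would reuse the slot of the
        # most distant neighbor. Here we simply pick at random.
        return rng.randrange(FRAME_SLOTS)
    return rng.choice(free)

rng = random.Random(0)
heard = {1, 3, 4}          # slots this station has heard neighbors occupy
slot = choose_slot(heard, rng)
```

Note that this scheme only avoids collisions with neighbors the station can actually hear, a limitation that becomes important in the satellite discussion below.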

B. SPACE-EARTH INTEGRATED COMMUNICATION
Recall that the first and foremost requirement of maritime MTC is the provision of ubiquitous connectivity between vessels and shore over open oceans to ensure maritime service continuity. This poses a unique and serious challenge: unlike terrestrial cellular communication, where wide-area wireless coverage is provided via mass deployment of base stations, it is obviously unrealistic to cover the open oceans with such base stations. The solution is the deployment of a satellite MTC network to form a space-earth integrated maritime MTC system (see Figure 1).
In this maritime MTC system, both satellite and terrestrial networks are present to provide mobile stations with ubiquitous connectivity to the maritime cloud, a trusted platform for providing diverse ubiquitous maritime services and applications with the highest computational and storage capacity in the maritime IoT framework. The federated network management combines service provision, management, and orchestration with dynamic resource management/consolidation, service resolution, and forwarding mechanisms, through which the physical infrastructure resources are maximally shared among service providers and fine-tuned to meet individual service requirements, thereby enabling service-centric networking.
Near-shore communication is through dense, geographically distributed shore stations, whereas offshore connectivity (e.g., on the high seas) is provided by space stations. The space station acts as a relay for communication beyond the reach of the shore stations.
In addition, a self-organized ad hoc network can be quickly and dynamically established to facilitate direct communication between mobile stations, which is particularly useful for maritime IoT proximity applications such as aids to navigation and route exchange.

1) SATELLITE COMMUNICATION
Considering the usually harsh and remote environment of the open seas and the impracticality and/or overbearing cost of building terrestrial infrastructure, major communication system solutions would, overall, need to be satellite-based. Satellite communication is the sole option for deep-sea operations out of range of any terrestrial solution. Unlike most terrestrial alternatives, a satellite communication system can be rolled out more quickly and economically thanks to the large footprint of a satellite transceiver on the surface of the earth, and is hence more viable for connecting remote locations across a large geographic area, even after taking into account the significant resources and time-consuming processes of launching a satellite network.
However, satellite communication suffers from shortcomings, mainly: 1) it is inefficient for covering high-density traffic areas because of the large footprint of a satellite space station and its limited communication capacity; and 2) it incurs a large transmission delay due to the high altitude of a satellite space station, which could be problematic for ships in close quarters in a dense traffic area. As such, there has been increasing interest in low earth orbit (LEO) satellites. Orbiting near the earth (e.g., at 600 km altitude), a LEO satellite has a smaller footprint (∼1400 nautical miles in radius) than a geostationary satellite, and the smallest propagation delay (∼10 ms) of all orbits. Continuous coverage is provided using multiple satellites configured in constellations with complex interlocking orbits [26]. The LEO satellite is connected to an earth station or teleport through a radio link called the feeder link. The satellite component in Figure 1 is such an MTC satellite communication system.
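The LEO figures quoted above can be checked with elementary geometry; the following sketch assumes a 600 km circular orbit, a spherical earth, and a footprint bounded by the geometric horizon at 0° elevation.

```python
import math

# Back-of-the-envelope check of the LEO numbers quoted above, assuming a
# 600 km circular orbit, a spherical earth, and a footprint bounded by the
# geometric horizon (0 degrees elevation). All values are approximate.

R = 6371.0        # mean earth radius, km
h = 600.0         # LEO altitude, km
C = 299792.458    # speed of light, km/s

# Footprint radius: great-circle distance from the sub-satellite point to
# the horizon, R * arccos(R / (R + h)).
footprint_km = R * math.acos(R / (R + h))
footprint_nmi = footprint_km / 1.852          # ~1400 nautical miles

# One-way propagation delay: nadir (best case) and horizon (worst case).
nadir_delay_ms = h / C * 1e3                  # ~2 ms
slant_km = math.sqrt((R + h) ** 2 - R ** 2)   # slant range at the horizon
slant_delay_ms = slant_km / C * 1e3           # ~9.4 ms
```

These values reproduce both the ∼1400 nautical mile footprint radius and the ∼10 ms propagation delay cited in the text.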
Since the coverage of a station is ultimately limited by the radio horizon, which is determined by the antenna height, a satellite space station has a much larger field of view than a mobile station. Mobile stations within the field of view of a satellite are thus likely beyond the radio horizon of each other, and hence sensing-based distributed medium access control is ineffective in preventing collisions at the satellite receiver, whereas centralized medium access control via space stations is more efficient, and provides better system capacity.
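The horizon argument above can be made numerically concrete with the standard 4/3-effective-earth-radius approximation for the radio horizon; the antenna heights below are hypothetical examples.

```python
import math

# Rough radio-horizon comparison using the standard 4/3-effective-earth
# approximation, d ~= 4.12 * sqrt(h) km with antenna height h in metres.
# The antenna heights below are hypothetical examples.

def radio_horizon_km(h_m):
    return 4.12 * math.sqrt(h_m)

ship_horizon = radio_horizon_km(10.0)     # 10 m mast: ~13 km
shore_horizon = radio_horizon_km(100.0)   # 100 m shore tower: ~41 km
# Two ships ~30 km apart are hidden from each other, yet both sit inside a
# LEO satellite's ~2700 km-radius field of view, so carrier sensing between
# them cannot prevent collisions at the satellite receiver.
```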

2) TERRESTRIAL COMMUNICATION
Despite the reduced footprint and latency, a LEO satellite network is still not adequate to serve high-traffic locations like ports/harbors, coastal areas, and waterways, due to the large footprint and limited capacity of a space station. Network densification, i.e., decreasing the footprint of a control station and increasing the density of control stations, is an effective approach to achieving capacity scaling in infrastructure-based networks. The satellite network is thus best ''hybridized'' with a wireless terrestrial component to cover locations that are unsuitable for the satellite network to serve.
In the hybrid model, the terrestrial network handles short-range communications with clustered vessels near shore as a complement to the satellite network. Owing to the much smaller footprint of a shore station, spatial spectrum reuse is better exploited to boost the overall system capacity. In particular, the high-density traffic area is partitioned into multiple small areas or ''cells'' covered by multiple shore stations, where a radio spectrum is reused by these shore stations, and hence the overall capacity is multiplied, as illustrated in Figure 2.
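The capacity-multiplication argument can be made concrete with a toy calculation; the cell count and cluster size below are hypothetical.

```python
# Toy illustration of capacity scaling through spatial reuse: partition a
# coastal area into num_cells cells and reuse the spectrum with cluster
# size cluster_size (cells within a cluster use disjoint channels). The
# aggregate capacity relative to one wide-area station then scales roughly
# as num_cells / cluster_size. The numbers below are hypothetical.

def reuse_capacity_gain(num_cells, cluster_size):
    """Approximate aggregate capacity gain versus a single large cell."""
    if cluster_size < 1 or num_cells < cluster_size:
        raise ValueError("need at least one full reuse cluster")
    return num_cells / cluster_size

gain = reuse_capacity_gain(num_cells=21, cluster_size=7)
```

Shrinking the cells (larger num_cells over the same area) directly multiplies the aggregate capacity, which is the densification effect described above.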

3) PROXIMITY COMMUNICATION
Direct communication between mobile stations is needed to support ad hoc communication for proximity-based services without a fixed infrastructure. A peer-to-peer or mobile-to-mobile communication interface serves this purpose. It operates in a distributed medium access control mode for self-organized networking, useful in the absence of satellite and terrestrial network coverage, and ideal for maritime IoT proximity services.

IV. THE RADIO SPECTRUM
If maritime MTC is the backbone of maritime IoT, radio frequency spectrum is, by all means, the heart and soul of maritime MTC, whose worldwide existence depends on the ''internationality'' of the radio spectrum.
The radio spectrum is a natural resource, and natural resources contained within the geographic boundaries of a nation are generally owned by that nation. However, unlike its land MTC counterparts, the global coverage nature of maritime MTC requires radio spectrum allocation on a global scale, rather than in a campus style. Therefore, the radio spectrum for maritime MTC must be allocated through an international regulatory agency, like the ITU, the United Nations' specialized agency for the world's radio spectrum coordination and regulation, through a lengthy standards process.
In this section, we examine the recent allocation of such spectrum that is dedicated to maritime MTC within the VHF maritime mobile band (156 to 174 MHz) [27], under the maritime MTC architecture described in the previous section. This allocation is depicted in Figure 3. The realization of such an MTC concept on this particular radio spectrum is referred to as the VHF Data Exchange (VDE) System or VDES for short.
This standard allocation may seem nothing but ordinary and straightforward; however, there are many unique technical issues that warrant special attention and effective solutions, due to the coexistence of multiple maritime MTC air interfaces (including the legacy AIS) and potential interference to incumbent land systems.

A. PROXIMITY SPECTRUM
Under the maritime MTC concept, the spectrum for the maritime MTC proximity component has been allocated as part of the maritime MTC spectrum. Specifically, as shown in Figure 3(a), frequency channels 2087 and 2088 in the VHF maritime mobile communication band are allocated to the legacy AIS system. They are mainly for direct communication between stations in an ad hoc networking structure, originally designed to provide a simple and low-cost means of maritime communication with minimum functionality. AIS can only be used for transmitting up to 64 pre-defined maritime messages, and no channel coding is employed.
Unlike channels 2087 and 2088, channels 75 and 76 are designated for AIS transmissions destined solely for satellite reception. The communication direction is from ships to a satellite AIS receiver. This is also known as long-range AIS (LAIS).
Next to the two AIS channels are channels 2027 and 2028, dedicated to the new proximity communication air interface, ASM (Application Specific Messages). It is intended for the delivery of more versatile application-specific content with higher spectral efficiency via channel coding and rate adaptation to boost system capacity. The medium access control of ASM employs a distributed resource allocation scheme similar to that of AIS to support self-organized ad hoc networking for maritime proximity services.

B. TERRESTRIAL SPECTRUM
Both AIS and ASM operate in a self-organized ad hoc communication fashion. The advantage is the great flexibility that enables direct communication without the presence of a control station; the disadvantage, however, is the lack of supervision by control stations, which gives rise to frequent collisions among transmitting stations in clustered high-traffic areas. Collisions cause not only poor overall system efficiency, which limits the system capacity, but also instability of the system. The maritime MTC terrestrial component, denoted VDE-TER, is thus intended to alleviate this problem with centralized resource management that relies on a managed infrastructure consisting of control stations (i.e., shore stations) connected to the maritime cloud network (see Figure 1).
As such, in the same VHF maritime mobile communication band, the paired frequency channels 24, 84, 25, and 85 are allocated to VDE-TER, all as simplex channels, as shown in Figure 3(a). A simplex channel allows one-way transmission only, either uplink or downlink.
The lower leg (denoted L TER ), i.e., channels 1024, 1084, 1025, and 1085, is for uplink transmissions, and the upper leg (U TER ), i.e., channels 2024, 2084, 2025, and 2085, is for downlink. This traditional paired allocation facilitates frequency-division duplexing (FDD), which requires two simplex frequency channels at each communicating end, one for transmit and one for receive, with the remote end configured oppositely.
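The channel numbering above follows a simple pattern, with base channel n splitting into a lower-leg channel 1000 + n and an upper-leg channel 2000 + n, which the following snippet captures (a convenience sketch, not an excerpt from any standard).

```python
# The paired-simplex numbering above follows a simple pattern: each base
# channel n (24, 84, 25, 85) splits into a lower-leg uplink channel
# 1000 + n and an upper-leg downlink channel 2000 + n. A small helper
# capturing that convention (a convenience sketch, not standards text):

VDE_TER_BASE = [24, 84, 25, 85]

def fdd_pair(n):
    """Return (uplink, downlink) channel numbers for base channel n."""
    return 1000 + n, 2000 + n

pairs = [fdd_pair(n) for n in VDE_TER_BASE]
```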

C. SATELLITE SPECTRUM
Apparently, vessels on the high seas or in the Arctic region are not covered by shore stations. The VDES satellite component, denoted VDE-SAT, is thus intended to address the ubiquitous network access requirement via LEO satellite space stations, extending the network coverage to regions not reachable by the shore stations. As explained in Section III, centralized resource management is employed by the VDE-SAT network.
Spectrum allocation for VDE-SAT is more complicated than that for VDE-TER, mainly due to the potential satellite downlink interference to the incumbent land systems on the same frequency band.
As depicted in Figure 3(a), paired channels 26 and 86, i.e., the lower leg (channels 1026 and 1086, denoted L SAT ) and the upper leg (channels 2026 and 2086, denoted U SAT ), are allocated to VDE-SAT, all as simplex channels, facilitating traditional FDD transmissions with L SAT for uplink and U SAT for downlink.
There was once a spectrum plan proposing a significantly larger downlink bandwidth in the non-channelized maritime mobile communication band (160.9625-161.4875 MHz), which would have given VDE-SAT 525 kHz worth of downlink spectrum [24]. Unfortunately, this plan was not favored by some ITU member states due to concerns over potential interference to certain sensitive land services in the same band, such as the radar systems in the Russian Federation, the communication systems used by government agencies in Japan, and the railway systems in North America (USA and Canada).
Although the VDE-SAT downlink channel is within the VHF maritime mobile band (156-174 MHz), on land this same band in most countries is allotted to conventional and trunked land mobile systems used by safety agencies, utilities, and transportation companies, e.g., police, fire, and ambulance services, and dispatch services [28]. Many businesses and industries throughout the world use land mobile services as their primary means of communication, especially from a fixed location to mobile users (i.e., from a base station to a fleet of mobile stations). Therefore, coordination between the VDE-SAT systems and the victim systems (i.e., the incumbent land systems in this frequency band) is a matter of the utmost importance. The challenge lies in the fact that no existing regulatory rule has been established for the protection of land systems against space-borne systems or the like, and hence evaluating the potential impact on the incumbent land systems becomes difficult if not impossible. A method adopted in the current analysis is to place general restrictions on the emissions from space stations, inferred from the existing regulatory rules for interference protection between legacy land systems specified by the ITU and the Electronic Communications Committee (ECC) [24]. The restrictions are expressed as values of the maximum allowed electromagnetic power flux density (PFD) emitted by any space station onto the surface of the earth, at all possible incident angles, in a reference bandwidth. In a nutshell, this serves as a ''protection mask'' for the land system such that the actual interference the land system experiences is no worse than that from a land mobile system permitted by these existing regulations. This important concept is graphically illustrated in Figure 4, and is henceforth referred to as the electromagnetic PFD mask.
The PFD mask is derived from the original land system regulatory restriction and used as the constraint on the emission energy from the space stations for protection against the harmful interference to the incumbent co-frequency land communication systems.
Specifically, the ECC recommends an indicative coordination threshold for interference avoidance among land mobile systems of neighboring countries. For the VDE-SAT downlink frequency band, the corresponding coordination field strength is 12 dBµV/m per 25 kHz [29].
The ITU recommends an energy-per-bit to noise-plus-interference power spectral density ratio of 10 dB for digital land mobile systems [30], and a typical SINAD (signal-plus-noise-and-distortion to noise-and-distortion ratio) value of 12 dB for analog land mobile systems, for establishing degradation protection in the frequency band between 138 and 174 MHz [31].
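For reference, a field-strength threshold converts to a power flux density via the free-space relation PFD = E^2/Z0 with Z0 = 120π Ω; the sketch below applies this textbook conversion to the 12 dBµV/m figure. The exact regulatory derivation in [24] may include additional factors.

```python
import math

# Textbook conversion of a field-strength threshold to a power flux
# density: in free space PFD = E^2 / Z0 with Z0 = 120*pi ohms, i.e.,
# roughly PFD(dBW/m^2) = E(dBuV/m) - 145.8. Applied here to the ECC
# coordination threshold of 12 dBuV/m per 25 kHz quoted above.

Z0 = 120.0 * math.pi                         # free-space impedance, ohms

def field_strength_to_pfd_dbw(e_dbuv_per_m):
    e_v_per_m = 10.0 ** (e_dbuv_per_m / 20.0) * 1e-6   # dBuV/m -> V/m
    return 10.0 * math.log10(e_v_per_m ** 2 / Z0)      # -> dBW/m^2

pfd_threshold = field_strength_to_pfd_dbw(12.0)        # ~ -133.8 dBW/m^2
```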
These two corresponding protection masks or PFD masks derived from the corresponding ECC and ITU regulations, i.e., Φ_FSC(θ) and Φ_C/I(θ), are derived in detail in [24] and included in the APPENDIX for ease of reference.
To ensure that the design of VDE-SAT meets both the ECC and ITU requirements, the electromagnetic PFD irradiating the surface of the earth from the VDE-SAT satellite must satisfy the ultimate mask deduced from the two requirements, i.e.,

Φ_SAT(θ) = min{ Φ_FSC(θ), Φ_C/I(θ) },   (1)

where θ is the elevation angle, as plotted in Figure 4. It serves as the protection mask against interference from the space station from the perspective of a victim land station. It guarantees satisfaction of the ECC and ITU regulatory constraints in the sense that a VDES space station produces no more impairment on the performance of a victim land system than any other co-frequency land system would.
To conform to this PFD mask, the effective isotropic radiated power (EIRP) of the satellite space station at a nadir offset angle φ, as depicted in Figure 4, must satisfy

EIRP_SAT(φ) ≤ (λ²/4π) · Φ_SAT(θ) · L_SAT-Earth(φ),   (2)

where λ is the wavelength, and L_SAT-Earth(φ) is the path loss between the VDE-SAT space station and a receiver on the surface of the earth at a nadir offset angle of

φ = arcsin( (R/(R + h)) · cos θ ),   (3)

where R is the radius of the earth and h the orbit altitude of the space station. Clearly, from (3), φ ∈ [0°, 66°].
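Under the free-space assumption, the EIRP constraint reduces to EIRP(φ) ≤ Φ_SAT(θ) · 4πd(θ)², with d the slant range from the space station to the ground; the sketch below evaluates this numerically. The flat -130 dBW/m² mask value is a hypothetical placeholder, not the actual VDE-SAT mask.

```python
import math

# Numerical sketch of the EIRP constraint: under free-space propagation it
# reduces to EIRP(phi) <= PFD_mask(theta) * 4*pi*d(theta)^2, with d the
# slant range at elevation angle theta. The flat -130 dBW/m^2 mask value
# is a hypothetical placeholder, not the actual VDE-SAT mask.

R = 6371.0e3      # earth radius, m
H = 600.0e3       # LEO orbit altitude, m

def slant_range_m(theta):
    """Space-station-to-ground distance at elevation angle theta (rad)."""
    return math.sqrt((R + H) ** 2 - (R * math.cos(theta)) ** 2) \
        - R * math.sin(theta)

def nadir_offset_deg(theta):
    """Nadir offset angle (deg) for elevation angle theta (rad)."""
    return math.degrees(math.asin(R * math.cos(theta) / (R + H)))

def eirp_limit_dbw(theta, pfd_mask_dbw_m2=-130.0):
    """Largest EIRP (dBW) keeping the ground PFD under the mask."""
    spreading_db = 10.0 * math.log10(4.0 * math.pi * slant_range_m(theta) ** 2)
    return pfd_mask_dbw_m2 + spreading_db

# The slant range, and hence the allowed EIRP, is largest at the horizon
# (theta = 0); the maximum nadir offset angle is about 66 degrees.
phi_max = nadir_offset_deg(0.0)
```

With a 600 km orbit this reproduces the ~66° maximum nadir offset angle, and shows that a PFD mask translates into an elevation-dependent EIRP budget that is most permissive toward the horizon.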

D. SPECTRUM SHARING
As aforementioned, VDE-TER is mainly for limited coastal areas, i.e., for nearshore deployments; its spectrum, L TER and U TER , is thus largely underutilized, leading to inefficient and wasteful use of scarce VHF spectrum resources. To maximize the utilization of the extremely precious VHF spectrum, the VDES spectrum allocation allows VDE-SAT to utilize the VDE-TER spectrum (both upper and lower legs) as long as the transmission does not cause harmful interference to VDE-TER, in principle. In practice, however, it is non-trivial to coordinate, in real time, a satellite space station with the shore stations in its field of view along its traveling path over different countries or regions. In the current design, we take advantage of the PFD mask concept to obviate the need for such real-time coordination.
Recall that the PFD mask in (1) ensures that the interference from the VDE-SAT downlink can be absorbed by the incumbent land systems; naturally, the same should hold true for VDE-TER. Under this concept, a VDE-SAT space station is free to transmit on the VDE-TER spectrum anytime and anywhere as long as the EIRP constraint specified in (2) is satisfied. Nonetheless, transmission on the resources dedicated to the critical system broadcast signals of VDE-TER should be avoided at all times for the best protection of these signals, which calls for a unified air interface design for VDE-SAT and VDE-TER.
For VDE-SAT uplink, the mobile station may opportunistically use the VDE-TER spectrum for VDE-SAT uplink transmissions wherever the absence of the VDE-TER service can be assured (e.g., offshore). For that, an effective mechanism to facilitate such assurance must be embedded in the air interface design. An example of such design is depicted in Figure 5, where the presence of a VDE-TER system is indicated by a special ''beacon signal'' transmitted by a shore station. Only outside the beacon coverage is the VDE-TER spectrum available for VDE-SAT uplink transmissions.
The range of the beacon is extended beyond the VDE-TER cell coverage to create a ''buffer zone'' that protects the VDE-TER uplink, especially for mobile stations at the edge of the cell. The buffer zone should be sufficiently large that the VDE-SAT signals on L_TER seen by the shore station are below the noise level. For that, the VDE-SAT downlink transmission should avoid overlapping with the beacon signals, so that the beacon is protected for maximum detectability like the other system signals broadcast on the VDE-TER downlink.

[Figure 5 caption, partially recovered: (a) … (2); whereas for the uplink, the ''no-share zone'' is defined by a special beacon signal, within which no VDE-TER spectrum is allowed for VDE-SAT uplink transmissions. (b) Outside the ''no-share zone'', a ship is free to use both the VDE-SAT and VDE-TER spectrum, {L_SAT, L_TER}, for uplink transmissions; whereas within the no-share zone, the VDE-TER spectrum is restricted from mobile stations for VDE-SAT communications.]
A mobile station thus refrains from transmitting VDE-SAT signals on L_TER whenever a beacon signal can be detected; instead, the dedicated VDE-SAT channels, L_SAT, are preferred for VDE-SAT uplink transmissions if satellite communication is needed within the no-share zone. Apparently, the buffer zone diminishes as the VDE-TER cell radius becomes limited by the radio horizon, in which case the no-share zone coincides with the cell coverage, and the reception of the transmission from the edge mobile station by the shore station is naturally protected by the horizon.
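The mobile-station side of this sharing rule reduces to a simple decision. A minimal sketch follows (the function and channel names are hypothetical; the real air interface carries far more state than a single beacon flag):

```python
def satellite_uplink_channels(beacon_detected: bool) -> tuple:
    """Channels a mobile station may use for VDE-SAT uplink transmissions.

    Inside the beacon-defined no-share zone, only the dedicated VDE-SAT
    leg is allowed; outside it, the VDE-TER leg may also be used
    opportunistically."""
    if beacon_detected:
        return ("L_SAT",)         # no-share zone: VDE-TER spectrum off limits
    return ("L_SAT", "L_TER")     # share zone: both legs available
```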
Indeed, the restriction of VDE-SAT uplink transmission on VDE-TER spectrum in the ''no-share zone'' protects the VDE-TER system. However, it does not mean that the VDE-SAT uplink transmission on VDE-TER spectrum in the ''share zone'' is free of ''interference'' from the VDE-TER uplink transmissions since a space station has a much larger field of view than a mobile station on earth. A space station is thus responsible for monitoring the interference on the VDE-TER spectrum, L TER , and scheduling the appropriate uplink resources as well as the modulation and coding scheme (MCS) for the uplink transmission, depending on the level of the interference that it sees on the uplink. Here, we see another benefit from the centralized resource control mechanism employed by VDE-SAT (cf. Section III B). This simple scheme provides a seamless spectrum sharing mechanism that minimizes the interference between these two systems, and maximizes the spectrum usage and overall spectral efficiency.
Direct ''mobile-to-mobile'' (e.g., ship-to-ship, buoy-toship) communication is allowed on the VDE-TER channels, L TER or U TER , in the absence of the VDES network (VDE-TER and VDE-SAT) coverage. The lower leg L TER is preferred to reduce the interference with the AIS/ASM channels, and more importantly, to protect the control station (shore or space station) transmissions, especially the system broadcast signals. To that end, direct communication on the VDE-TER or SAT spectrum can only be engaged when there is no evidence of control station transmissions, and the resources for broadcasting system control information are excluded from transmission at all times. Mobile stations monitor these resources and refrain from transmission whenever the broadcast signal is detected.

E. FDD VS. TDD
An important issue with traditional FDD communication is the protection of the receiver from the co-located transmitter at each communicating end. The out-of-band emissions will enter the receiver band and affect the whole receive path, degrading its sensitivity. To prevent the out-of-band emissions of a transmitter from leaking into the co-located receiver, a full-duplex FDD transceiver relies on the RF filter coupled with spatial separation between the transmit and receive RF chains to bring the interference down to a manageable level. Since spatial separation is costly or physically difficult to achieve, most FDD systems mainly depend on frequency separation between the two paired frequency channels to ease the transceiver design, to the extent that the same antenna can be used for both transmit and receive with an RF duplexer. This is especially crucial for systems operating in the lower-frequency bands, e.g., the VHF maritime mobile band.

1) VDE-TER
The frequency separation of 4.5 MHz between the uplink and downlink, i.e., the lower leg L_TER and the upper leg U_TER, can be too tight for the transmit and receive antenna to be shared in certain deployments, but is still manageable for shore stations and large ships. For certain low-cost maritime devices, the shore station has to ensure that the device operates in half-duplex mode via careful scheduling, with the benefit of the centralized resource allocation framework.
However, for VDES and its peculiar spectrum allocation, the protection of the VDE-TER receiver is not the only issue; the protection of the legacy AIS (channels 2087 and 2088), as well as ASM (channels 2027 and 2028), is also of particular importance, since both are close to U_TER (75 kHz apart from AIS and 50 kHz from ASM). For shore stations with co-located AIS transceivers, VDE-TER transmissions on U_TER will severely interfere with the reception of AIS and ASM signals under such tight frequency separation, which is even more difficult to deal with than the FDD case.
Specifically, assuming a receiver noise temperature of 30 dBK, the noise floor is −125 dBm (per 25 kHz). The required isolation between the VDE-TER transmitter and the AIS receiver is thus at least 80 dB for a 1 W transmitter and 91 dB for a 12.5 W transmitter in order to bring the interference down to the noise floor, assuming 75 dBc of attenuation within a 75-kHz stop band of the transmit spectrum.
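This arithmetic can be reproduced in a few lines. The sketch below is a check under the stated assumptions (30 dBK noise temperature, 25 kHz bandwidth, 75 dBc stop-band attenuation); the function names are illustrative:

```python
import math

BOLTZMANN_DBM_PER_HZ_K = -198.6  # 10*log10(1.38e-23 J/K) + 30

def noise_floor_dbm(noise_temp_dbk, bandwidth_hz):
    """Thermal noise floor N = k*T*B, in dBm."""
    return BOLTZMANN_DBM_PER_HZ_K + noise_temp_dbk + 10 * math.log10(bandwidth_hz)

def required_isolation_db(tx_power_dbm, stopband_atten_dbc, floor_dbm):
    """Extra isolation needed so stop-band emissions fall below the noise floor."""
    return (tx_power_dbm - stopband_atten_dbc) - floor_dbm

floor = noise_floor_dbm(30.0, 25e3)                # about -125 dBm per 25 kHz
iso_1w = required_isolation_db(30.0, 75.0, floor)  # 1 W transmitter -> ~80 dB
iso_12w = required_isolation_db(10 * math.log10(12500), 75.0, floor)  # 12.5 W -> ~91 dB
```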
Although such RF isolation can be achieved using high-end RF filter systems combined with spatial separation of the VDE-TER and AIS antennas (e.g., 10-m vertical separation may provide 50-dB isolation), there could still be scenarios where such a degree of isolation between VDE-TER and AIS transceivers co-located at a shore station is too costly. Due to this concern, the simplex channels of the lower leg L_TER for the VDE-TER uplink have recently been re-classified as ''duplex'' channels to leave the time-division duplexing (TDD) option available for this scenario [see Figure 3 (b)]: a frequency channel is divided into multiple time slots, and both downlink and uplink transmissions are duplexed onto the same frequency channel but in different time slots with the help of an RF switch. A shore station now at least has the option to use the lower leg, L_TER, for both uplink and downlink transmissions in a TDD fashion to avoid downlink transmissions on U_TER. The larger frequency separation between L_TER and AIS greatly eases the isolation needed to bring the interference at the co-located AIS receiver down to a more manageable level. However, the spectrum usage for VDE-TER is halved.
A natural question is then why not simply use the lower leg, L_TER, for the downlink and the upper leg, U_TER, for the uplink in the first place. The reason is that it would cause the same problem at the mobile stations, and would be even more difficult to manage, considering the diverse types of mobile stations and the fact that a low-cost transceiver is the default for mobile stations. As illustrated in Figure 6, the reception of the AIS/ASM signals could suffer interference from the out-of-band emissions of VDE-TER uplink transmissions in the dense traffic environment near shore, had U_TER been allowed for uplink transmissions. Simply leaving the ''burden'' to shore stations is clearly the better choice.

2) VDE-SAT
Like VDE-TER, VDE-SAT also has the out-of-band emission issue. However, the downlink interference to the AIS and ASM reception (as seen in VDE-TER) is no longer an issue here, since these signals are not meant for satellite reception by design: the guard period of the transmission burst reserved for AIS and ASM proximity communications is generally far from sufficient to absorb the propagation-time differences among mobile stations at the space station receiver. Instead, the FDD cross-link interference presents a more serious challenge for VDE-SAT than for VDE-TER.
As aforementioned, one important and difficult issue for FDD communication is the protection of the receiver from the transmitter at each communicating end. The 4.6-MHz frequency separation between the uplink and downlink, i.e., the lower leg (L SAT ) and the upper leg (U SAT ) in Figure 3 (a) may be fine for VDE-TER shore stations, but could be problematic for VDE-SAT space stations.
From Figure 4, the space station transmit power can be up to 26 dBm/25 kHz − 8 dBi = 18 dBm/25 kHz; more than 70 dB of isolation at a 4.6 MHz separation is thus needed to bring the out-of-band emissions at the output of the transmitter down to the noise floor, assuming a noise temperature of 26 dBK (per 25 kHz). This requires a transceiver to provide stringent RF isolation via high-performance VHF filter systems and/or spatial separation of the transmit and receive antennas, which could very well be impractical for payload-limited and dimension-limited LEO space stations (e.g., 16 kg and 20 × 27 × 42 cm), especially in the 160-MHz frequency band. Realistically, then, only half-duplex FDD can be supported for VDE-SAT, in which the space station and mobile stations take turns transmitting on the downlink and uplink frequencies, i.e., U_SAT and L_SAT, respectively, which negates the need for a transceiver to transmit and receive simultaneously. Although the leakage from the transmitter to the receiver is avoided, the duty cycle per channel is halved, and the spectral efficiency is, in effect, cut in half, which is detrimental to VDE-SAT, whose bandwidth is already thin.
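The >70 dB figure can be cross-checked in the same way, assuming the same 75 dBc transmit stop-band attenuation as in the shore-station example above (the text does not state this value here, so it is an assumption of the sketch):

```python
import math

k_dbm_per_hz_k = -198.6                                       # Boltzmann constant, dBm/(Hz*K)
noise_floor = k_dbm_per_hz_k + 26.0 + 10 * math.log10(25e3)   # 26 dBK temp, 25 kHz channel
emission = 18.0 - 75.0                                        # 18 dBm/25 kHz minus assumed 75 dBc
extra_isolation_db = emission - noise_floor                   # comes out a little above 70 dB
```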
A viable solution that circumvents this cross-link self-interference without sacrificing spectral efficiency is the TDD transmission technique. With the time separation provided by TDD, the transceiver of a station never needs to transmit and receive at the same time, yet still maintains (almost) full channel utilization, which maximizes the spectral efficiency and hence the system capacity, and allows reusing or ''time-sharing'' the RF resources of a station, such as the antennas, filters, mixers, frequency sources, and synthesizers. This means a significant reduction in complexity and cost, although in this particular case, two sets of RF front-end filters/amplifiers may be needed for the paired (FDD) bands, which is not natural for TDD.
This concern was initially raised in [24], and was brought to the attention of ITU. The simplex allocation plan in Figure 3 (a) has since been revised. Both lower and upper legs of VDE-SAT, L SAT and U SAT , as well as L TER and U TER when used by VDE-SAT, are now classified as duplex channels as depicted in Figure 3 (b), allowing both uplink and downlink transmissions in these channels for optional TDD transmissions [32].
The particular advantage of TDD is the great simplification of out-of-band interference isolation. Another advantage is the extra degree of freedom for the network to allocate communication resources in proportion to the traffic demand in each direction, by varying the time partition between uplink and downlink transmissions, to better address the ''service-centricity'' requirement of heterogeneous maritime IoT applications and services. Nonetheless, the downside is that TDD complicates interference management, which requires the synchronization of uplink and downlink between stations as well as between VDE-SAT and VDE-TER. Another caveat is that the uplink transmission (from mobile stations) on the upper leg, U_SAT, will interfere with AIS and ASM reception, although this is less of a concern since VDE-SAT is meant for offshore communication in open oceans where maritime traffic is sparse. For nearshore operations, or for mobile stations with co-located AIS/ASM transceivers, the lower leg of VDE-SAT, L_SAT, is preferred for VDE-SAT uplink transmissions. However, the legacy LAIS reception at the space station (if so equipped) may then be compromised during the downlink transmission period on L_SAT.
Overall, despite these imperfections in the spectrum allocation, including the very limited bandwidth, the VHF frequencies are ideal for maritime MTC in terms of transmission range, which also translates into potential power savings for energy- or power-limited maritime devices (battery-powered or energy-harvesting maritime sensors, for instance).

V. CONCLUSION
The maritime IoT concept is based on the harmonization of marine navigation information and on supporting continuous access to maritime services throughout the journey of a maritime mobile station. At the heart of this concept are the maritime MTC technology and the radio spectrum, both of which are long overdue. Nonetheless, the unique maritime environments and maritime IoT service requirements pose serious challenges to the maritime communication community and the ITU. This paper addresses the key challenges from the perspective of system architecture and radio spectrum. Specifically, a maritime MTC system architecture is presented, and the recently internationally-allocated maritime MTC spectrum is analyzed under this architecture. With the proposed spectrum sharing and interference management approaches, the communication components that the underlying architecture encompasses complement each other and jointly provide a solid foundation for a truly fully-fledged MTC system for maritime IoT. This architecture, coupled with the spectrum framework, further signifies a unified air interface design for VDE-SAT and VDE-TER, which has not been seen in most design proposals to date. The insight into the maritime MTC architecture and spectrum provided in this paper will help the design and standardization of VDES. While still in its early stage of development, the successful allocation of the international maritime MTC spectrum, combined with the culmination and synergy of various technological efforts, has brought the maritime IoT concept one step closer to reality.

ACKNOWLEDGMENT
The authors would like to thank Ms. Emily Wang (University of California, Santa Cruz) for her help with the data analysis, especially the drawing of the figures, during her internship with Southeast University.

APPENDIX
PFD MASK DERIVATION
A. FIELD STRENGTH CONSTRAINT (FSC)
The ECC recommendation T/R 25-08 defines a field strength threshold for interference coordination between land mobile systems of neighboring countries in the frequency band from 29.7 to 470 MHz [29]. Five indicative thresholds with a reference bandwidth of 25 kHz are specified, each indicating the average interference field strength, E, impinging on a land station antenna allowed from an interfering land station. It translates into an electromagnetic PFD value via

$$\psi = E^{2}\sqrt{\sigma/\mu} \tag{4}$$

where σ and µ are the electric and magnetic constants, respectively (i.e., $\sqrt{\mu/\sigma} \approx 377\,\Omega$ is the impedance of free space). For the VDES frequency range from 156.7625 to 162.0375 MHz, the corresponding field strength E is 12 dB(µV/m) per 25 kHz. For systems operating below the 15 GHz band, a reference bandwidth of 4 kHz is appropriate when considering the impact of unwanted signals at the input of land/terrestrial station receivers of an ITU hypothetical reference circuit [33]. Following this convention, the corresponding field strength is thus

$$E = 12\ \mathrm{dB(\mu V/m)\ per\ 25\ kHz} = 4\ \mathrm{dB(\mu V/m)\ per\ 4\ kHz} \tag{5}$$

which translates into a PFD value of ψ = −112 dBm/m² per 4 kHz from (4). The corresponding interference into the land station receiver depends on the antenna gain, and can be shown to be

$$I = \psi\,\frac{\lambda^{2}}{4\pi}\,\frac{G_{\mathrm{land}}(\theta)}{\iota} \tag{6}$$

where λ is the wavelength of the interference signal, ι the receiver feeder loss, and $G_{\mathrm{land}}(\theta)$ the gain of the receiving antenna for a given incident angle θ. An average antenna gain pattern (relative to an isotropic antenna) for both land mobile and base stations is provided in [31] and [34], and is plotted in Figure 4. The resultant interference that the victim (i.e., a land station receiver) actually sees depends on the incident angle, θ, of the interfering electromagnetic wave.
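The conversion from the field-strength limit to the −112 dBm/m² figure can be verified numerically. This is an illustrative sketch; the function names are not from any standard:

```python
import math

Z0_OHM = 376.73  # impedance of free space, sqrt(mu0/eps0)

def rescale_bandwidth_db(level_db, bw_from_hz, bw_to_hz):
    """Rescale a per-bandwidth limit to a new reference bandwidth."""
    return level_db - 10 * math.log10(bw_from_hz / bw_to_hz)

def field_strength_to_pfd_dbm_m2(e_dbuv_per_m):
    """PFD = E^2 / Z0, with E given in dB(uV/m), result in dBm/m^2."""
    e_v_per_m = 10 ** (e_dbuv_per_m / 20) * 1e-6
    return 10 * math.log10(e_v_per_m**2 / Z0_OHM * 1e3)

e_4khz = rescale_bandwidth_db(12.0, 25e3, 4e3)   # ~4 dB(uV/m) per 4 kHz
psi = field_strength_to_pfd_dbm_m2(e_4khz)       # ~ -112 dBm/m^2 per 4 kHz
```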
Consequently, the largest interference from a land system into the victim land system receiver permitted by the FSC is given by

$$\hat{I}_{\mathrm{FSC}} = \psi\,\frac{\lambda^{2}}{4\pi}\,\frac{G_{\mathrm{land}}(0)}{\iota} \tag{7}$$

Applying the same criterion to a VDE-SAT station indicates that the PFD from a VDE-SAT space station must be less than

$$\Phi_{\mathrm{FSC}}(\theta) = \psi\,\vartheta_{\mathrm{SAT-land}}\,\bar{G}^{-1}_{\mathrm{land}}(\theta) \tag{8}$$

where $\vartheta_{\mathrm{SAT-land}}$ is the antenna polarization loss between the VDE-SAT space station transmitter (circular) and the land station receiver (vertical), typically 3 dB [35], and $\bar{G}_{\mathrm{land}}(\theta) = G_{\mathrm{land}}(\theta)/G_{\mathrm{land}}(0)$ is the normalized land mobile antenna gain. However, differently from the land system case (in which θ ≈ 0° since land service transceivers typically point to the horizon), the incident angle can be anywhere from 0 to 90 degrees, i.e., θ ∈ [0°, 90°]. Equation (8) indicates that, for VDE-SAT to comply with the FSC, the PFD that a VDE-SAT space station impinges upon the land systems must be less than $\Phi_{\mathrm{FSC}}(\theta)$.
Substituting $G_{\mathrm{land}}(\theta)$ in (8) with the respective antenna gains of the land mobile station and base station in Figure 4, i.e., $G_{\mathrm{mobile}}(\theta)$ and $G_{\mathrm{base}}(\theta)$, yields the cases for land mobile stations, $\Phi^{\mathrm{mobile}}_{\mathrm{FSC}}(\theta)$, and base stations, $\Phi^{\mathrm{base}}_{\mathrm{FSC}}(\theta)$, respectively. The maximum PFD allowed to irradiate the earth's surface by a VDE-SAT satellite inferred from the FSC of the ECC is thus

$$\hat{\Phi}_{\mathrm{FSC}}(\theta) = \min\left\{\Phi^{\mathrm{mobile}}_{\mathrm{FSC}}(\theta),\ \Phi^{\mathrm{base}}_{\mathrm{FSC}}(\theta)\right\} \tag{9}$$

B. CARRIER-TO-INTERFERENCE CONSTRAINT (C/I)
The criterion used by the ITU for protection between land mobile systems is from the perspective of the ultimate system performance requirements. For digital land mobile systems in the frequency band between 138 and 174 MHz, a bit error rate of 2–5% with 4-level FM (C4FM) modulation is targeted, as specified by the ITU [31]. The corresponding required (or minimum) per-bit energy-to-interference-plus-noise power spectral density ratio, $(E_b/N_0)_{\min}$ (10), can be found in [30], where $N_0$ (in W/Hz) is the interference-plus-noise power spectral density. Since C4FM modulation carries two bits per symbol, it translates into a minimum carrier-to-interference-plus-noise ratio of

$$\varsigma_d = \left(\frac{E_b}{N_0}\right)_{\min} R_b = 2\left(\frac{E_b}{N_0}\right)_{\min} R_s \tag{11}$$

where $R_b$ is the bit rate per Hz (bits/s/Hz) and $R_s$ the per-Hz modulation symbol rate. Apparently, (11) is met as long as

$$\frac{C}{N_0 + I} \ge \varsigma_d \tag{12}$$

where $C$ is the received per-channel carrier power, $N_0$ the noise power of the receiver system, and $N_0 + I$ the per-channel noise-plus-interference power. The performance of an analog system is commonly measured by the ratio of the total received power to the unwanted power (SINAD). The ITU recommends a typical SINAD value of 12 dB for establishing degradation protection for analog land mobile systems [31], i.e.,

$$\frac{C + N_0 + I + D}{N_0 + I + D} \ge \varsigma_a \tag{13}$$

or, equivalently,

$$\frac{C}{N_0 + I} \ge \frac{\varsigma_a - 1}{1 - (\varsigma_a - 1)/(C/D)} \tag{14}$$

where $C/D$ is the signal-to-distortion ratio, typically in the vicinity of 20 dB [36], and $\varsigma_a = 12$ dB. According to the representative parameters of technical and operational characteristics of conventional and trunked land mobile systems operating in the frequency band 138–174 MHz listed in Table 1, the typical antenna heights of a base station and a mobile station are 65 meters and 2 meters, respectively, which determine the maximum line-of-sight distance (limited by the visual horizon) between the base station and the mobile station, i.e., 29 km + 5 km = 34 km. This leads to a maximum free-space loss of $\hat{\ell} \approx 107$ dB in the frequency band of 156.7625–162.0375 MHz (cf. Figure 3).
Plus an additional path loss, estimated as $\delta\ell \approx 34$ dB (land path) [37], this gives rise to a maximum total path loss of $L_{\mathrm{land}} = \hat{\ell} + \delta\ell \approx 141$ dB.
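The horizon and path-loss numbers above follow from the standard visual-horizon approximation d ≈ 3.57√h (h in metres, d in km). A quick check (the 159.4 MHz mid-band frequency is an assumption of this sketch):

```python
import math

def visual_horizon_km(antenna_height_m):
    """Distance to the visual horizon for an antenna at the given height."""
    return 3.57 * math.sqrt(antenna_height_m)

def free_space_loss_db(d_km, f_mhz):
    """Free-space path loss in dB."""
    return 32.45 + 20 * math.log10(f_mhz) + 20 * math.log10(d_km)

d_max_km = visual_horizon_km(65) + visual_horizon_km(2)  # ~29 km + ~5 km = ~34 km
fspl_db = free_space_loss_db(d_max_km, 159.4)            # ~107 dB
total_loss_db = fspl_db + 34.0                           # plus land-path loss -> ~141 dB
```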
Further taking into account the receiver feeder loss ι, the minimum carrier power received by the land station receiver, i.e., the sensitivity, is

$$\hat{C} = \frac{\hat{E}_{\mathrm{land}}(\theta_{TX})\,G_{\mathrm{land}}(\theta_{RX})}{L_{\mathrm{land}}\,\iota} \tag{15}$$

where $\hat{E}_{\mathrm{land}}(\theta_{TX})$ is the minimum EIRP of the land transmitter and $G_{\mathrm{land}}(\theta_{RX})$ the land receiver antenna gain, both at an elevation angle of $\theta_{TX} \approx \theta_{RX} \approx 0°$, recalling that land transceivers point at each other at a near-0° elevation angle (the horizon). Note that the EIRP is the product of the transmit power and transmit antenna gain (including the feeder loss). Given the 15 kHz channel bandwidth and the noise figure ξ of 7 dB of the legacy land stations (cf. Table 1), the system noise level is $N_0 = \kappa T \xi B = -125$ dBm per 15 kHz, where κ is Boltzmann's constant (≈ −198.6 dBm/(K·Hz)), T = 290 K is the receiver noise temperature, and B = 15 kHz. For the digital land systems, with the minimum carrier-to-interference ratio $\varsigma_d$ from (11), the maximum allowed interference level follows as

$$\hat{I}^{d}_{\mathrm{C/I}} = \frac{\hat{C}}{\varsigma_d} - N_0 \tag{16}$$

which is −104 dBm per 15 kHz and −118 dBm per 15 kHz for land mobile and base stations, respectively. Similarly, for the analog land systems, we define

$$\varsigma'_a = \frac{\varsigma_a - 1}{1 - (\varsigma_a - 1)/(C/D)} \tag{17}$$

and the maximum allowed interference per channel is therefore

$$\hat{I}^{a}_{\mathrm{C/I}} = \frac{\hat{C}}{\varsigma'_a} - N_0 \tag{18}$$

which is −109 dBm per 15 kHz and −117 dBm per 15 kHz for land mobile and base stations, respectively. Combining the digital and analog scenarios, we have the maximum permitted interference caused by an interfering land station, $\hat{I}_{\mathrm{C/I}} = \min\{\hat{I}^{d}_{\mathrm{C/I}},\ \hat{I}^{a}_{\mathrm{C/I}}\}$, i.e., −109 dBm per 15 kHz for a victim mobile station and −118 dBm per 15 kHz for a base station. Taking into consideration the polarization loss between the space station antenna and the land station antenna, for the case where the interferer is a satellite space station, the PFD constraint inferred from (11) and (14) is therefore

$$\Phi_{\mathrm{C/I}}(\theta) = \frac{4\pi}{\lambda^{2}}\,\iota\,\vartheta_{\mathrm{SAT-land}}\,\hat{I}_{\mathrm{C/I}}\,G^{-1}_{\mathrm{land}}(\theta) \tag{19}$$
The PFD constraint governed by (11) and (14) thus constitutes the C/I mask, $\hat{\Phi}_{\mathrm{C/I}}(\theta)$; the minimum of it and the FSC mask gives the ultimate mask in (1).

TINGTING XIA is currently pursuing the Ph.D. degree with the Nanjing University of Science and Technology, Nanjing, China. She is also with the Wireless Networking and Mobile Communications Group, School of Electronic and Optical Engineering. Her current research interests include wireless communications and signal processing.
LEI WANG is currently pursuing the Ph.D. degree with the Wireless Networking and Mobile Communications Group, School of Electronic and Optical Engineering, Nanjing University of Science and Technology. His current research interests include wireless communications and signal processing.