The Autonomous Platforms Inertial Dataset

One of the critical tasks required for fully autonomous functionality is the ability to achieve an accurate navigation solution; that is, to determine the platform position, velocity, and orientation. Various sensors, depending on the vehicle environment (air, sea, or land), are employed to achieve this goal. In parallel to the development of novel navigation and sensor fusion algorithms, machine-learning based algorithms are penetrating these fields. An excellent example of this trend is pedestrian dead reckoning, used for indoor navigation, where both classical and machine learning approaches are used to improve navigation accuracy. To facilitate the derivation and validation of machine learning algorithms for autonomous platforms, a large quantity of recorded sensor data is needed. Unfortunately, in many situations, such datasets are not easy to collect or are not publicly available. To advance the development of accurate autonomous navigation, this paper presents the autonomous platforms inertial dataset. It contains inertial sensor raw data and corresponding ground truth trajectories. The dataset was collected using a variety of platforms including a quadrotor, two autonomous underwater vehicles, a land vehicle, a remote-controlled electric car, and a boat. A total of 805.5 minutes of recordings were made using different types of inertial sensors, global navigation satellite system receivers, and Doppler velocity logs. After describing the sensors that were employed for the recordings, a detailed description of the conducted experiments is provided. The autonomous platforms inertial dataset is available at: https://github.com/ansfl/Navigation-Data-Project/.


I. INTRODUCTION
A fully autonomous platform requires the ability to achieve accurate navigation; that is, to determine position, velocity, and orientation. To that end, depending on the vehicle operating environment (air, sea, or land), various sensors are employed. Most unmanned aerial vehicles (UAVs), also known as drones, apply fusion between an inertial navigation system (INS) and other external sensors such as global navigation satellite systems (GNSS) [1], [3] or vision [4], [5]. Similarly, GNSS/INS fusion is also used in unmanned ground vehicles (UGVs), as shown in [9], [10]. Autonomous underwater vehicles (AUVs) mostly employ an INS and a Doppler velocity log (DVL) [6], [7]. In autonomous surface vehicles (ASVs), localization is performed using extended Kalman filter (EKF) based simultaneous localization and mapping (SLAM) by fusing inertial sensors with acoustic information without prior knowledge of source locations, as shown in [8], [11]. In parallel to the development of novel navigation and sensor fusion algorithms, machine-learning (ML) based algorithms are penetrating into the navigation and sensor fusion fields. An excellent example of this trend is pedestrian dead reckoning (PDR), used in indoor navigation, where both classical and machine learning approaches are used to improve positioning accuracy. [12] employed machine learning classification algorithms to recognize a smartphone's position (talking, texting, swing, or pocket), thereby enabling the choice of a proper gain value to improve PDR positioning accuracy. [19] segmented inertial data into independent windows, formulated learning as an optimization problem, and proposed deep recurrent neural networks (IONet) that achieved highly accurate trajectories, outperforming state-of-the-art classical PDR techniques on a wide range of tests and attachments. [20] and [21] applied deep learning (DL) and data-driven techniques to improve pedestrian inertial navigation.
Deep learning approaches were also applied to estimate user step length instead of using traditional PDR approaches, as in [13], [14], [23]. [15] compared the Weinberg gain (WG) step-length estimation approach with deep learning approaches. Another example is described by [16], who applied a deep-learning pedestrian dead reckoning framework consisting of a smartphone location recognition classification network followed by a change-of-heading and distance regression network. Research on ML and DL algorithms, such as [17], requires large datasets. In particular, to facilitate the derivation and validation of machine learning algorithms for autonomous platforms, a large quantity of real recorded data is needed. To cope with this increasing demand, alongside the advances in pure inertial navigation, several datasets, including pedestrian odometry datasets, were recorded and made publicly available. The datasets were recorded using various platforms and recording devices. For example, [18] provides recordings of different types of activities made by a set of inertial sensors mounted on different body parts, together with ground truth pose, user activity, and device mode. The RIDI dataset [21] recorded inertial sensor measurements from a mobile device along with 3D motion trajectories. The recordings captured people walking and standing while holding the mobile device or carrying it in their pockets. The ADVIO dataset by [22] consists of a full rig of devices recording both raw inertial data and images, with positions generated by running several SLAM implementations. The RONIN dataset by [20] contains 42.7 hours of inertial sensor recordings and ground truth recorded by 100 human subjects. For the recording, a 3D tracking phone was harnessed to the body so the subjects could handle their phones freely and naturally. A recent dataset, OxIOD, created by [23], presented 42km of human motion captured with high-precision accuracy along with inertial recordings from several mobile phones.
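The Weinberg step-length approach mentioned above can be sketched in a few lines. The gain value and the synthetic one-step window below are illustrative, not taken from [15]; in practice the gain is tuned per user.

```python
import numpy as np

def weinberg_step_length(acc_norm, k=0.48):
    """Step length from one step's accelerometer-magnitude window
    (Weinberg approach). The gain k is illustrative and must be tuned
    per user; acc_norm holds specific-force magnitudes in m/s^2."""
    return k * (acc_norm.max() - acc_norm.min()) ** 0.25

# Synthetic one-step window: magnitude oscillating around gravity
t = np.linspace(0.0, 0.5, 50)
acc = 9.81 + 2.0 * np.sin(2.0 * np.pi * 2.0 * t)
step = weinberg_step_length(acc)     # ~0.68 m for this window
```

The fourth-root of the acceleration range makes the estimate robust to amplitude variations between steps, which is why this simple gain model remains a common classical baseline.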
The KITTI dataset [24] pioneered the field of autonomous driving for land vehicles and made a strong impact. The data was collected on a vehicle equipped with video cameras, a 3D laser scanner, a Global Positioning System (GPS) receiver, and an INS. Later, the RobotCar dataset [25] was published, with 1000km of GPS and INS recordings of an autonomous electric vehicle in Oxford. ApolloScape [27] is a much larger and richer dataset that features a sensor fusion scheme integrating camera videos with consumer-grade motion sensors (GPS/IMU). Recently, the A2D2 autonomous driving dataset [29] provided IMU and GPS recordings from a land vehicle. One of the latest inertial odometry benchmark datasets is the IO-VNBD dataset [30], generated to create a common publicly available baseline for vehicle positioning. It was recorded on a research vehicle equipped with inertial sensors, a GPS receiver, wheel odometry, and a smartphone with its sensors. The NCLT dataset [26] consists of 147km of repeating Segway trajectories recorded at the University of Michigan, with GPS, IMU, and laser scan data. The EuRoC MAV dataset [28] provided inertial data and ground truth trajectories and orientations recorded using an aerial vehicle. For marine applications, [31] provided visual data from a stereo camera, IMU recordings, and range data acquired underwater by a mechanical scanning sonar sensor. Table 1 summarizes these datasets, giving the recording sensors, vehicle types, and duration of each. To advance the development of accurate autonomous navigation, this paper shares raw inertial sensor data together with the corresponding ground truth trajectories. Our goal is to provide an inertial dataset for different types of autonomous platforms, recorded using the same measurement equipment where possible.
To that end, the dataset was collected using a variety of platforms including a quadrotor, two autonomous underwater vehicles, a land vehicle, a remote-controlled electric car, and a boat. This paper provides two different AUV datasets to facilitate underwater navigation research. In addition, a unique stationary dataset is also provided, consisting of two different IMUs. This dataset can be used to derive machine learning approaches for stationary coarse alignment, as in [32], and other applications. The recordings were made using different types of inertial sensors, GNSS receivers, and Doppler velocity logs (DVLs): a Teledyne Navigator DVL, a Teledyne Explorer piston DVL, an Inertial Labs MRU, a Vectornav VN-100, an Analog Devices IMU, a Pixhawk Cube flight controller, and three types of smartphones. The autonomous platform inertial dataset is available at: https://github.com/ansfl/Navigation-Data-Project/. In comparison with other published datasets (Table 1), the contributions of our dataset are: 1) A special mechanical setup, allowing a smartphone to be aligned with our MRU device, was constructed for the purpose of data collection. In that manner, the MRU provides the ground truth (GT) trajectory and its IMU readings, while the smartphone provides its low-performance IMU measurements. Thus, two different IMU grades are available in the dataset. This setup was used in the stationary, land vehicle, and boat recordings. 2) Two different AUV datasets including GT trajectories and IMU measurements in various dynamics and trajectories. 3) A quadrotor dataset including GT trajectories and two IMU readings. This dataset also includes unique periodic motion trajectories.
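For illustration, a raw IMU log from the repository can be parsed along the following lines. The column names and file layout here are hypothetical placeholders, so they should be checked against the actual files before use.

```python
import io
import numpy as np

# Hypothetical layout: the repository's actual file names and column order
# may differ, so check them against the files before use.
sample = io.StringIO(
    "time,acc_x,acc_y,acc_z,gyro_x,gyro_y,gyro_z\n"
    "0.00,0.01,-0.02,9.80,0.001,0.000,-0.002\n"
    "0.01,0.02,-0.01,9.81,0.000,0.001,-0.001\n"
)
data = np.genfromtxt(sample, delimiter=",", names=True)
acc = np.vstack([data["acc_x"], data["acc_y"], data["acc_z"]]).T  # N x 3
dt = np.diff(data["time"]).mean()    # nominal sampling interval [s]
```

Replacing the in-memory sample with an actual file path yields the same arrays for any CSV-style log with a header row.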
The rest of the paper is organized as follows: Section II describes the sensors that were employed for the recordings, Section III provides a detailed description of the experiments, and Section IV gives the conclusions.

II. MEASUREMENT EQUIPMENT
Different types of inertial sensors, GNSS receivers, and two DVLs were used to collect the data. A Pixhawk Cube flight controller was used for the quadrotor platform recordings. An MRU recording setup was used for the static, marine vessel, and car recordings. Two types of DVLs were used for the two AUV platforms: the Snapir A18D AUV and the Alice AUV. Several smartphones (with different IMUs) were part of the recording equipment for the stationary, marine vessel, and remote-controlled car recordings.

A. DOPPLER VELOCITY LOGS
1) Teledyne Navigator DVL:
The Snapir AUV (Section III-E) is equipped with a Teledyne RDI Workhorse Navigator DVL [34], as shown in Figure 1. It operates at frequencies of 300kHz, 600kHz, and up to 1200kHz, which allows bottom tracking at depths from 0.5m down to 200m. The DVL measures velocity in three degrees of freedom. The velocity range is ±10m/s with a measurement resolution of 0.001m/s and an accuracy of 0.008m/s. The sampling rate of the DVL is 1Hz.

2) Teledyne Explorer DVL:
The Alice AUV (Section III-F) employs the piston version of the Teledyne Explorer DVL with a standard acoustic frequency of 614.4kHz. The DVL operates in bottom-tracking mode at depths of 0.5 to 66m with an accuracy of 0.01m/s, and in water-profiling mode at 1.33 to 25m with an accuracy of 2.3 m/s. The measurement resolution is 0.1 and the sampling rate is 4Hz.
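For context, a DVL converts per-beam radial velocities into a body-frame velocity vector through the beam geometry. The sketch below uses a generic four-beam Janus arrangement; the exact Teledyne beam angles may differ.

```python
import numpy as np

# Generic four-beam Janus arrangement: beams tilted 20 deg off the vertical
# at azimuths 0/90/180/270 deg. The exact Teledyne beam angles may differ.
alpha = np.deg2rad(20)
az = np.deg2rad([0.0, 90.0, 180.0, 270.0])
D = np.column_stack([np.sin(alpha) * np.cos(az),
                     np.sin(alpha) * np.sin(az),
                     np.cos(alpha) * np.ones(4)])   # beam unit vectors

v_true = np.array([1.0, -0.5, 0.1])   # body-frame velocity [m/s]
beams = D @ v_true                    # per-beam radial velocities
v_est, *_ = np.linalg.lstsq(D, beams, rcond=None)  # least-squares inverse
```

With four beams the geometry matrix is overdetermined, so the least-squares solution also provides a residual that can flag inconsistent beams.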

B. INERTIAL NAVIGATION SYSTEMS
1) Inertial Labs MRU:
The motion reference unit (MRU) by Inertial Labs is a high-performance strapdown motion sensor [33]. It also includes a GNSS RTK receiver, which is used to obtain ground truth with cm-level accuracy. The sampling rate of the MRU is 100Hz, the accelerometer in-run bias stability is 0.005mg, and the gyroscope in-run bias stability is 1deg/hr.
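To put the quoted stability figures in SI units, and to give a feel for the position drift they imply, a quick back-of-the-envelope conversion (the 60 s coasting time is an arbitrary example):

```python
import numpy as np

# Converting the quoted in-run bias stabilities to SI units, and the rough
# horizontal position drift the accelerometer bias alone implies over a
# minute of pure inertial coasting (second-order approximation).
G = 9.80665                          # standard gravity [m/s^2]
acc_bias = 0.005e-3 * G              # 0.005 mg  -> ~4.9e-5 m/s^2
gyro_bias = np.deg2rad(1.0) / 3600   # 1 deg/hr  -> ~4.85e-6 rad/s
drift = 0.5 * acc_bias * 60.0 ** 2   # ~0.09 m after 60 s
```

The sub-decimeter drift per minute from the accelerometer bias alone illustrates why the MRU can serve as a ground-truth-grade reference for the lower-grade smartphone IMUs.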

2) Vectornav VN-100:
The VN-100 is a miniature, high-performance IMU and attitude and heading reference system (AHRS). Combining three-axis accelerometers, gyroscopes, and magnetometers, a barometric pressure sensor, and a 32-bit processor, the VN-100 provides high-rate calibrated IMU data and a real-time 3D attitude solution that is continuous over the complete 360 degrees of motion [35]. The accelerometer in-run bias stability is smaller than 0.04mg and the gyroscope in-run bias stability is 5-7deg/hr.

3) Analog Devices ADIS16488A:
The Alice AUV employs an ADIS16488A IMU. The IMU features a triaxial digital gyroscope with a dynamic range of 450°/s, a triaxial accelerometer with a range of ±18g, and a triaxial magnetometer with a range of ±2.5 gauss. The sampling rate of the IMU is 123Hz.

4) Pixhawk Cube:
The Pixhawk Cube [ProfiCNC, 2021] is a flight controller with a triple-redundant IMU system that is isolated, dampened, and temperature-controlled, thereby reducing measurement noise and allowing flights at extreme temperatures. It is also RTK GNSS ready. Hence, it is very efficient when data collection is required, since each flight provides three IMU measurement streams at a sample rate of 1000Hz, and ground truth can be extracted from the RTK GNSS system. The Pixhawk Cube could be used as the main flight controller of the quadrotor, but was used herein only as a recording device for its IMU readings and its RTK GPS.
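One simple way to exploit the triple-redundant IMUs is a per-sample median vote across the three units, which suppresses a single faulty sensor. The sketch below uses synthetic data; real Pixhawk logs store each IMU as a separate message stream.

```python
import numpy as np

# Per-sample median across the three IMUs rejects a single outlying sensor.
# Data is synthetic; real Pixhawk logs store each IMU as a separate stream.
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0.0, 2.0 * np.pi, 1000))
imus = truth + 0.01 * rng.standard_normal((3, 1000))
imus[1, 500] += 5.0                    # inject a fault on the second IMU
fused = np.median(imus, axis=0)        # fault at sample 500 is suppressed
```

The median tolerates one arbitrary fault among three units; detecting which unit failed would additionally require comparing each stream against the fused output.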

C. SMARTPHONES
1) Huawei P40 smartphone:
For the recordings in stationary conditions and in the land vehicle, the Huawei P40 smartphone [36] was used. Three-axis accelerometer and gyroscope data were collected at a rate of 200Hz.
2) Samsung Galaxy S7 Edge smartphone:
The Samsung Galaxy S7 Edge [37] was employed for the marine vessel motion recording (Section III-C). The recorded data consist of three-axis accelerometer and gyroscope data at a rate of 100Hz.

3) Samsung Galaxy S8 Edge smartphone:
The Samsung Galaxy S8 Edge [38] was used for the remote-control car recordings. The recordings include the raw IMU data (without the smartphone OS post-processing), that is, the accelerometer and gyroscope outputs, at a rate of 100Hz.

III. EXPERIMENTS

A. STATIONARY CONDITIONS
A mechanical setup was designed and manufactured to mount the MRU device (Section II-B1) together with a Huawei P40 smartphone (see Section II-C1), as shown in Figure 3. The smartphone stayed aligned with the MRU reference frame during the recordings. Using this setup, stationary recordings of varying lengths were taken. Each recording consists of the MRU IMU data and the corresponding P40 IMU data, so that the MRU serves as ground truth due to the high accuracy of its sensors. The total duration of the stationary recordings is 60 minutes. Recordings were made at different attitudes.
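As noted, such stationary segments support coarse alignment: averaging the accelerometers while static yields roll and pitch. A minimal sketch, assuming an x-forward, y-right, z-down body frame in which a level sensor measures a specific force of roughly [0, 0, -g]:

```python
import numpy as np

def coarse_level(f_mean):
    """Roll and pitch from an averaged stationary accelerometer reading.
    Assumes an x-forward, y-right, z-down body frame in which a level
    sensor measures a specific force of roughly [0, 0, -g]."""
    fx, fy, fz = f_mean
    roll = np.arctan2(-fy, -fz)
    pitch = np.arctan2(fx, np.hypot(fy, fz))
    return roll, pitch

g = 9.81
# Level case: both angles ~0
roll0, pitch0 = coarse_level([0.0, 0.0, -g])
# 10-degree nose-up pitch
roll1, pitch1 = coarse_level([g * np.sin(np.deg2rad(10)), 0.0,
                              -g * np.cos(np.deg2rad(10))])
```

Heading is unobservable from the accelerometers alone; classical coarse alignment recovers it from the gyroscopes via the Earth rotation rate, which is where learning approaches such as [32] come in for low-grade sensors.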

B. LAND VEHICLE
Several land vehicle recordings were conducted using the MRU and the Huawei P40 smartphone with the setup described in Section III-A. The unit was placed in a modified Suzuki Jimny (overall length 3645 mm, width 1645 mm, height 1725 mm, and wheelbase 2250 mm) between the front seats (X=700 mm, Y=2000 mm, Z=1000 mm relative to the left corner of the front bumper), as shown in Figure 4. The first recording consists of 25 min of stationary recording (with the engine on) and 35 min of driving from the Israel Oceanographic and Limnological Research (IOLR) facility (32°49'34"N 34°57'24"E, sea-level altitude) to Isfiya on Mount Carmel (32°43'02"N 35°03'54"E, altitude 472m), as shown in Figure 5. Several points of interest were documented during the drive, as listed in Table 2, which can help to evaluate performance under common driving conditions. Another recording was made from Haifa's
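From the two coordinates above, the straight-line (great-circle) distance between the drive endpoints can be computed; the road distance is of course longer.

```python
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    R = 6371.0                               # mean Earth radius [km]
    p1, p2 = np.deg2rad(lat1), np.deg2rad(lat2)
    dp, dl = p2 - p1, np.deg2rad(lon2 - lon1)
    a = np.sin(dp / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
    return 2 * R * np.arcsin(np.sqrt(a))

# IOLR (32°49'34"N 34°57'24"E) to Isfiya (32°43'02"N 35°03'54"E)
iolr = (32 + 49 / 60 + 34 / 3600, 34 + 57 / 60 + 24 / 3600)
isfiya = (32 + 43 / 60 + 2 / 3600, 35 + 3 / 60 + 54 / 3600)
d = haversine_km(iolr[0], iolr[1], isfiya[0], isfiya[1])  # ~15.8 km
```

Such endpoint distances provide a quick sanity check when comparing an integrated trajectory against the GNSS ground truth.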

C. MARINE VESSEL
Another platform that contributed to our dataset is a marine vessel named "Shikmona", shown in Figure 6. It has an overall length of 38.9 m and a width of 9.2 m. The vessel once belonged to the Israel Navy and was purchased by the IOLR in 2015 for marine research purposes. IMU signals from both the MRU (Section II-B1) and the Samsung Galaxy S7 Edge smartphone (Section II-C2) were recorded. The setup (see Figure 3) was mounted on the vessel (the exact location is 18 m from the front and 5 m from the right, shown in Figure 8 with a yellow dot), so only the motion of the vessel on the water was recorded. Four recordings of 20 min of straight-line motion were conducted, with a 90-degree rotation to the right between each repetition, thereby creating a square pattern. As in the land vehicle and stationary conditions, two different types of IMU are available for evaluation.

D. QUADROTOR
A Pixhawk Cube flight controller (see Section II-B4) was attached to the quadrotor (shown in Figure 10). It has three IMU units that were used for recording during flight. In some of the recordings, the Pixhawk Cube was attached to the quadrotor's landing gear. An RTK GPS receiver was used to obtain the ground truth of the trajectories. In all the experiments, the quadrotor was flown manually by an experienced and licensed pilot. The quadrotor was flown with various dynamics, each for a specific purpose. To begin with, the quadrotor was flown in a straight line back and forth. Next, the quadrotor was flown with a periodic motion that resembles a sine wave: both vertically, where the altitude changed relative to the nominal altitude, and horizontally, where the altitude was fixed and the sine wave lay in the horizontal plane. An example of such a trajectory can be seen in Figure 11.
Finally, the quadrotor's altitude was varied while flying a figure-eight trajectory. These trajectories have already supported two studies: 1) Based on the periodic motion trajectories included in the dataset, we proposed a framework for quadrotor navigation based only on inertial sensors, called quadrotor dead reckoning (QDR) [2], as an alternative to the classical INS solution. Motivated by the PDR approach, the quadrotor emulates the motion of a walking user instead of moving in straight lines in situations of pure inertial navigation. In this manner, similar to step-length estimation in PDR, we estimated the peak-to-peak change in distance of the quadrotor. 2) The figure-eight trajectory was used to show the benefits of a criterion that acts as a guideline for a reasonable choice of step size in Kalman filtering (KF), as in [42]. In KF, a trade-off exists when selecting the filter step size: generally, a smaller step size improves the estimation accuracy, yet incurs a higher computational load. The proposed criterion mitigates the influence of this trade-off on performance.
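The peak-to-peak idea behind QDR can be illustrated on a synthetic periodic signal: count the oscillation peaks, then multiply by a per-peak distance. The per-peak distance here is an arbitrary placeholder; in [2] this regression is learned from data.

```python
import numpy as np

# Counting oscillation peaks in a periodic flight segment, the first step of
# a QDR-style distance estimate. The per-peak distance here is an arbitrary
# placeholder; in [2] this regression is learned from data.
fs = 100.0
t = np.arange(1000) / fs                     # 10 s at 100 Hz
acc_z = np.sin(2 * np.pi * 0.5 * t)          # 0.5 Hz periodic motion

interior = acc_z[1:-1]
is_peak = (interior > acc_z[:-2]) & (interior > acc_z[2:])
n_peaks = int(is_peak.sum())                 # 5 peaks in 10 s at 0.5 Hz
dist_per_peak = 2.0                          # hypothetical learned value [m]
distance = n_peaks * dist_per_peak
```

Real accelerometer data would need smoothing before this simple local-maximum test; the sketch only conveys the analogy to step counting in PDR.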

E. SNAPIR A18D AUV
The Snapir is a modified ECA Robotics A18D AUV, shown in Figure 13. It was delivered to the Hatter Department of Marine Technologies, University of Haifa, in 2017. The dataset was collected during an expedition in the Mediterranean Sea. The VN-100 operates at a measurement rate of 40Hz and is placed inside a case, as shown in Figure 2, to protect it from water down to 100m; the case is placed inside the AUV. The marine vessel that deployed the AUV into the water was the "Shikmona" (shown in Figure 6). The Teledyne Navigator DVL (Section II-A1) is mounted at the center of the AUV, and the Vectornav IMU (Section II-B2) is mounted at a distance of 3.55m to the front, 0.025m to the left, and 0.3m higher relative to the DVL, as presented in Figure 14. The AUV performed maneuvers with various dynamics: straight lines, spirals with small ovals, and rounded-edge triangles. These shapes resulted from the complexity of the experiment, which involved a long baseline (LBL) acoustic positioning system that produced a snail-shell shape in most routes, as shown in Figure 15. The number of DVL samples collected is 24,085, totalling 401 minutes of raw DVL data. The Vectornav VN-100, placed at the front of the AUV and operating at 40Hz, recorded 1,480,000 samples, equalling 616 minutes of inertial data. The IMU recording is longer than the DVL recording because, due to operational constraints, the IMU recording began earlier.
This dataset was used in [45] to evaluate a novel approach for compensating for partial DVL outages by applying deep-learning methods in situations where only three beams are received by the DVL. The deep neural network uses past DVL measurements to predict the missing beam and improve the velocity vector accuracy.
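For intuition on the three-beam case: with three beams the geometry matrix is square and still full rank, so a velocity can be solved exactly but without redundancy; the network in [45] instead regenerates the missing beam so that the full four-beam solution can be used. The beam angles below are illustrative, not the actual Teledyne geometry.

```python
import numpy as np

# With only three of the four beams, the beam geometry matrix is square and
# still full rank, so a body velocity can be solved exactly, though without
# the redundancy of the fourth beam. Beam angles are illustrative, not the
# actual Teledyne geometry.
alpha = np.deg2rad(20)                        # tilt from the vertical
az = np.deg2rad([0.0, 90.0, 180.0])           # beam at 270 deg assumed lost
D3 = np.column_stack([np.sin(alpha) * np.cos(az),
                      np.sin(alpha) * np.sin(az),
                      np.cos(alpha) * np.ones(3)])

v_true = np.array([1.0, -0.5, 0.1])           # body velocity [m/s]
beams3 = D3 @ v_true                          # three radial measurements
v3_est = np.linalg.solve(D3, beams3)          # recovers v_true
```

The exact three-beam solution is noise-sensitive precisely because the redundancy is gone, which motivates reconstructing the fourth beam rather than dropping it.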

F. ALICE AUV
The Alice is a modified SPARUS II AUV [43] with hovering capabilities, illustrated in Figure 16. The AUV was developed by [44]. Alice underwent sea trials off the coasts of Sdot Yam and Haifa Bay, and two datasets were obtained. The vehicle motion during the experiments was measured by the onboard sensors: velocities were measured by the DVL at a sampling rate of 4Hz, and the depth of the AUV was measured by a pressure sensor. Both the IMU and the pressure sensor sampling rates were 123Hz. The first dataset recorded 1200 s of a way-point navigation mission along straight transects at various surge speeds, depths, and headings, as detailed in Table 3. The measured path is illustrated in Figure 18. The second dataset recorded 3000 s of a way-point navigation mission along straight transects in a lawnmower pattern; this mission pattern is mainly employed for seabed survey missions. The AUV was programmed to maintain a constant depth along the pattern. This dataset was used in [46] to validate a navigation algorithm.
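A minimal planar dead-reckoning sketch of the kind such transect data supports: rotate the DVL body velocities by the heading and integrate. The values are synthetic, not the actual Alice recordings.

```python
import numpy as np

# Planar dead reckoning: rotate DVL body velocities by heading and integrate.
# Values are synthetic, not the actual Alice recordings.
dt = 0.25                                    # 4 Hz DVL
v_body = np.tile([1.0, 0.0], (40, 1))        # 1 m/s surge, no sway, 10 s
psi = np.deg2rad(90.0)                       # constant heading due east (NED)
c, s = np.cos(psi), np.sin(psi)
R = np.array([[c, -s], [s, c]])              # body -> NED rotation
pos = np.cumsum((v_body @ R.T) * dt, axis=0) # ends ~10 m east, ~0 m north
```

In a full solution the heading comes from the AHRS and the depth channel from the pressure sensor, so only the horizontal plane needs to be dead-reckoned this way.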

G. REMOTE-CONTROLLED CAR
A remote-controlled car, the Storm Electric 4WD Climbing car, with Samsung's Galaxy S8 smartphone (Section II-C3) mounted on it (shown in Figure 19), was used to create the dataset. The car dimensions are 385x260x205 mm, with a wheelbase of 253 mm and a tire diameter of 110 mm. The car was driven in three indoor courses: a straight line with a length of 6.3 m, a square with side lengths of 2.25 m, and a circle with a diameter of 1.98 m. The courses were marked on the lab floor, as shown in Figure 20, and were followed so that the wheels were on either side of the line. For each course, 15 recordings were taken. The driving speed was a trade-off between driving fast enough to overcome the sensor noise and being able to follow the courses accurately. In the circular and square courses, the end positions were at the same spot as the start positions, with a deviation of a few centimeters due to steering difficulties. Each recording has about three seconds of stationary data at the beginning and at the end of the trajectory. All the recordings were taken in an air-conditioned room at a temperature of 25°C. The smartphone was attached to the top of the car with its screen facing the ceiling using Velcro, such that its x axis was aligned with the car's heading; it was tilted up to 2 degrees in the x and y axes when the wheels pointed forward. The beginning of the motion is clearly noticeable by a steep rise in the accelerometer's x axis, and the end of the motion is distinguished by a steep acceleration in the negative direction due to the sudden stop.
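The start/stop cue described above can be detected with a simple threshold on the bias-compensated x-axis accelerometer. The signal and threshold below are synthetic.

```python
import numpy as np

# Sketch of detecting the motion start/stop from the accelerometer x axis,
# as described above: the first and last samples whose deviation from the
# stationary mean exceeds a threshold. Signal and threshold are synthetic.
fs = 100.0
t = np.arange(900) / fs                      # 9 s at 100 Hz
acc_x = 0.01 * np.sin(50 * t)                # small sensor noise
acc_x[(t >= 3.0) & (t < 3.2)] += 2.0         # launch burst
acc_x[(t >= 5.8) & (t < 6.0)] -= 2.5         # braking spike

bias = acc_x[: int(2 * fs)].mean()           # estimated from a static segment
active = np.abs(acc_x - bias) > 0.5          # threshold [m/s^2]
start = t[np.argmax(active)]                 # ~3.00 s
stop = t[len(active) - 1 - np.argmax(active[::-1])]  # ~5.99 s
```

The initial three-second stationary segments in each recording are exactly what makes the bias estimate in this scheme reliable.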

IV. CONCLUSIONS
In this paper, our goal is to advance the development of accurate autonomous navigation by publicly sharing inertial sensor raw data and corresponding ground truth trajectories, and making it accessible and easy to use. The dataset was collected using a variety of platforms and measurement equipment in a variety of environments (air, sea, and land) while applying different trajectories and different motion dynamics. We believe that this dataset will be highly useful for research purposes in the areas of robotics and autonomous navigation. We plan to keep the dataset constantly updated by continuously collecting data with the same and additional platforms for a variety of purposes. A summary of the dataset is presented in Table 4. The data is available and can be downloaded at: https://github.com/ansfl/Navigation-Data-Project/.
In addition, we aim to continuously increase the amount of data through further planned experiments.