New Generation Hyperspectral Data From DESIS Compared to High Spatial Resolution PlanetScope Data for Crop Type Classification

Thoroughly investigating the characteristics of new generation hyperspectral and high spatial resolution spaceborne sensors will advance the study of agricultural crops. Therefore, we compared the performances of the hyperspectral Deutsches Zentrum für Luft- und Raumfahrt (DLR) Earth Sensing Imaging Spectrometer (DESIS) and high spatial resolution PlanetScope in classifying eight crop types in California's Central Valley during the 2020 growing season. The DESIS sensor onboard the International Space Station collects data at 235 hyperspectral narrowbands (HNB), each with 2.55 nm bandwidth from 400–1000 nm, and 30 m spatial resolution. In contrast, PlanetScope Dove-R data have four multispectral broadbands (MBB) with 3–4 m spatial resolution. We obtained best classification accuracies using 14 DESIS HNB from the August 2020 image, with an overall accuracy of 85% and producer's and user's accuracies of 72–100% and 75–100%, respectively, for the eight crops. The best classification accuracies using PlanetScope data were obtained using an image mosaic pair from June and August 2020; this resulted in an overall accuracy of 79% and producer's and user's accuracies of 56–100% and 61–100%, respectively. Combining the best 14 DESIS HNB from August 2020 with the 4 PlanetScope MBB from August 2020 yielded an overall accuracy of 82% and producer's and user's accuracies of 65–100% and 60–94%, respectively. On one-to-one single date comparisons of DESIS versus PlanetScope data, the hyperspectral data always outperformed high spatial resolution data in crop type classification. Nevertheless, high spatial resolution data will remain invaluable in assessing within-field variability and crop biophysical/biochemical modeling in precision agriculture.


I. INTRODUCTION
Monitoring agricultural areas helps address issues of global food and water security by allowing us to estimate crop productivity and crop water productivity [1]. Remote sensing has been used extensively for studying agriculture and other vegetation [2], [3], [4], [5]. However, crop type classification can be challenging in situations where fields are small and/or have multiple crops [6]. Additional challenges include large within-crop variability across management regimes and agroecological zones and small across-crop variability due to similarities in plant characteristics [6]. In such cases, hyperspectral data have improved crop classification [7] by better estimating plant biochemical/biophysical characteristics [8] and function [9]. There have been many advances in satellite-borne sensors over the past 50 years, from the Advanced Very High Resolution Radiometer (AVHRR) [10] to the Moderate Resolution Imaging Spectroradiometer (MODIS) [11], to the current Landsat 9 [12] and Sentinel-2 [13] sensors. A review of the use of old-generation hyperspectral remote sensing for agriculture has been documented in [2], [3], [4], and [5], including the use of different platforms (field-based, airborne, and satellite-borne) and different methods (full spectral analysis, optimal band selection, feature extraction, and machine learning) [14]. For example, EO-1 Hyperion was used for various agricultural applications including differentiating crop varieties and types [14].
The advances in spatial and spectral resolutions opened new avenues of investigation [15]. Now we are in a new era of remote sensing with hyperspectral sensors such as the Deutsches Zentrum für Luft- und Raumfahrt (DLR) Earth Sensing Imaging Spectrometer (DESIS) [16], PRecursore IperSpettrale della Missione Applicativa (PRISMA) [17], Environmental Mapping and Analysis Program (EnMAP) [18], and the upcoming Surface Biology and Geology (SBG) mission [19]. In addition, we have high spatial resolution sensors like PlanetScope [20]. These increases in resolution will facilitate agricultural crop type classification [15].
There have also been substantial advances in technologies such as cloud computing and techniques such as machine/deep learning [21] and advanced data fusion [22], [23]. For example, the authors used EO-1 Hyperion data in the Google Earth Engine (GEE) cloud-computing platform to classify agricultural crops using the support vector machine and random forest machine learning algorithms [24]. In addition, the authors compared the use of EO-1 Hyperion and DESIS images with several machine learning algorithms to differentiate crop types in GEE [25].
While hyperspectral and high spatial resolutions are separately useful, data fusion can create more comprehensive and informative datasets [22], [26]. For example, the fusion of PlanetScope and Sentinel-2 data has yielded more accurate estimation and detection of within-field yield variability [27], daily monitoring of crops [28], land cover classification [29], crop classification [30], and wheat leaf area index (LAI) prediction [31]. Similarly, Delalieux et al. [32] fused airborne hyperspectral (Airborne Prism EXperiment-APEX) and high spatial resolution (Remotely Piloted Aircraft Systems-RPAS) data to detect water stress. The fusion of hyperspectral DESIS and high spatial resolution PlanetScope data has the potential to increase crop type classification accuracies.
The large number of PlanetScope cubesats allows for a high temporal resolution and global daily coverage [48]. However, each sensor has its own relative spectral response function, resulting in variations across cubesats and scenes [28], [46], [49], [50]. There are ways to correct for these inconsistencies. For example, Rafif et al. [51] used histogram matching (with R [52] packages "raster" [53] and "RStoolbox" [54]) to correct for varying radiometric ranges across PlanetScope images. Images can also be radiometrically corrected through a linear shift from comparison with Landsat or Sentinel data, with the Multivariate Alteration Detection (MAD), or the Cubesat Enabled Spatio-Temporal Enhancement Method (CESTEM) [49]. However, the images need to be geometrically matched before leveraging images from other sensors for the correction, since PlanetScope images may have geometric errors of 4.8 to 19 m [49].
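Rafif et al. performed this histogram matching in R with the "raster" and "RStoolbox" packages; the underlying quantile-mapping idea can be sketched in a few lines (Python is used here for illustration, and the function name and array handling are ours, not from the cited work):

```python
import numpy as np

def histogram_match(source, reference):
    """Quantile-map `source` values onto the distribution of `reference`.

    Both inputs are 1-D arrays (e.g., one band of a PlanetScope scene,
    flattened); the result keeps the source's pixel ordering but takes on
    the reference scene's radiometric range.
    """
    ref_sorted = np.sort(reference)
    # Empirical quantile of each source pixel within its own distribution
    quantiles = np.searchsorted(np.sort(source), source, side="right") / len(source)
    # Look up the reference value at the matching quantile
    idx = np.clip(np.round(quantiles * len(reference)).astype(int) - 1,
                  0, len(reference) - 1)
    return ref_sorted[idx]
```

Production tools add refinements (interpolation between quantiles, per-band handling, masking of clouds and no-data pixels), but the core operation is this rank-based lookup.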
The high temporal and spatial resolutions of PlanetScope data have given them an advantage over other sensors in various applications. For example, Mansaray et al. [55] found PlanetScope provided greater accuracies than Landsat-8 and Sentinel-2 in estimating Chlorophyll a levels in small water bodies, and along the edge of a large reservoir where harmful algal blooms are most concentrated. The high temporal resolution of PlanetScope data also allowed for monitoring of algal bloom events and small changes in water quality [55]. In addition, PlanetScope images have been used for several agricultural applications, including monitoring crop growth, management, health, and productivity [48], [50], [51], [56], [57], [58], [59], [60], [61], [62].

C. California's Central Valley
Agriculture is the largest water consumer of any sector in the U.S. Approximately 74-93% of all blue water (net surface and ground water withdrawn that is evaporated or incorporated into a product) is used for irrigated agriculture and livestock production [63], [64], [65], [66]. Although California's Central Valley encompasses only about 12% of the surface area of California and comprises less than 1% of U.S. farmland, it produces 25% of the nation's food and about 50% of U.S. fruits, nuts, and vegetables [67]. Thus, understanding agricultural practices in this region will support food and water security decision making.

D. Goals and Objectives
The overarching goal of this research was to compare 3-4 m high spatial resolution 4-band PlanetScope data with 30 m hyperspectral DESIS data in classifying major agricultural crops. These crops included: alfalfa, almonds, corn, cotton, grapes, pistachios, rice, and tomatoes. These crops were selected because they are prevalent throughout the world and/or the study area. In addition, many of these crops (e.g., almonds) are water-intensive and thus important to study when addressing issues of water security [1]. To achieve this goal, six PlanetScope image mosaics and two DESIS images were acquired over California's Central Valley spanning the 2020 growing season.
Our specific objectives were to:
1) Compare crop type classifications using 4-band DESIS and PlanetScope data to assess the advantages of high spatial resolution versus hyperspectral data. In the early growing season, we used DESIS data from June 18, 2020, and PlanetScope data from June 12, 2020. We used DESIS data from August 8, 2020 with PlanetScope data from August 6, 2020 to compare accuracies in the late growing season. Finally, we combined June and August data for each sensor to compare classification accuracies using data across the growing season.
2) Use all available time-series PlanetScope data from June through September 2020 for classification to evaluate the advantages of high temporal resolution. We assessed classification accuracies when using all images and determined the best two PlanetScope images across the growing season.
3) Determine the best number of DESIS bands for classification to quantify the advantages of hyperspectral data. Although DESIS data have 235 bands, not all of them are unique and informative. Out of the authors' previously established 26 optimal hyperspectral narrow bands [14], we tested 4 to 16 bands to determine the best number of bands for classification, after which classification accuracy plateaus.
4) Fuse August 6 PlanetScope data and August 8 DESIS data for classification analysis to test a high spatial resolution and hyperspectral product. We used two methods of data fusion for classification: layer-stacking [68] and Gram-Schmidt [69].

A. Study Area
The Central Valley, commonly referred to as the "fruit and vegetable basket" of the United States (U.S.), is composed of a broad alluvial filled structural trough spanning about 52,000 km² within California's interior [70], [71]. It is one of the most productive agricultural regions in the world, growing more than 250 different crop types with an estimated value exceeding $20 billion per year [72]. This critical agricultural system, accounting for 77% of California's water use, requires management of hydrological systems and therefore is important for understanding water use in food production [73], [74], [75].
As a major agricultural producer in the U.S., the Central Valley has experienced changing dynamics regarding water availability in recent years. The Central Valley has been in a state of prolonged drought beginning in the mid-2000s that intensified into severe drought conditions from 2012 until 2016 [76], [77]. Further intensifying this strain, the planting of higher revenue, more water-consuming perennial crops such as nuts, grapes, and other fruits has increased from 16% of irrigated acreage in 1980 to 33% in 2015 statewide, and from 21% to 45% in the southern Central Valley [78]. These trends result in an urgent need to better map and classify crop types over regional extents for future food and water security studies.
The study area is in the southern San Joaquin Valley Region of California's Central Valley (Fig. 1). The polygon area used for the study is defined by the imagery boundaries and covers a relatively flat terrain that is approximately 544.79 km². It is mostly located within Merced County but extends into Fresno County in its southeastern portion. California Irrigation Management Information System (CIMIS) Site 124 Panoche (Latitude: 36°53′24″ N, Longitude: 120°43′53″ W) resides within the study area near Firebaugh, California, at an elevation of 56 m [79]. This region is part of the San Joaquin watershed (SJWS), which includes cropland, pasture-based livestock farming, and forest land use types [80]. The major crops grown in the SJWS include nuts, vegetables, cotton, fruits, and field crops. This watershed spans approximately 15,000 km², with about 7400 km² occupied by pasture, and 1100 km² occupied by forest [81].
The SJWS includes most agricultural areas in Stanislaus, Merced, and Madera Counties, and parts of San Joaquin and Fresno Counties. The soils are mostly clay loams to fine sandy loams [81]. Approximately 38% of the SJWS total cropland is covered by nuts and fruits; about 36% by field crops such as corn, tomato, cotton, and beans; and less than 10% by grain crops [80]. The SJWS has a typical Mediterranean climate, with cool, rainy winters and hot, dry summers. Its annual average rainfall ranges from 200 to 300 mm, most of which occurs from November to April [80].
Agriculture in the SJWS depends highly on irrigation due to its arid climatic conditions [82]. A combination of surface water and groundwater is used by local farmers to meet their irrigation needs [82]. Stress on the water supply system is growing; with limited surface water used for irrigation due to worsening droughts, many farmers have turned to groundwater [82]. Irrigation water is mostly developed and delivered by governmental institutions, such as the State Water Project and the Central Valley Project, which sell long-term water contracts [82]. Several irrigation districts such as Modesto or South San Joaquin then deliver the water to the end user via irrigation canals and aqueducts [82]. Farmers manage their own groundwater usage, which to date has not been regulated [82].
Thus, agricultural study in this area is important for addressing issues related to food and water security. To do this, we used PlanetScope images (Section II-B), DESIS images (Section II-C), and randomly distributed samples with the USDA NASS CDL [83] reference data (Section II-D). We used these data to compare accuracies across different sensors, classification methods (Section II-E), and data fusion techniques (Section II-E). Methodology is detailed in the following sections.

B. PlanetScope Data
In all, 21 PlanetScope images [20] from the Dove-R constellation were used for this study and were compiled into six mosaics: June 12, June 30, July 14, August 6, August 28, and September 30, 2020 (Table I). All images were acquired in a descending orbit, radiometrically corrected, geometrically corrected with fine Digital Elevation Model correction, and atmospherically corrected to surface reflectance by Planet Labs [20]. All images had 0% cloud cover except the June 12 images (10% and 26% cloud cover). These images were used for classifying agricultural crops through machine learning algorithms and for comparisons with DESIS data, as described in Section II-E.

C. DESIS Data
In addition to PlanetScope imagery, we downloaded two Level 2A surface reflectance DESIS images from Teledyne Brown [16], acquired in clear sky conditions on June 18, 2020 and August 8, 2020. Due to slight mismatches between DESIS and PlanetScope images, DESIS images were georeferenced to match the PlanetScope images in ArcGIS [86]. Samples were randomly generated at least 30 m apart, to match the DESIS spatial resolution, within the overlapping areas of the PlanetScope and DESIS images. The data were then randomly split into training (50%), testing (25%), and validation (25%) subsets for classification. For Objective 1, DESIS bands close to the PlanetScope band centers were selected (492, 556, 666, and 867 nm). For Objective 3, the best 4, 6, 8, 10, 12, 14, and 16 DESIS bands (determined in [14]) were selected (see Table II) using peak detection methods to determine which bands were most often correlated with specific narrow spectral peaks and troughs across the study crops. These band selections also leveraged our knowledge from previous studies, in which we included feature reduction techniques like Principal Component Analysis [24]. These processed DESIS data were used for classification analyses and comparisons with PlanetScope images as described in Section II-E.
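The Objective 1 band subsetting (picking the DESIS band nearest each PlanetScope band center) can be illustrated with a short sketch. The evenly spaced 2.55 nm grid below is an assumption for illustration only; the actual DESIS center wavelengths differ slightly:

```python
import numpy as np

# Hypothetical DESIS band-center grid: 235 bands at ~2.55 nm spacing from 400 nm
desis_centers = 400.0 + 2.55 * np.arange(235)

# PlanetScope Dove-R band centers used in the paper for the 4-band comparison
planetscope_centers = [492.0, 556.0, 666.0, 867.0]

# For each PlanetScope center, take the index of the closest DESIS band
nearest = [int(np.argmin(np.abs(desis_centers - c))) for c in planetscope_centers]
selected_nm = desis_centers[nearest]
```

With a 2.55 nm grid, every target center is matched to within about half a band width, which is why a 4-band DESIS subset can emulate the PlanetScope band positions (though not their broad bandwidths).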

D. Reference Data
Crop type data used in this study were acquired from the United States Department of Agriculture National Agricultural Statistics Service Cropland Data Layer (USDA NASS CDL) [83] for 2020, commonly referred to as the CDL. The CDL is produced using satellite imagery (e.g., the Landsat series and Sentinel-2) [87]. It has been generated annually since 2010, providing conterminous coverage of the United States at 30 m resolution [88], [89], [90]. The CDL confidence layer was also used to filter samples with a threshold of 70%. The CDL 2020 overall accuracy for California was 78.8%; cropwise producer's and user's accuracies are shown in Table III [91].
There are inherent limitations in many datasets including the CDL. It is accepted there are some errors of commission and omission, which may influence the evaluation of our classification accuracies. However, the CDL is a standard dataset used for reference in several publications [88], [92], [93], [94], [95]. The number of sample locations is shown in Table IV. These samples were randomly generated using the "Create random points" tool in ArcGIS [86]. The samples were then used for training, testing, and validating classification algorithms described below in Section II-E.
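The sampling workflow (random points retained only where CDL confidence is at least 70%) was done with ArcGIS's "Create random points" tool; it might be sketched as follows, with toy rasters standing in for the actual CDL class and confidence layers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy CDL class raster (eight crop codes) and per-pixel confidence layer;
# both are illustrative stand-ins for the real 30 m products
cdl = rng.integers(1, 9, size=(100, 100))
confidence = rng.integers(0, 101, size=(100, 100))

# Keep only candidate pixels whose CDL confidence is at least 70%
rows, cols = np.where(confidence >= 70)

# Randomly draw sample locations (without replacement) from the remaining pixels
pick = rng.choice(len(rows), size=50, replace=False)
samples = list(zip(rows[pick], cols[pick], cdl[rows[pick], cols[pick]]))
```

The retained (row, col, class) tuples would then be split into training, testing, and validation subsets as described in Section II-C.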

E. Classification Algorithms and Data Fusion
Many supervised classification machine learning algorithms are currently available, including Random Forest (RF) and Support Vector Machines (SVM) [25]. We have found in previous research [25] that SVM outperformed other methods for classifying crop types using hyperspectral data. Thus, for crop type classification analyses in this study, SVM with a linear kernel was used to classify the eight study crops in R [52]. The cost parameter was optimized using the training and testing subsets before the optimal model was applied to the validation subset.
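The authors ran this tuning in R; an analogous train/test/validation workflow can be sketched in Python with scikit-learn, using synthetic data in place of the per-pixel band samples (the candidate cost values and data shapes here are illustrative, not the paper's):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-pixel band reflectances and crop labels
X = rng.normal(size=(400, 14))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# 50/25/25 split into training, testing, and validation subsets, as in the paper
train, test, val = np.split(rng.permutation(400), [200, 300])

# Grid-search the SVM cost parameter using the testing subset
best_c, best_acc = None, -1.0
for c in [0.01, 0.1, 1.0, 10.0, 100.0]:
    model = SVC(kernel="linear", C=c).fit(X[train], y[train])
    acc = accuracy_score(y[test], model.predict(X[test]))
    if acc > best_acc:
        best_c, best_acc = c, acc

# Refit with the chosen cost and report accuracy on the held-out validation subset
final = SVC(kernel="linear", C=best_c).fit(X[train], y[train])
val_acc = accuracy_score(y[val], final.predict(X[val]))
```

Keeping the validation subset untouched during the cost search is what makes the reported accuracy an unbiased estimate of classifier performance.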
There are also many pixel-level, feature-level, model-based, and hybrid data fusion methods [96], [97]. For greater retention of original information, interpretability, and higher quality results [97], we selected pixel-level methods for this study. These include layer stacking; Intensity, Hue, Saturation (IHS); Brovey Transform (BT); Principal Component Analysis (PCA); and Gram-Schmidt (GS) [26], [96]. For fusing the DESIS August 8, 2020 image and the PlanetScope August 6, 2020 image mosaic, two methods were tested, both in ArcGIS [86]. All four PlanetScope bands and the best 14 DESIS bands were used in both methods. For the first method, the DESIS image was resampled to the 3 m PlanetScope spatial resolution, and then layer stacked with the PlanetScope image mosaic. This data fusion method has often been used because it is simple and its results are easily interpretable [68].
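The layer-stacking step (resampling the 30 m DESIS bands to the 3 m PlanetScope grid, then stacking) was performed in ArcGIS; it can be illustrated with a toy NumPy sketch, where the array sizes are illustrative and the rasters are assumed to be co-registered:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy co-registered rasters: 4-band PlanetScope at "3 m" and 14-band DESIS at "30 m"
ps = rng.random((4, 20, 20))     # (bands, rows, cols) on the fine 3 m grid
desis = rng.random((14, 2, 2))   # same footprint, 10x coarser

# Nearest-neighbour resample DESIS to the 3 m grid by replicating each cell 10x10 times
desis_3m = desis.repeat(10, axis=1).repeat(10, axis=2)

# Layer stack: concatenate along the band axis, giving an 18-band fused cube
stacked = np.concatenate([ps, desis_3m], axis=0)
```

Note that this form of "fusion" only co-locates the two datasets on one grid; each DESIS value is simply duplicated across the fine pixels it covers, with no sharpening of the spectral bands.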
For the second method, GS was used because, of the methods available in ArcGIS, it retains more spectral information from the lower resolution image and can easily accommodate more than three bands [69], [97]. In this method, the reflectance values of different bands from the higher spatial resolution image are used to assign weights to the corresponding bands of the higher spectral resolution image to fuse the information from both images; for details, please refer to [69]. The PlanetScope blue band (464-517 nm) was used to calculate the weights for DESIS bands at 408, 433, and 519 nm. The PlanetScope green band (547-585 nm) was used to calculate the weights of DESIS bands at 556 and 581 nm. The PlanetScope red band (650-682 nm) was used to calculate weights for DESIS bands at 648, 676, and 695 nm. Finally, the PlanetScope NIR band (846-888 nm) was used to calculate weights for DESIS bands at 755, 796, 863, 919, and 958 nm. GS was then run on each of these band sets using the corresponding weights, and the resulting layers were combined to make the fused image with a 3 m spatial resolution. Reflectance at 718 nm for the DESIS red-edge band was simulated by averaging the fused product's reflectance at 695 and 735 nm. The 14 fused bands were then used for SVM classification.

Finally, accuracy assessments were conducted to compare sensors (images described in Sections II-B and II-C), classification algorithms, and data fusion methods using error matrices [98]. The error matrices derived from using validation samples (described in Section II-D) were used to calculate overall, producer's, and user's accuracies [98].
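The error-matrix accuracies used throughout the assessment follow standard definitions: overall accuracy is the diagonal sum over the total; producer's accuracy divides each diagonal element by its reference (column) total, measuring omission error; user's accuracy divides by the map (row) total, measuring commission error. A minimal sketch, using a made-up 3-class matrix rather than the paper's 8-crop matrices:

```python
import numpy as np

# Toy error matrix: rows = classified (map) labels, columns = reference labels
cm = np.array([
    [50,  2,  3],
    [ 4, 40,  1],
    [ 6,  8, 36],
])

overall = np.trace(cm) / cm.sum()         # fraction of correctly labelled samples
producers = np.diag(cm) / cm.sum(axis=0)  # per class: correct / reference total
users = np.diag(cm) / cm.sum(axis=1)      # per class: correct / map total
```

Reporting all three, as the paper does, matters because a high overall accuracy can hide individual crops that are badly omitted or heavily over-mapped.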

III. RESULTS
To compare the abilities of DESIS and PlanetScope to classify crops in the early growing season, we used a DESIS image acquired on June 18, 2020 and a series of PlanetScope images acquired on June 12, 2020. The DESIS image was subset to four bands close to the PlanetScope band centers (492, 556, 666, and 867 nm). The DESIS image yielded the higher overall accuracy of 57% when compared with the 50% overall accuracy using the PlanetScope image mosaic (see Table V). For DESIS, the producer's accuracies ranged from 0-85% and user's accuracies ranged from 28-92%; those for PlanetScope ranged from 0-100% and 31-100%. Similarly, to compare DESIS and PlanetScope classifications during the late growing season, we used a DESIS image from August 8, 2020 and a PlanetScope image from August 6, 2020. Again, the DESIS image was subset to four bands. The DESIS image yielded an overall accuracy of 66% as opposed to 61% from the PlanetScope image mosaic. DESIS producer's and user's accuracies ranged from 18-100% and 29-100%, respectively; PlanetScope producer's and user's accuracies ranged from 25-92% and 44-77%, respectively. Finally, to compare classification accuracies from DESIS and PlanetScope data across the growing season, we combined the June and August images/mosaics to create 8-band products. In this analysis, overall accuracies for DESIS and PlanetScope were 79% and 70%, respectively. DESIS producer's accuracies ranged from 45-100%, and user's accuracies ranged from 57-100%; producer's accuracies for PlanetScope ranged from 31-100% while user's accuracies ranged from 36-100%. In summary, DESIS images outperformed PlanetScope image mosaics in the early and late growing season and across the season (see Table V). In this case, hyperspectral data outperformed high spatial resolution data.
To assess whether the use of multiple PlanetScope mosaics would increase classification accuracies, we used all six mosaics, with a total of 24 bands, to classify crop types. This yielded an overall accuracy of 77%, with producer's accuracies ranging from 52-100% and user's accuracies ranging from 57-100% (see Table VI). We also tested PlanetScope image mosaic pairs to see which two dates in the growing season would yield the highest accuracies. The overall accuracies for mosaic pairs ranged from 64-79%, with most image pairs having lower overall accuracies than when using all image mosaics (see Table VI). However, the highest overall accuracy of 79% was obtained using the June 30, 2020 and August 28, 2020 PlanetScope image mosaics. The producer's and user's accuracies for this pair ranged from 56-100% and 61-100%, respectively. Since the two DESIS images outperformed the six PlanetScope image mosaics throughout the growing season, these results suggest high temporal resolution may not be as important as well-timed high-quality image acquisition.

To determine the optimal number of DESIS bands for crop type classification, we ran classification models using the 4, 6, 8, 10, 12, 14, and 16 bands listed in Table II. Overall accuracies increased from 61% using 4 bands to 82% using 16 bands, with a peak of 85% accuracy using 14 bands (Fig. 2, Table VII). Although differing by crop type, the general trend of increasing and then plateauing accuracy was seen for producer's and user's accuracies when all crop types were grouped together (see Table VII). Similar to high temporal resolution, an optimal number of informative bands led to a higher accuracy than indiscriminate use of many bands.
Both methods fusing the PlanetScope August 6, 2020 image mosaic with the 14-band DESIS August 8, 2020 image resulted in similar overall accuracies of 82% for the layer stacking method and 81% for the GS method (see Table VIII).
Producer's and user's accuracies were also similar for both methods. These accuracies were lower than those using only the 14-band DESIS August image, suggesting data fusion may not be the best way to take advantage of both datasets in this case.
In summary, the 14-band August DESIS image yielded the highest overall accuracy (see Table IX), followed by the 16- and 12-band DESIS images. After that, the highest accuracy was achieved using the layer stacking and then GS fusion products. The 10-band DESIS image yielded the same accuracy as the GS analysis, followed by the 8-band DESIS image. The next highest accuracy was achieved using a late June and late August PlanetScope mosaic pair and using the June and August DESIS image pair. This was followed by the PlanetScope time-series of six mosaics.

IV. DISCUSSION
There is an urgent need to map agriculture more accurately to support management decision making. Here we demonstrate applications of new remote sensing products to map a selection of crops in California's Central Valley. This is especially important for the water intensive crops included in this study to better understand their water use for future food and water security [1]. To do this, we compared the abilities of high spatial resolution PlanetScope data and hyperspectral DESIS data to classify eight agricultural crops in California's Central Valley during the 2020 growing season. Both PlanetScope and DESIS data have several strengths and some limitations. For example, PlanetScope's constellation of approximately 200 satellites allows for much more frequent data acquisition than the single DESIS sensor on the International Space Station (ISS). PlanetScope data are of much higher spatial resolution (nominal 3 m) than DESIS data (nominal 30 m), allowing for the capture of purer spectral signatures. On the other hand, DESIS data consist of 235 narrow bands, allowing us to capture more detailed spectral data than possible with the four broad bands of PlanetScope. Additionally, having only one sensor means that DESIS data can be more radiometrically consistent than PlanetScope data gathered by multiple cubesats. Here, we ran a series of classification analyses to compare how the characteristics of these two datasets help classify agricultural crops.
For these classifications, we used the USDA NASS CDL for labeling our training, testing, and validation data. This product has inherent inaccuracies due to its large geographic extent and moderate spatial resolution. Nevertheless, this is a widely used and robust dataset that is freely available. To minimize uncertainties, we used the confidence data layer to only select sample locations that had a 70% or higher confidence level. Thus, our results will also have similar levels of inaccuracies inherent in the reference data. Moving forward, ground validation of reference data would reduce uncertainty in future work.
Based on these analyses, we found hyperspectral DESIS data outperformed high spatial resolution PlanetScope data in the differentiation of agricultural crops even when only four bands with the same band centers were used. The narrow spectral bandwidth of DESIS provided more specific spectral signatures than averaged spectral signatures gathered over the broad bands of PlanetScope (e.g., 2.55 nm red band of DESIS versus 80 nm red band of PlanetScope). This difference in narrow band and broad band signatures causes differences in the measurements of a particular crop and thus classification accuracies. This result supports other research that found high spectral resolution data outperformed high spatial resolution data. For example, Bannari et al. [99] found the airborne hyperspectral sensor Probe-1 outperformed high spatial resolution sensor IKONOS in mapping crop residue. Similarly, Skakun et al. [100] found the 30 m harmonized Landsat and Sentinel product outperformed the 3-4 m Planet data in estimating yield variability.
Researchers have taken advantage of the larger amount of spectral information from hyperspectral data for many applications. For example, Paul et al. [101] used DESIS images for target detection. The hyperspectral Airborne Imaging System for Applications (AISA) Eagle data have been used to differentiate healthy, infested, and lightning-damaged pine trees [102] and to classify crops [103]. AVIRIS-NG images have been used for agricultural field boundary mapping [104] and crop type and agricultural class mapping [105], [106], [107]. UAV-based hyperspectral data have also been used for estimation of soil moisture [108], alfalfa yield [109], and winter wheat yield [110].
In this study, DESIS outperformed PlanetScope for crop type classification in California, where, as of 2019, farm sizes average 141 hectares [111]. However, high spatial resolution may be more important in other areas, such as places in Africa, where small-holder agricultural practices are more common [6]. These small pixels may also be more important for differentiating vegetation types in heterogeneous areas, such as forests, than for monocultural agricultural fields. High spatial resolution data have been used for many applications. For example, Turker and Ozdarici [112] found higher spatial resolution IKONOS (4 m) and QuickBird (2.4 m) images outperformed the coarser resolution Satellite Pour l'Observation de la Terre (SPOT) (10 m and 20 m) images in classifying agricultural crops. IKONOS images have also been used to differentiate between conventional and conservation tillage practices [113]. Similarly, WorldView-2 and WorldView-3 images have been used to classify agricultural cover classes [114], estimate mango and avocado yields [115], [116], detect oil palm crowns to estimate their age [117], map cover crop residue [118], and count coconut trees [119]. When estimating rice nitrogen status, Huang et al. [120] found 2 m WorldView-2 and 5 m RapidEye images outperformed 8 m FORMOSAT-2 images. One reason for the lower classification accuracies from PlanetScope may be the limited number of bands in the visible and near infrared (NIR) region, and the lack of bands in the shortwave infrared (SWIR) region (also absent in DESIS data). When using WorldView-3 images, Sidike et al. [114] and Hively et al. [118] found the narrow SWIR bands of WorldView-3 important for classification. Similarly, Huang et al. [120] found WorldView-2 outperformed RapidEye because of the three additional WorldView-2 bands in the visible and NIR regions. 
The simple addition of a red-edge band and one or two SWIR bands to PlanetScope sensors may substantially increase classification accuracies, perhaps even over DESIS which has no SWIR bands. The recently available 8-band PlanetScope data include a red-edge band and an additional green band, which could increase future classification accuracies.
We also found the filtering and use of select images and bands led to higher accuracies than using many images and bands. Previous work by the authors also demonstrates this [7], [9], [15], [25]. However, the best image and band combinations will depend on the application, study area, and time of year (e.g., cloud cover during the monsoon season in the tropics). Thus, the high temporal and high spectral resolutions of PlanetScope and DESIS data enable selection of the best images and bands for various applications.

Finally, we found the fused data yielded a lower classification accuracy than the use of the 14-band DESIS August image alone. This may be a result of the large difference in spatial resolutions of the PlanetScope and DESIS images, which decreases their spatial compatibility [26]. However, with the emergence of new fusion techniques [22], [23], there may be a way to create a fusion product yielding higher classification accuracies. New machine learning and deep learning based data fusion methods (like those described in [22], [23], [31], [96], [121]) may produce better high spatial resolution and hyperspectral datasets.

The addition of hyperspectral capabilities to cubesat/smallsat constellations would enable many studies [122]. Combinations of hyperspectral and high spatial resolution data have been used to study a tropical coast [123], macroalgae [124], the invasive guava shrub [125], and smallholder crop area [126]. In addition, datasets that are both high spatial resolution and hyperspectral have been used. For example, airborne APEX hyperspectral and high spatial resolution data have been used to create high resolution urban land-cover maps, assess burn severity from wildfires, detect change in natural areas, map vegetation species, detect early stages of vegetation stress, map biomass in alpine grassland, evaluate changes in tree physiology due to acid deposition, and retrieve aerosol optical depth [127]. Similarly, Pena et al. [128] used the hyperspectral and high spatial resolution data from the HySpex-VNIR 1600 airborne platform to classify Sclerophyll forest tree species.
Many papers have used either DESIS or PlanetScope data alone, but few have used both, especially in agricultural research. Comparisons between PlanetScope and DESIS data for other applications and/or in other areas may yield different results. Both PlanetScope and DESIS are important new data sources and need to be evaluated for various applications. For example, the high spatial resolution of PlanetScope data has been leveraged in texture analysis to estimate pasture aboveground biomass and canopy height [46]; extract arecanut planting distribution [47]; monitor forest carbon stocks and emissions [129]; classify and detect changes in land use and land cover [130], [131]; study vegetation phenology [132]; and assess wildfire damage [133]. In addition, both hyperspectral and high spatial resolution datasets have proven useful for estimating crop water productivity, a measure of how efficiently crops use water to produce food [1], [134], [135], [136], [137], [138], [139]. This will be important for better management and potential water-saving strategies in agroecosystems over large extents.
This study has demonstrated the ability of hyperspectral and high spatial resolution remote sensing data to observe and analyze agricultural crops. As the world population grows and land and water use continue to change, rapid monitoring of these changes can provide insight into global food and water security. Whereas this study classified crops in the heavily monitored Central Valley of California, the techniques demonstrated here can be especially valuable in regions lacking on-site data. Advancing remote sensing technologies can help identify ways to adapt to limited fresh water in the 21st century. With accelerating demands for water, monitoring cropping patterns and classifying crop types are essential for agricultural sustainability. The analysis performed in this study can be extended to other hyperspectral and high spatial resolution sensors and datasets.

V. CONCLUSION
This study has performed one of the first comprehensive comparisons of two new generation sensors, the hyperspectral DESIS and the high spatial resolution PlanetScope, for classifying agricultural crops. In our analysis of eight world crops (alfalfa, almonds, corn, cotton, grapes, pistachios, rice, and tomatoes) in California's Central Valley, DESIS hyperspectral data outperformed PlanetScope's high spatial resolution data in classifying agricultural crops. Overall classification accuracies increased from 61% using 4 bands to 82% using 16 bands, with the highest overall accuracy of 85% achieved using 14 bands. Overall, of the DESIS hyperspectral narrowbands (HNB) determined in [14], 14 yielded the highest classification accuracies in this study. This implies that about 6% of the 235 HNB are optimal, with the remaining 94% noisy or redundant for crop classification. One limitation of studies that combine hyperspectral and high spatial resolution data is the dissimilar spatial resolution of the two sensors (3–4 m PlanetScope data versus 30 m DESIS data). Ideally, the spectral integrity within a pixel is better preserved when data of similar spatial resolutions are fused. As hyperspectral data attain higher spatial resolution, high spatial resolution data gain more bands, and data fusion methods continue to advance [22], [23], fused products may yield higher classification accuracies than either dataset alone.
The wavebands other than those deemed optimal are still likely to be important for other applications, such as modeling crop biophysical and biochemical characteristics [15]. Also, full spectral analysis methods such as spectral matching techniques require complete spectral signatures rather than a subset of optimal HNB. In contrast, high spatial resolution images gathered throughout the growing season did not yield higher classification accuracies than a single hyperspectral image acquired at a critical growth stage. Furthermore, this study demonstrated that classification accuracies did not increase with the fusion of PlanetScope and DESIS images, which we attribute to the highly dissimilar spatial resolutions of the hyperspectral DESIS and high spatial resolution PlanetScope data. Nevertheless, PlanetScope data will be advantageous in other applications such as precision farming, especially for the small and fragmented farms found in much of the world.