A field experiment with an L-band radiometer at 1.4 GHz was performed from May to July 2004 at an experimental site near Zurich, Switzerland. Before the experiment started, clover grass was seeded. Thermal infrared, in situ temperature, and time-domain reflectometer (TDR) measurements were taken simultaneously with hourly radiometer measurements. This setup allowed for investigation of the microwave optical depths and mode opacities (parallel and perpendicular to the soil surface) of the clover grass canopy. Optical depths and opacities were determined from in situ analysis and from remotely sensed measurements using a nonscattering radiative transfer model. Because of the canopy structure, optical depth and opacity depend on polarization and observation direction, respectively. A linear relation between vegetation water-mass equivalent and polarization-averaged optical depth was observed. Furthermore, measured and modeled radiative transfer properties of the canopy were compared. The model is based on an effective-medium approach that treats the vegetation components as ellipsoidal inclusions. The effect of the canopy structure on the opacities was simulated by assuming an anisotropic orientation of the vegetation components. The observed change in canopy structure caused by a hail event was successfully reproduced by the model. It is demonstrated that anisotropic vegetation models should be used to represent the emission properties of vegetation. The sensitivity of radiometer measurements to soil water content was investigated in terms of the fractional contribution of soil-emitted radiation to total radiation. This fraction was reduced to approximately 0.3 at the most developed vegetation state. The results presented contribute toward a better understanding of the interaction between L-band radiation and vegetation canopies. Such knowledge is important for evaluating data generated from future satellite measurements.
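In a zeroth-order nonscattering radiative transfer model of the kind described above, the soil contribution to the observed radiation is attenuated once by the canopy, so the soil-emitted fraction is governed by the one-way canopy transmissivity exp(-τ/cos θ). The sketch below illustrates this relation; the function name and the optical-depth value τ ≈ 1.2 are illustrative assumptions (chosen because they yield a transmissivity near the 0.3 fraction mentioned in the abstract), not values taken from the paper.

```python
import math

def soil_emission_fraction(tau, theta_deg=0.0):
    """One-way canopy transmissivity in a zeroth-order nonscattering
    radiative transfer model: the fraction of soil-emitted radiation
    that reaches the radiometer through the canopy.

    tau       -- canopy optical depth at nadir (polarization-averaged)
    theta_deg -- incidence angle in degrees (0 = nadir)
    """
    # Soil radiation crosses the canopy once along the slant path.
    return math.exp(-tau / math.cos(math.radians(theta_deg)))

# Illustrative: an optical depth of ~1.2 gives a transmissivity of ~0.3,
# comparable to the soil-emitted fraction reported for the most
# developed vegetation state.
print(round(soil_emission_fraction(1.2), 2))  # -> 0.3
```

At larger incidence angles the slant path through the canopy lengthens, so the soil contribution drops further for the same optical depth, which is one reason the sensitivity to soil water content degrades as the canopy develops.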