More than two years after the launch of the Soil Moisture and Ocean Salinity (SMOS) mission in November 2009, the level 1C brightness temperatures (TB) reprocessed with the up-to-date ESA level 1 processing version (level 1 processor V5.04 and V5.05) have been released. Systematic biases of several Kelvins are still observed between averaged TB measurements and simulations, and they depend on the location of the measurement in the field of view (FOV). These systematic biases may originate from imperfections in the instrument calibration, the image reconstruction, the TB forward model, and the removal of the influence of external sources (Sun, galaxy, etc.). The biases are monitored using the so-called “Ocean Target Transformation” (OTT) and are analyzed here over a two-year period (May 2010–April 2012). The peak-to-peak variations in OTTs exceed 1 K in magnitude. We find large variations of OTTs where Sun contamination is expected, namely in locations affected by Sun aliases and Sun tails. The seasonal variation of OTTs computed over descending passes is opposite to the seasonal variation of the physical antenna patch temperature of the noise injection radiometer (Tp7), whereas the link between the temporal variation of OTTs over ascending passes and that of Tp7 is less clear. In addition, there is no clear correlation between the seasonal variations of OTTs for the first Stokes parameter and geophysical parameters such as the scattered galactic signal. This suggests that the seasonal variations in OTTs are likely to come from instrument heating and imperfect level 1 image reconstruction.