Low-noise temperature measurements at frequencies in the millihertz range are required in the Laser Interferometer Space Antenna (LISA) and LISA Pathfinder missions. The required temperature stability for LISA is around 10 μK Hz⁻¹/² at frequencies down to 0.1 mHz. In this paper we focus on the identification and reduction of a source of excess noise detected when measuring time-varying temperature signals. This noise is shown to be due to nonidealities in the analog-to-digital converter (ADC) transfer curve, and it degrades the measurement by about one order of magnitude across the measurement bandwidth when the measured temperature drifts by a few μK s⁻¹. In a measuring system suitable for the LISA mission, this noise must be reduced. Two different methods based on the same technique have been implemented, both consisting of the addition of out-of-band dither signals to mitigate the ADC nonideality errors. Excess noise of this nature has been satisfactorily reduced with these methods when measuring temperature ramps of up to 10 μK s⁻¹.
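The mechanism can be illustrated with a minimal numerical sketch (Python/NumPy). All numbers here are illustrative assumptions, not the paper's parameters, and for simplicity the sketch uses subtractive uniform random dither with an ideal quantizer, whereas the paper injects out-of-band dither signals into a nonideal ADC. The point it shows is the same: a slowly drifting input produces a correlated, low-frequency quantization "staircase" that does not average away, while dithered quantization error is whitened and averages down after low-pass filtering.

```python
import numpy as np

def quantize(x, lsb):
    """Ideal uniform quantizer (ADC nonidealities omitted for simplicity)."""
    return lsb * np.round(x / lsb)

rng = np.random.default_rng(0)
fs = 1000.0                       # sample rate in Hz (illustrative)
t = np.arange(0.0, 100.0, 1.0 / fs)
ramp = 1e-5 * t                   # slow drift, ~1 LSB crossing every 10 s
lsb = 1e-4                        # quantizer step (illustrative)

# Without dither: the error is a slow sawtooth correlated with the ramp,
# so its power sits at low (in-band) frequencies.
err_plain = quantize(ramp, lsb) - ramp

# With dither: add uniform dither spanning one LSB before quantizing and
# subtract it afterwards; the residual error becomes white noise that is
# statistically independent of the input.
dither = rng.uniform(-lsb / 2, lsb / 2, size=t.size)
err_dith = quantize(ramp + dither, lsb) - dither - ramp

def block_avg(x, n=100):
    """Crude low-pass filter: average non-overlapping blocks of n samples."""
    return x[: x.size // n * n].reshape(-1, n).mean(axis=1)

rms_plain = np.std(block_avg(err_plain))
rms_dith = np.std(block_avg(err_dith))
print(rms_plain, rms_dith)
```

After block averaging, the white dithered error shrinks roughly as 1/√n, while the correlated staircase error is essentially unchanged, mirroring the order-of-magnitude excess noise reduction reported in the abstract.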