Bolometers are radiation sensors designed to have a spectral response as flat as possible over the region of interest. In high-temperature plasmas, most of the radiated power lies in the ultraviolet and soft x-ray (SXR) part of the spectrum, and metal-foil bolometers are detectors specifically developed for this spectral range. For such sensors, as for bolometers in general, absolute calibration is a crucial issue. The problem becomes particularly severe when, as in nuclear fusion devices, the sensors are not easily accessible. This article provides a detailed description of the in situ calibration methods for the bolometer sensitivity S and the cooling time τc, the two essential parameters characterizing the behavior of the sensor, together with an estimate of the uncertainties of both constants. The sensitivity S is determined via an electrical calibration in which the effect of the cables connecting the bolometers to the powering circuitry is taken into account, yielding an effective value of S. Experimental measurements confirming the quality of the adopted coaxial-cable model are reported. The cooling time constant τc is obtained via an optical calibration in which the bolometer is stimulated by a light-emitting diode. The behavior of τc over a broad pressure range is investigated, showing that it is independent of pressure up to 10⁻² mbar, well above the standard operating conditions of many applications. The described methods were tested on the 36 bolometric channels of the RFX tomographic diagnostic, providing a significant statistical basis for present applications and future developments of both the calibration procedures and the detectors. © 2004 American Institute of Physics.
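For context, the two calibration constants enter the power reconstruction through the power-balance relation commonly used for metal-foil bolometers (this formula is standard in the field but is not quoted in the abstract itself): with ΔU the bridge output voltage,

```latex
P_{\mathrm{abs}}(t) \;=\; \frac{1}{S}\left(\Delta U(t) \;+\; \tau_c\,\frac{d\,\Delta U(t)}{dt}\right)
```

so that an error in either S or τc propagates directly into the inferred radiated power, which is why their absolute calibration is emphasized here.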
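The cable correction mentioned for the electrical calibration can be motivated by a toy lumped model (purely illustrative; the paper's actual coaxial-cable model is more detailed): when the drive current flows through a series cable resistance, only part of the injected power is dissipated in the foil, so the nominal sensitivity must be rescaled. All names and numbers below are assumptions for illustration.

```python
def foil_power_fraction(r_foil_ohm, r_cable_ohm):
    """Fraction of the electrically injected power actually dissipated in the
    bolometer foil when the same drive current flows through a series cable
    resistance (naive lumped approximation, not the paper's full model)."""
    return r_foil_ohm / (r_foil_ohm + r_cable_ohm)

# Hypothetical example: a 1.2 kOhm meander fed through 50 Ohm of cable
frac = foil_power_fraction(1200.0, 50.0)   # fraction of power reaching the foil
s_eff_scale = frac                         # naive correction to nominal sensitivity
```

In this crude picture the effective sensitivity is simply the nominal one multiplied by the dissipated-power fraction; the article's model additionally accounts for the distributed properties of the coaxial cables.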
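The optical calibration extracts τc from the exponential decay of the bridge signal after the LED pulse ends. A minimal sketch of such an extraction (function name, sample rate, and signal values are illustrative, not taken from the paper) using a log-linear least-squares fit:

```python
import math

def estimate_cooling_time(times_s, voltages_v):
    """Estimate tau_c assuming U(t) = U0 * exp(-t / tau_c) after LED switch-off,
    via an ordinary least-squares fit of log(U) versus t."""
    ys = [math.log(v) for v in voltages_v]
    n = len(times_s)
    mx = sum(times_s) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(times_s, ys)) \
        / sum((x - mx) ** 2 for x in times_s)
    return -1.0 / slope  # slope of log(U) is -1/tau_c

# Synthetic noiseless decay with tau_c = 0.12 s (a typical order of magnitude
# for metal-foil bolometers; the value is an assumption, not from the paper)
ts = [i * 0.005 for i in range(100)]
us = [2.5 * math.exp(-t / 0.12) for t in ts]
tau_est = estimate_cooling_time(ts, us)
```

On real signals one would fit only the portion of the decay well above the noise floor; the noiseless synthetic case above recovers τc exactly.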