Land surface temperature (LST) and sea surface temperature (SST) are key quantities for many environmental models, and remote sensing provides information for estimating them on both regional and global scales. Many algorithms have been devised to estimate LST and SST from satellite data, most of which require a priori information about the surface and the atmosphere. A recently proposed approach relies on support vector machines (SVMs): given satellite data and corresponding in situ measurements, an SVM approximates the relation between them, and this approximation can then be used to estimate unknown surface temperatures from additional satellite data. Such a strategy requires the user to set several internal parameters. In this paper, a method is proposed for automatically setting these parameters to quasi-optimal values in the sense of minimum estimation error. This is achieved by minimizing a functional correlated with the regression error, namely the “span bound,” an upper bound on the leave-one-out (LOO) error that can be computed from the training set alone, without need for a further validation set. Powell's algorithm is adopted to minimize this functional, since it is applicable also to nondifferentiable functions. Experimental results yielded by the proposed method are similar in accuracy to those achieved by cross-validation and by a grid search for the parameter configuration that yields the best test-set accuracy, while dramatically reducing the required computational time, particularly when many training samples are available.
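The overall strategy can be sketched as follows. This is a minimal illustration, not the authors' implementation: the data are synthetic stand-ins for satellite/in situ temperature pairs, and a cross-validation error is used as a simple stand-in for the span bound, which requires access to the trained SVM's internals. What it does show is the key idea of treating the SVM hyperparameters as a vector and minimizing an error functional over them with Powell's derivative-free method.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for (satellite data, in situ temperature) pairs.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (100, 3))                # e.g., channel brightness temperatures
y = 280.0 + 20.0 * X[:, 0] + rng.normal(0, 0.5, 100)  # e.g., surface temperature in K

def objective(log_params):
    # Work in log-space so C, gamma, and epsilon stay positive.
    C, gamma, eps = np.exp(log_params)
    svr = SVR(C=C, gamma=gamma, epsilon=eps)
    # 5-fold cross-validation MSE as a stand-in for the span bound on the LOO error.
    return -cross_val_score(svr, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

# Powell's method is derivative-free, so it tolerates a nondifferentiable objective.
res = minimize(objective, x0=np.log([1.0, 1.0, 0.1]), method="Powell")
C_opt, gamma_opt, eps_opt = np.exp(res.x)
```

A single run of this minimization replaces the exhaustive grid search over (C, gamma, epsilon); the number of objective evaluations grows with the line searches Powell performs rather than with the grid resolution.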