In a deregulated power market, bidding decisions rely on accurate market clearing price prediction. One common forecasting method is the Gaussian radial basis function (GRBF) network, which approximates input-output relationships by building localized Gaussian functions (clusters). Currently, each cluster uses all of the input factors. Certain input factors, however, may not be significant and should be deleted, because they mislead local learning and result in poor predictions. Existing pruning methods for neural networks examine the significance of connections between neurons, and are not applicable to deleting center and standard deviation parameters in a GRBF network, since those parameters carry no notion of connection significance. In this paper, the inverses of the standard deviations are shown to capture a sense of connection, and based on this finding, a new training method is developed to identify and eliminate unimportant input factors. Numerical testing results from two classroom problems and from New England Market Clearing Price prediction show that the new training method leads to significantly improved prediction performance with fewer network parameters.
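The idea behind the inverse-standard-deviation criterion can be sketched as follows. This is a hypothetical illustration, not the paper's exact training algorithm: in a GRBF network with per-input widths, the inverse width 1/σ_jk behaves like a connection strength between input factor k and cluster j. When 1/σ_jk is near zero for every cluster, the Gaussians are flat along input k, so that factor barely influences the output and is a candidate for elimination. The function names and the threshold `tol` below are illustrative assumptions.

```python
import numpy as np

def grbf_predict(X, centers, sigmas, weights):
    """Evaluate a GRBF network. X: (n, d); centers/sigmas: (m, d); weights: (m,)."""
    # Per-dimension scaled distance: each input factor k is weighted by 1/sigma_jk,
    # the quantity that plays the role of a "connection strength" in the paper.
    diff = (X[:, None, :] - centers[None, :, :]) / sigmas[None, :, :]
    phi = np.exp(-0.5 * np.sum(diff ** 2, axis=2))  # (n, m) cluster activations
    return phi @ weights

def unimportant_inputs(sigmas, tol=1e-3):
    """Flag input factors whose inverse widths are negligible in every cluster."""
    inv = 1.0 / sigmas                       # (m, d) inverse standard deviations
    return np.where(np.max(inv, axis=0) < tol)[0]

# Toy check: give input factor 2 a huge width in every cluster, making the
# Gaussians flat along that dimension (i.e., the factor is irrelevant).
centers = np.zeros((3, 3))
sigmas = np.ones((3, 3))
sigmas[:, 2] = 1e6
print(unimportant_inputs(sigmas))            # → [2]
```

In an actual training scheme, such a test would be applied after (or during) fitting the centers and widths, and the network would then be retrained on the reduced input set, which is what yields the smaller parameter count reported in the abstract.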