Sequential Monte Carlo (SMC) methods, also known as particle filtering techniques, are a powerful and versatile class of simulation-based methods for optimal state estimation in nonlinear, non-Gaussian state-space models. In this approach, the posterior probability distributions of interest are approximated by a cloud of random samples that is propagated over time using importance sampling and resampling techniques. Current algorithms are typically designed to optimize a "local" criterion, such as the conditional variance of the importance weights in the importance sampling step. However, the effect of these local optimizations on the global performance of the algorithm is unclear; for example, sampling from a locally suboptimal importance distribution might be beneficial at later time steps. We present here an alternative, principled approach in which the SMC algorithm is parametrized and its parameters are optimized with respect to a "global" performance measure, such as the time-averaged entropy of the importance weights. This is achieved using stochastic approximation techniques. We demonstrate the efficiency of this approach through simulations.
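To make the ingredients concrete, the sketch below implements a minimal bootstrap particle filter (importance sampling with the prior as proposal, followed by multinomial resampling) and records the time-averaged entropy of the normalized importance weights, the kind of "global" performance measure discussed above. The state-space model, all parameter values, and the function names are illustrative assumptions for this sketch, not the parametrized SMC algorithm of the paper.

```python
# Minimal bootstrap particle filter sketch; the model and parameters below
# are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def transition(x, t):
    # Illustrative nonlinear state dynamics (benchmark-style, assumed).
    return (0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t)
            + rng.normal(0.0, np.sqrt(10.0), x.shape))

def likelihood(y, x):
    # Observation model: y = x^2 / 20 + unit-variance Gaussian noise.
    return np.exp(-0.5 * (y - x**2 / 20.0) ** 2)

def weight_entropy(w):
    # Entropy of the normalized importance weights; maximal (log N) when
    # weights are uniform, near zero when the filter has degenerated.
    w = w / w.sum()
    return -np.sum(w * np.log(w + 1e-300))

def run_filter(ys, n_particles=500):
    x = rng.normal(0.0, 1.0, n_particles)
    entropies = []
    for t, y in enumerate(ys):
        x = transition(x, t)                     # propagate (prior proposal)
        w = likelihood(y, x) + 1e-300            # importance weights
        entropies.append(weight_entropy(w))
        idx = rng.choice(n_particles, n_particles, p=w / w.sum())
        x = x[idx]                               # multinomial resampling
    return np.mean(entropies)                    # time-averaged weight entropy

# Simulate a trajectory and observations from the same model, then filter.
xs = [rng.normal(0.0, 1.0)]
ys = []
for t in range(50):
    xs.append(float(transition(np.array([xs[-1]]), t)[0]))
    ys.append(xs[-1] ** 2 / 20.0 + rng.normal(0.0, 1.0))

avg_ent = run_filter(np.array(ys))
```

In the approach described above, quantities such as the proposal distribution would carry tunable parameters, and a stochastic approximation scheme would adjust them to increase this average entropy; the sketch only evaluates the criterion for a fixed (prior) proposal.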