The problem of Krylov subspace estimation based on the sample covariance matrix is addressed. The focus is on signal processing applications where the Krylov subspace is defined by the unknown second-order statistics of the observed samples and the signature vector associated with the desired parameter. In particular, the consistency of traditionally optimal sample estimators is revisited and analytically characterized under a practically more relevant asymptotic regime, in which not only the number of samples but also the observation dimension grow without bound at the same rate. Furthermore, an improved construction of a class of Krylov subspace estimators is proposed, based on the generalized consistent estimation of a set of vector-valued power functions of the observation covariance matrix. To that end, known results from random matrix theory on the estimation of certain spectral functions of the covariance matrix are extended to cover the convergence of not only the covariance eigenspectrum but also the associated eigensubspaces. A new family of estimators is derived that generalizes conventional implementations and remains consistent for observations of arbitrarily high dimension. The proposed estimators are shown to outperform traditional constructions via the numerical evaluation of the solution to two fundamental problems in sensor array signal processing, namely the estimation of the power of an intended source and the estimation of the principal eigenspace and dominant eigenmodes of a structured covariance matrix.
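The conventional construction that the abstract contrasts against can be sketched as follows: form the sample covariance matrix from the observations and span the Krylov subspace by the vector-valued power functions of that matrix applied to the signature vector. This is only an illustrative sketch under simplifying assumptions (real-valued Gaussian data, hypothetical function names), not the estimators proposed in the paper.

```python
import numpy as np

def sample_covariance(X):
    # X: M x N matrix holding N observations of dimension M.
    # Classical (unstructured) sample covariance estimate.
    N = X.shape[1]
    return (X @ X.conj().T) / N

def krylov_basis(R_hat, s, D):
    # Stack the vector-valued power functions R_hat^d s, d = 0, ..., D-1,
    # defining the D-dimensional Krylov subspace, then orthonormalize.
    cols = [s]
    for _ in range(D - 1):
        cols.append(R_hat @ cols[-1])
    K = np.column_stack(cols)
    Q, _ = np.linalg.qr(K)
    return Q  # orthonormal basis of the estimated Krylov subspace

# Toy usage: M-dimensional observations, signature vector s.
rng = np.random.default_rng(0)
M, N, D = 8, 200, 3
s = np.ones(M) / np.sqrt(M)
X = rng.standard_normal((M, N))
Q = krylov_basis(sample_covariance(X), s, D)
```

The abstract's point is that estimators built this way are consistent only as N grows with M fixed; when M and N grow at the same rate, the sample eigenspectrum and eigensubspaces do not converge to their true counterparts, which motivates the corrected, generalized consistent estimators developed in the paper.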