Large multiple-input multiple-output (MIMO) systems with tens or hundreds of antennas have shown great potential for the next generation of wireless communications to support high spectral efficiency. However, because optimal MIMO detection is non-deterministic polynomial-time (NP)-hard, large MIMO systems impose stringent requirements on the design of reliable and computationally efficient detectors. Recently, lattice reduction (LR) techniques have been applied to improve the performance of low-complexity detectors for MIMO systems without dramatically increasing their complexity. Most existing LR algorithms are designed to improve the orthogonality of channel matrices, which is not directly related to the error performance. In this paper, we propose element-based lattice reduction (ELR) algorithms that reduce the diagonal elements of the noise covariance matrix of linear detectors and thus enhance the asymptotic performance of linear detectors. The general goal is formulated as solving a "shortest longest vector reduction" or a stronger version, "shortest longest basis reduction," both of which require high complexity to solve optimally. Our proposed ELR algorithms find sub-optimal solutions to these reductions with low complexity and good performance. The fundamental properties of the ELR algorithms are investigated. Simulations show that the proposed ELR-aided detectors yield better error performance than existing low-complexity detectors for large MIMO systems while maintaining lower complexity.
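The objective above can be illustrated numerically. For a zero-forcing (ZF) detector, the noise covariance is proportional to diag((H^H H)^{-1}), and an LR-aided detector replaces the channel H with H·T for a unimodular integer matrix T. The sketch below is not the ELR algorithm itself but a minimal greedy illustration of its objective, assuming elementary column updates of the form t_i ← t_i + λ·t_j with λ ∈ {−1, +1}: it accepts any update that shrinks the largest diagonal element of the ZF noise covariance, which is the quantity governing the worst-stream error performance.

```python
import numpy as np

def noise_cov_diag(H):
    # Diagonal of the ZF noise covariance, up to the noise power sigma^2:
    # diag((H^H H)^{-1}). Smaller entries mean less noise enhancement.
    return np.diag(np.linalg.inv(H.conj().T @ H)).real

def greedy_element_reduction(H, max_iter=100):
    """Greedy sketch (illustrative, not the paper's ELR): search elementary
    unimodular column updates t_i <- t_i + lam*t_j (lam = +/-1) that reduce
    the largest diagonal element of (H^H H)^{-1}; return H@T and T."""
    n = H.shape[1]
    T = np.eye(n, dtype=int)  # unimodular transform, det(T) = +/-1
    for _ in range(max_iter):
        d = noise_cov_diag(H @ T)
        best_gain, best_T = 0.0, None
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                for lam in (-1, 1):
                    Tc = T.copy()
                    Tc[:, i] += lam * Tc[:, j]  # elementary unimodular update
                    gain = d.max() - noise_cov_diag(H @ Tc).max()
                    if gain > best_gain + 1e-12:
                        best_gain, best_T = gain, Tc
        if best_T is None:  # no improving update: a local optimum
            break
        T = best_T
    return H @ T, T
```

In an LR-aided linear detector, detection is then performed in the transformed domain with H·T and the integer-valued result is mapped back through T; the diagonal reduction translates into better asymptotic error performance for the same linear detector.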