Visions of future power systems feature high penetrations of inverters, which convert power between dc (direct current) and ac (alternating current). The behavior of these devices depends on the choice and implementation of their control algorithms; in particular, there is a tradeoff between dc bus ripple and ac power quality. This study examines that tradeoff across four control modes. Mathematical derivations predict the key implications of each mode, and an inverter is then studied in both simulation and hardware at the 10 kVA scale, in microgrid environments with differing grid impedance and power quality. Voltage-drive mode is found to provide the best ac power quality, but at the expense of high dc bus ripple. Sinusoidal current generation and dual-sequence control provide relatively low dc bus ripple with relatively small effects on power quality. High-bandwidth dc-bus-ripple-minimization mode works well where grid impedance is low, but is highly unsuitable in higher-impedance microgrid environments and/or at low switching frequencies. The findings also suggest that the certification procedures of G5/4, P29, and IEEE 1547 may not adequately cover all applications and scenarios.
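As a brief sketch of why the tradeoff arises (this is the standard unbalanced-grid power analysis, not a derivation taken from this study; the symbols $V^{+}, V^{-}, I^{+}, C, V_{dc}$ are introduced here for illustration): if the grid voltage contains positive- and negative-sequence components of magnitudes $V^{+}$ and $V^{-}$, and the inverter draws balanced sinusoidal (positive-sequence-only) currents of magnitude $I^{+}$, the instantaneous three-phase power contains a double-line-frequency term,
\[
p(t) \;\approx\; \underbrace{\tfrac{3}{2}\,V^{+} I^{+} \cos\varphi}_{\text{average power } P} \;+\; \underbrace{\tfrac{3}{2}\,V^{-} I^{+} \cos\!\left(2\omega t + \theta\right)}_{\text{ripple power } \tilde{p}(t)},
\]
which must be buffered by the dc-link capacitor $C$, producing a peak dc bus voltage ripple of approximately
\[
\Delta V_{dc} \;\approx\; \frac{\tilde{P}}{2\,\omega\, C\, V_{dc}},
\]
where $\tilde{P}$ is the amplitude of $\tilde{p}(t)$. Suppressing $\tilde{p}(t)$ instead requires the controller to inject unbalanced or distorted currents, which is precisely what degrades ac power quality.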