In recurrent neural networks, excessive activity oscillations can severely disrupt learning. Inhibitory feedback can offset a dominating positive feedback; however, if activity is to be controlled during training, synaptic modification at excitatory synapses would seem to require corresponding modification at inhibitory synapses. Here, we present a novel synaptic modification rule that governs the strength of the excitatory afferents (inputs) to activity-controlling inhibitory interneurons. A hippocampal CA3 model incorporating this rule avoids the performance-destroying activity oscillations. In the minimal model used here, the rule implements an error-correcting-like procedure at each excitatory input to a global feedback inhibitory interneuron. Simulations that include this modification rule demonstrate robust sequence learning and eliminate the major activity fluctuations that fall outside the biologically plausible range. Importantly, simulations using this rule adapt quickly and selectively to large, discontinuous jumps in the training sequences.
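To make the mechanism concrete, the following is a minimal, hypothetical sketch (not the paper's actual CA3 model; all parameters, the threshold dynamics, and the spontaneous-firing term are assumptions) of an error-correcting-like rule on the excitatory afferents of a single global feedback inhibitory interneuron: afferent weights from active cells are strengthened when network activity exceeds a target level and weakened when it falls below, so the interneuron's feedback tracks and damps activity excursions.

```python
import numpy as np

def run(n_steps=300, n=100, target=0.1, eps=0.05, theta=1.0, seed=0):
    """Recurrent excitatory network with one global feedback inhibitory
    interneuron; eps is the learning rate of the interneuron's afferents
    (eps=0 disables the modification rule). Hypothetical sketch only."""
    rng = np.random.default_rng(seed)
    # sparse random recurrent excitation (20% connectivity, unit weights)
    W = (rng.random((n, n)) < 0.2).astype(float)
    np.fill_diagonal(W, 0.0)
    w = np.zeros(n)                             # interneuron afferent weights
    x = (rng.random(n) < target).astype(float)  # initial activity pattern
    history = []
    for _ in range(n_steps):
        inhibition = w @ x                      # global inhibitory feedback
        drive = W @ x - inhibition              # net input to each neuron
        # a small spontaneous firing rate keeps the network from locking
        # into silence when inhibition transiently overshoots
        x = ((drive > theta) | (rng.random(n) < 0.02)).astype(float)
        activity = x.mean()
        # error-correcting-like update: afferents from currently active
        # cells strengthen when activity exceeds the target, weaken below it
        w = np.clip(w + eps * x * (activity - target), 0.0, None)
        history.append(activity)
    return np.array(history)

uncontrolled = run(eps=0.0)   # fixed (zero) inhibitory afferent weights
controlled = run(eps=0.05)    # afferents governed by the modification rule
```

Under these assumed parameters, the uncontrolled network saturates near full activity, while the error-correcting afferents hold the controlled network's activity near the target fraction; because the inhibitory feedback scales with current activity, the control is self-limiting rather than absorbing.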