The continued scaling of minimum feature sizes in contemporary chips has made circuit performance increasingly susceptible to process variations. Many approaches have been proposed in recent years to estimate circuit performance bounds with respect to process or circuit parameter variations, among which the Monte Carlo method is the most popular. However, the Monte Carlo method usually underestimates the bounds and requires a large number of simulation runs to achieve accurate estimates. An approach based on Kharitonov's method has recently been proposed, but it requires all coefficient variations in the system transfer function to be independent of each other. Unfortunately, most real circuits do not satisfy this constraint, so the method tends to overestimate the performance bounds in real applications. This short paper proposes an optimization approach on the transfer function of a linear circuit to evaluate its performance bounds under process variations. The proposed method computes the magnitude and phase bounds of a linear system at each frequency point, and it removes the parameter-independence requirement of Kharitonov's method. The method has been applied to a CMOS two-stage amplifier, and the experimental results show that it evaluates the magnitude and phase bounds of a linear system accurately in much less computation time than the Monte Carlo method. All experiments were carried out using a standard 0.35-μm CMOS process technology.
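To make the contrast concrete, the following minimal sketch (not the paper's algorithm) bounds the magnitude of a hypothetical first-order low-pass transfer function H(s) = 1/(1 + sRC) at one frequency point, with R and C varying in intervals. Because |H(jω)| here depends monotonically on the product RC, the true bounds are attained at corners of the parameter box; random (Monte Carlo) sampling of the same box necessarily stays inside those bounds, illustrating its tendency to underestimate the spread:

```python
import itertools
import random

def magnitude(omega, R, C):
    # |H(jw)| for the assumed first-order low-pass H(s) = 1 / (1 + sRC)
    return abs(1.0 / (1.0 + 1j * omega * R * C))

def corner_bounds(omega, R_rng, C_rng):
    # Evaluate |H(jw)| at every corner of the (R, C) parameter box.
    # For this monotone example the corners give the exact bounds.
    vals = [magnitude(omega, R, C)
            for R, C in itertools.product(R_rng, C_rng)]
    return min(vals), max(vals)

def monte_carlo_bounds(omega, R_rng, C_rng, runs=200, seed=0):
    # Random sampling of the parameter box: every sample lies inside
    # the box, so the sampled min/max can only shrink the true interval.
    rng = random.Random(seed)
    vals = [magnitude(omega, rng.uniform(*R_rng), rng.uniform(*C_rng))
            for _ in range(runs)]
    return min(vals), max(vals)

omega = 2.0 * 3.141592653589793 * 1e3          # 1 kHz frequency point
R_rng, C_rng = (9e3, 11e3), (9e-9, 11e-9)      # +/-10% around nominal
lo, hi = corner_bounds(omega, R_rng, C_rng)
mc_lo, mc_hi = monte_carlo_bounds(omega, R_rng, C_rng)
```

Here `lo <= mc_lo <= mc_hi <= hi` always holds: the Monte Carlo interval sits strictly inside the exact one unless a sample happens to hit a corner. The paper's contribution is to obtain such per-frequency magnitude and phase bounds by optimization for general linear circuits, where corner enumeration is not sufficient.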