Computerized detection schemes have the potential to increase diagnostic accuracy in medical imaging by alerting radiologists to lesions that they initially overlooked. These schemes typically employ multiple parameters, such as threshold values or filter weights, to arrive at a detection decision. For the system to achieve high performance, the values of these parameters need to be set optimally. Conventional optimization techniques are designed to optimize a scalar objective function. The task of optimizing the performance of a computerized detection scheme, however, is clearly a multiobjective problem: the authors wish to simultaneously improve the sensitivity and the false-positive rate of the system. In this work the authors investigate a multiobjective approach to optimizing computerized rule-based detection schemes. In a multiobjective optimization, multiple objectives are simultaneously optimized, with the objective now being a vector-valued function. The multiobjective optimization problem admits a set of solutions, known as the Pareto-optimal set, whose members are equivalent in the absence of any information regarding the relative preferences of the objectives. The performances of the Pareto-optimal solutions can be interpreted as operating points on an optimal free-response receiver operating characteristic (FROC) curve, which lies on or above every FROC curve achievable with the given dataset and detection scheme. It is demonstrated that generating FROC curves in this manner eliminates several known problems with conventional FROC curve generation techniques for rule-based detection schemes. The authors employ the multiobjective approach to optimize a rule-based scheme for clustered microcalcification detection that has been developed in the authors' laboratory.
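The Pareto-optimal set described above can be illustrated with a minimal sketch. This is not the authors' optimization algorithm; it simply shows, for a set of hypothetical operating points (sensitivity, false positives per image) obtained from different parameter settings, how the nondominated points that would lie on the FROC curve can be identified.

```python
def pareto_front(points):
    """Return the nondominated (Pareto-optimal) operating points.

    Each point is (sensitivity, false_positives_per_image). A point is
    dominated if some other point has sensitivity >= its sensitivity and
    a false-positive rate <= its rate (and differs in at least one).
    """
    front = []
    for p in points:
        dominated = any(
            q[0] >= p[0] and q[1] <= p[1] and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    # Sorting by false-positive rate orders the points along the FROC curve.
    return sorted(front, key=lambda pt: pt[1])

# Hypothetical operating points from five different parameter settings.
candidates = [(0.80, 1.0), (0.85, 2.0), (0.70, 1.5), (0.90, 4.0), (0.85, 3.0)]
print(pareto_front(candidates))  # → [(0.8, 1.0), (0.85, 2.0), (0.9, 4.0)]
```

Points such as (0.70, 1.5) are discarded because another setting achieves both higher sensitivity and fewer false positives; the surviving points are mutually incomparable without a stated preference between the two objectives.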