While fuzzy inference systems (FISs) have been studied extensively over the past decades, the minimum enclosing ball (MEB) problem was introduced only recently to develop fast and scalable methods in pattern classification and machine learning. In this paper, the relationship between these two apparently different data modeling techniques is explored. First, based on the reduced-set density estimator, a bridge between the MEB problem and the FIS is established. Then, an important finding is revealed: under certain conditions, the Mamdani-Larsen FIS (ML-FIS) can be translated into a special kernelized MEB problem, i.e., a center-constrained MEB problem. Thus, fast kernelized MEB approximation algorithms can be adopted to construct the ML-FIS efficiently. Here, we propose the use of a core vector machine (CVM), a fast kernelized MEB approximation algorithm for support vector machine (SVM) training, to accomplish this task. The proposed fast ML-FIS training algorithm has the following merits: (1) the number of fuzzy rules is determined automatically by CVM training, and (2) fast ML-FIS training on large datasets is achieved, since the upper bound on the time complexity of learning the ML-FIS parameters is linear in the dataset size N, and the upper bound on the corresponding space complexity is theoretically independent of N. Our experiments on simulated and real datasets confirm these advantages of the proposed training method and also demonstrate its superior robustness. This paper not only represents a first study of the relationship between the MEB problem and FISs, but also points out the mutual transformation between kernel methods and FISs under the framework of the Gaussian mixture model and the MEB problem.
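As a rough illustration of why MEB-based training scales this way (a sketch under stated assumptions, not the paper's algorithm), consider the classic Bădoiu-Clarkson core-set scheme that fast MEB approximation algorithms such as the CVM build on: each iteration scans the data once for the farthest point from the current center and shifts the center toward it, so the running time is linear in N per iteration while the number of iterations, and hence the core set, depends only on the approximation parameter eps, not on N.

```python
import math

def approx_meb(points, eps=0.1):
    """Badoiu-Clarkson (1+eps)-approximation of the minimum enclosing ball.

    After ceil(1/eps^2) iterations, the ball centered at c with radius equal
    to the farthest-point distance is within a (1+eps) factor of the optimal
    MEB radius. Each iteration costs O(N); the iteration count depends only
    on eps, which is the source of the N-independent space bound.
    """
    c = list(points[0])                       # start at an arbitrary data point
    iters = math.ceil(1.0 / eps ** 2)
    for i in range(1, iters + 1):
        # farthest point from the current center (one linear scan)
        p = max(points, key=lambda q: sum((qi - ci) ** 2 for qi, ci in zip(q, c)))
        # move the center a 1/(i+1) fraction of the way toward it
        c = [ci + (pi - ci) / (i + 1) for ci, pi in zip(c, p)]
    radius = max(math.sqrt(sum((qi - ci) ** 2 for qi, ci in zip(q, c)))
                 for q in points)
    return c, radius

# corners of the unit square: optimal MEB has center (0.5, 0.5), radius sqrt(2)/2
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
center, radius = approx_meb(pts, eps=0.1)
```

The kernelized variant used by the CVM applies the same update in feature space, representing the center as a sparse combination of core-set points; in the paper's setting those core-set points are what yield the fuzzy rules of the ML-FIS.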
Date of Publication: Feb. 2009