Abstract:
One difficult problem often faced in cluster analysis is how to choose the number of clusters. We propose to choose the number of clusters by optimizing the Bayesian information criterion (BIC), a model selection criterion from the statistics literature. We develop a termination criterion for hierarchical clustering methods that optimizes the BIC in a greedy fashion. The resulting algorithms are fully automatic. Our experiments on Gaussian mixture modeling and speaker clustering demonstrate that the BIC is able to choose the number of clusters according to the intrinsic complexity present in the data.
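As a rough illustration of the idea, and not the authors' implementation, the sketch below uses scikit-learn's GaussianMixture to pick the number of mixture components by BIC; note that sklearn's bic() is defined so that lower values are better, whereas the abstract phrases the task as optimizing the criterion. The function name choose_n_clusters and the synthetic data are assumptions made for this example.

    # Minimal sketch: choose the number of Gaussian mixture components by BIC
    # (illustrative only; not the paper's original algorithm or code).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def choose_n_clusters(X, max_k=10, random_state=0):
        """Fit GMMs with 1..max_k components and return the k minimizing BIC."""
        best_k, best_bic = None, np.inf
        for k in range(1, max_k + 1):
            gmm = GaussianMixture(n_components=k, random_state=random_state).fit(X)
            bic = gmm.bic(X)  # -2*log-likelihood + (number of parameters)*log(N)
            if bic < best_bic:
                best_k, best_bic = k, bic
        return best_k

    # Example: three well-separated Gaussian clusters in 2-D
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(loc=c, scale=0.5, size=(200, 2)) for c in (0, 5, 10)])
    print(choose_n_clusters(X))  # typically prints 3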
Date of Conference: 15 May 1998
Date Added to IEEE Xplore: 06 August 2002
Print ISBN: 0-7803-4428-6
Print ISSN: 1520-6149