The paper introduces weighted latent Dirichlet allocation (WLDA) for cluster ensembles. The idea is that, in a cluster ensemble, not every base clustering is equally important; we therefore assign a different weight to each base clustering and model the result of each base clustering as a multinomial distribution. First, we formulate WLDA as a soft cluster ensemble model and define its latent variables for the cluster ensemble task. Second, we derive an inference procedure for WLDA using variational approximation and present the corresponding EM algorithm. Third, we evaluate WLDA on datasets with large numbers of instances. Compared with MCLA, CSPA, and HGPA, WLDA achieves better results, and its outputs also reveal the structure of the data points.
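To make the setup concrete, the sketch below illustrates one common way to cast a weighted cluster ensemble as an LDA problem: each data point becomes a "document" whose "words" are the labels it receives from the base clusterings, with word counts scaled by each clustering's weight, and the fitted topics act as consensus clusters. The datasets, weight values, and the use of scikit-learn's standard `LatentDirichletAllocation` are illustrative assumptions, not the paper's actual WLDA model or inference procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics import adjusted_rand_score

# Toy data; the dataset and all parameters below are hypothetical.
X, y = make_blobs(n_samples=300, centers=3, random_state=0)

# Base clusterings with different k; each gets a (hypothetical) weight.
base = [KMeans(n_clusters=k, n_init=10, random_state=s).fit_predict(X)
        for k, s in [(3, 1), (4, 2), (5, 3)]]
weights = [2.0, 1.0, 1.0]

# Build a document-term matrix: one "document" per data point, one
# "word" per (base clustering, cluster label) pair, counts scaled by
# the clustering's weight.
n = X.shape[0]
sizes = [b.max() + 1 for b in base]
offsets = np.cumsum([0] + sizes[:-1])
counts = np.zeros((n, sum(sizes)))
for b, w, off in zip(base, weights, offsets):
    counts[np.arange(n), b + off] += w

# Fit LDA; each point's dominant topic is its consensus cluster.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
consensus = lda.fit_transform(counts).argmax(axis=1)
print(adjusted_rand_score(y, consensus))
```

This is only a plug-in approximation of the weighted-ensemble idea; the paper's WLDA instead builds the weights into the generative model and infers them jointly via variational EM.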