
A Review for Weighted MinHash Algorithms


Abstract:

Data similarity (or distance) computation is a fundamental research topic that underpins many high-level, similarity-based applications in machine learning and data mining. However, in large-scale real-world scenarios, exact similarity computation has become daunting due to the "3V" nature (volume, velocity, and variety) of big data. In this setting, hashing techniques have been verified, in both theory and practice, to conduct similarity estimation efficiently. Currently, MinHash is a popular technique for efficiently estimating the Jaccard similarity of binary sets, and weighted MinHash generalizes it to estimate the generalized Jaccard similarity of weighted sets. This review categorizes and discusses the existing weighted MinHash algorithms. We mainly group them into quantization-based approaches, "active index"-based ones, and others, and show the evolution and inherent connections among the weighted MinHash algorithms, from integer-weighted variants to real-valued ones. We have also developed a Python toolbox for the algorithms and released it on our GitHub. Finally, we experimentally conduct a comprehensive study of the standard MinHash algorithm and the weighted MinHash ones on similarity estimation error and an information retrieval task.
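The MinHash idea summarized above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation or toolbox: the universal hash family `(a*x + b) mod P` and the signature length of 256 are assumptions chosen for the example.

```python
import random

def minhash_signature(s, hash_funcs):
    """One signature slot per hash function: the minimum hash value over the set."""
    return [min(h(x) for x in s) for h in hash_funcs]

def estimate_jaccard(sig_a, sig_b):
    """The fraction of matching slots is an unbiased estimate of Jaccard similarity."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

random.seed(0)
P = 2**31 - 1  # a Mersenne prime for the (assumed) universal hash family
hash_funcs = [
    (lambda a, b: (lambda x: (a * hash(x) + b) % P))(
        random.randrange(1, P), random.randrange(P))
    for _ in range(256)  # more hash functions -> lower estimation variance
]

# Two overlapping binary sets with true Jaccard similarity 30/90 = 1/3.
A = set(range(0, 60))
B = set(range(30, 90))
true_j = len(A & B) / len(A | B)
est_j = estimate_jaccard(minhash_signature(A, hash_funcs),
                         minhash_signature(B, hash_funcs))
```

With 256 hash functions the standard error of the estimate is roughly sqrt(J(1-J)/256), so `est_j` lands close to `true_j` without ever computing the set intersection explicitly; weighted MinHash extends the same signature-comparison scheme to weighted sets.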
Published in: IEEE Transactions on Knowledge and Data Engineering ( Volume: 34, Issue: 6, 01 June 2022)
Page(s): 2553 - 2573
Date of Publication: 02 September 2020


