Abstract:
Exploiting both labeled and unlabeled instances of various problems is a promising strategy, since useful information contained in the latter pool of data is discarded by purely supervised approaches. However, the amount of unlabeled data that must be examined is usually extremely large, so efficient algorithms are required in such cases. The Hidden Naive Bayes (HNB) model constitutes a computationally cheap variant of Bayesian networks. In this work, HNB is used as the base classifier of a Self-training scheme for classification problems. Its results over 36 UCI datasets show that robust behavior can be achieved with only one hidden layer, even under strict time restrictions.
Published in: 2016 7th International Conference on Information, Intelligence, Systems & Applications (IISA)
Date of Conference: 13-15 July 2016
Date Added to IEEE Xplore: 19 December 2016
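The Self-training scheme summarized in the abstract can be sketched with scikit-learn's `SelfTrainingClassifier`. Since the HNB model itself is not available in scikit-learn, a Gaussian Naive Bayes base learner stands in here purely to illustrate the scheme; the synthetic data and the 0.9 confidence threshold are likewise assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.semi_supervised import SelfTrainingClassifier

# Synthetic two-class data; mask ~80% of the labels with -1,
# which scikit-learn interprets as "unlabeled".
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
rng = np.random.RandomState(0)
y_semi = y.copy()
y_semi[rng.rand(len(y)) < 0.8] = -1

# Self-training: fit the base classifier on the labeled pool, then
# iteratively add its most confident predictions on the unlabeled
# pool as pseudo-labels and refit.
model = SelfTrainingClassifier(GaussianNB(), threshold=0.9)
model.fit(X, y_semi)
preds = model.predict(X)
```

In the paper's setting, HNB plays the role of `GaussianNB` above: a Bayes-family classifier that is cheap enough to retrain repeatedly inside the self-training loop.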