A neural network tree (NNTree) is a hybrid learning model whose overall structure is a decision tree (DT), with each nonterminal node containing a neural network (NN). Using NNTrees, it is possible to learn new knowledge online by adjusting the NNs in the nonterminal nodes. It is also possible to understand the learned knowledge online, because the NNs in the nonterminal nodes are usually very small and can be interpreted easily. So far, we have studied retraining of NNTrees by adjusting the NNs in the nonterminal nodes, with the structure of the tree fixed during retraining. We found that this kind of retraining is good for size reduction in offline learning if the training set is highly redundant. However, updating the NNs alone is not enough for online learning. In this paper, we introduce two methods for online learning of NNTrees. The first is SGU (simple growing up), and the second is GUWL (growing up with learning). The effectiveness of these methods is compared through experiments with several public databases.
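To make the hybrid structure concrete, the following is a minimal sketch of an NNTree in Python. It is an illustration under stated assumptions, not the paper's implementation: the class names (`TinyNN`, `NNTreeNode`), the use of a single-layer perceptron as the routing NN, and the argmax routing rule are all hypothetical choices made for this example; the actual NNs and training procedures in the paper may differ.

```python
import numpy as np


class TinyNN:
    """A small single-layer perceptron used to route samples to child branches.

    This is an illustrative stand-in for the small NNs described in the text;
    the paper's node networks may have a different architecture.
    """

    def __init__(self, n_inputs, n_branches, rng):
        self.W = rng.standard_normal((n_branches, n_inputs)) * 0.1
        self.b = np.zeros(n_branches)

    def route(self, x):
        # Send the sample down the branch with the highest activation.
        return int(np.argmax(self.W @ x + self.b))


class NNTreeNode:
    """A nonterminal node holds a small NN; a leaf holds a class label."""

    def __init__(self, nn=None, children=None, label=None):
        self.nn = nn                    # None for leaf nodes
        self.children = children or []  # child subtrees (nonterminal only)
        self.label = label              # class label (leaf only)

    def predict(self, x):
        if self.nn is None:             # leaf: return the stored class label
            return self.label
        return self.children[self.nn.route(x)].predict(x)


rng = np.random.default_rng(0)
# A depth-1 NNTree: one routing NN at the root with two leaf children.
root = NNTreeNode(
    nn=TinyNN(n_inputs=2, n_branches=2, rng=rng),
    children=[NNTreeNode(label="A"), NNTreeNode(label="B")],
)
print(root.predict(np.array([0.5, -1.0])))
```

Retraining in the fixed-structure sense discussed above would update only the `W` and `b` parameters of each node's NN, leaving the tree's shape untouched; the online methods introduced in the paper (SGU and GUWL) additionally grow the tree.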