Abstract:
Multitask genetic programming (GP) methods have been applied to various domains, such as classification, regression, and combinatorial optimization problems. Most existing multitask GP methods are designed based on tree-based structures, which are poor at reusing building blocks since each subtree passes its output to only one parent. This may limit the design and performance of knowledge sharing in multitask optimization. Unlike tree-based GP, building blocks in linear GP (LGP) can easily be reused by more than one parent. Moreover, existing multitask GP methods always allocate each individual to a specific task and have to duplicate genetic materials from task to task during knowledge transfer, which is inefficient and often produces redundancy. In contrast, it is natural for an LGP individual to produce multiple distinct outputs, which enables each LGP individual to solve multiple tasks simultaneously. With this in mind, we propose a new multitask LGP method that transfers knowledge via multioutput individuals (i.e., individuals shared among tasks). By integrating different solutions into one multioutput individual, the proposed method efficiently reuses common knowledge among tasks while maintaining distinct behaviors for each task. The empirical results show that the proposed method achieves significantly better test performance than state-of-the-art multitask GP methods. Further analyses verify that the new knowledge transfer mechanism can adjust the transfer rate automatically and thus improves its effectiveness.
Published in: IEEE Transactions on Evolutionary Computation (Volume 28, Issue 6, December 2024)