A Multi-Armed Bandit selection strategy for Hyper-heuristics