XNOR Neural Engine: A Hardware Accelerator IP for 21.6-fJ/op Binary Neural Network Inference