Echo State Network Based on Improved Knowledge Distillation for Edge Intelligence
Abstract
Echo state network (ESN), a novel artificial neural network, has drawn much attention for time series prediction in edge intelligence. However, ESN has limited long-term memory, which impacts its prediction performance, and it incurs a high computational overhead when deployed on edge devices. We first introduce knowledge distillation into reservoir structure optimization and then propose an echo state network based on improved knowledge distillation (ESN-IKD) for edge intelligence, aiming to improve prediction performance and reduce computational overhead. The ESN-IKD model is constructed with a classic ESN as the student network, a long short-term memory (LSTM) network as the teacher network, and an ESN with a double-loop reservoir structure as the assistant network. The student network learns the long-term memory capability of the teacher network with the help of the assistant network. A training algorithm for ESN-IKD is proposed that corrects the learning direction through the assistant network and eliminates redundant knowledge through iterative pruning, thus addressing the problems of erroneous learning and redundant learning in the traditional knowledge distillation process. Extensive simulation experiments show that ESN-IKD achieves good time series prediction performance on both long-term and short-term memory tasks while incurring a lower computational overhead.
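To make the student-side setup concrete, the sketch below gives a minimal, assumption-laden illustration in Python/NumPy: a classic ESN reservoir whose ridge-regression readout is fit against a convex mix of ground-truth targets and a teacher's predictions, in the spirit of knowledge distillation. The reservoir size, spectral radius, blending weight `alpha`, ridge penalty, and the noisy stand-in teacher are all illustrative choices, not the paper's ESN-IKD algorithm, which additionally uses a double-loop assistant network, learning-direction correction, and iterative pruning.

```python
# Minimal distillation-flavoured ESN sketch (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Random input and reservoir weights; the reservoir is rescaled to a spectral
# radius below 1 so the echo state property (fading memory) holds.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Collect reservoir states for an input sequence of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

def fit_readout(states, y_true, y_teacher, alpha=0.5, ridge=1e-6):
    """Ridge-regression readout fit on a convex mix of hard and teacher targets."""
    target = alpha * y_true + (1.0 - alpha) * y_teacher
    A = states.T @ states + ridge * np.eye(n_res)
    return np.linalg.solve(A, states.T @ target)

# Toy usage: one-step-ahead prediction of a sine wave, with a slightly noisy
# "teacher" standing in for the LSTM teacher network of the abstract.
t = np.arange(500)
u_seq = np.sin(0.1 * t).reshape(-1, 1)
y_true = np.sin(0.1 * (t + 1)).reshape(-1, 1)
y_teacher = y_true + 0.01 * rng.standard_normal(y_true.shape)

S = run_reservoir(u_seq)
W_out = fit_readout(S, y_true, y_teacher)
print("train MSE:", np.mean((S @ W_out - y_true) ** 2))
```

In this toy setup only the linear readout is trained, which is what keeps ESN-style models attractive for resource-constrained edge devices; the distillation term simply biases that readout toward the teacher's behaviour.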