Volume 29 Issue 6
Dec.  2020
Citation: CHEN Jinyin, LIN Xiang, GAO Shangtengda, et al., “A Fast Evolutionary Learning to Optimize CNN,” Chinese Journal of Electronics, vol. 29, no. 6, pp. 1061-1073, 2020, doi: 10.1049/cje.2020.09.007

A Fast Evolutionary Learning to Optimize CNN

doi: 10.1049/cje.2020.09.007
Funds: This work is supported by the Zhejiang Provincial Natural Science Foundation of China (No. LY19F020025) and the Major Special Funding for "Science and Technology Innovation 2025" in Ningbo (No. 2018B10063).
More Information
  • Corresponding author: LIN Xiang is a master's student at the Institute of Information Engineering, Zhejiang University of Technology. His research interests cover data mining and its applications, social network data mining, and intelligent computing. (Email: lynnzlnx@163.com)
  • Received Date: 2019-10-31
  • Publish Date: 2020-12-25
  • Abstract: Deep neural networks (DNNs) show great performance in many applications. The convolutional neural network (CNN) is one of the classic DNNs, and various modified CNNs have been proposed, such as DenseNet, GoogLeNet, and ResNet. For different tasks, a particular CNN structure may show its advantage; however, how to design an effective CNN model for a practical task remains a puzzle. In this paper, we model the architecture optimization of CNN as an optimization problem and design a Genetic network programming based Fast evolutionary learning (GNP-FEL) method to optimize CNN. GNP-FEL contains three main ideas: first, GNP is adopted to optimize the CNN architecture and hyperparameters, which can build diverse network structures and make network parameters self-evolve; second, multi-objective optimization is designed to balance CNN model efficiency and structural compactness; last, a novel incremental training method is proposed to train offspring CNN models in GNP, which sharply reduces time complexity. Experiments validate that GNP-FEL can quickly evolve a CNN classifier with a sufficiently compact architecture, and that the classifier achieves classification performance comparable to state-of-the-art CNN models.
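The abstract describes a two-objective evolutionary search over CNN structures. As a rough illustration only (not the authors' GNP-FEL implementation), the sketch below evolves a population of hypothetical CNN hyperparameter genomes under two objectives, a validation-error proxy and a parameter-count proxy for structural compactness, using simple Pareto (non-dominated) selection. The genome fields, the mock fitness functions, and the selection loop are all assumptions made for illustration; the paper's actual method uses genetic network programming and incremental training of offspring models.

import random

random.seed(0)

# Hypothetical search space for a small CNN: these fields and values are
# illustrative assumptions, not the encoding used in the paper.
SEARCH_SPACE = {
    "num_blocks": [2, 3, 4, 5],
    "filters": [16, 32, 64, 128],
    "kernel_size": [3, 5],
}

def random_genome():
    return {key: random.choice(values) for key, values in SEARCH_SPACE.items()}

def mutate(genome):
    # Re-sample one randomly chosen field of the genome.
    child = dict(genome)
    key = random.choice(list(SEARCH_SPACE))
    child[key] = random.choice(SEARCH_SPACE[key])
    return child

def param_proxy(genome):
    # Rough stand-in for parameter count (the compactness objective).
    return genome["num_blocks"] * genome["filters"] * genome["kernel_size"] ** 2

def error_proxy(genome):
    # Placeholder for (incrementally) training the decoded CNN and returning
    # 1 - validation accuracy; a noisy synthetic score keeps the sketch
    # self-contained and runnable.
    return 1.0 / (1 + genome["num_blocks"] * genome["filters"]) + random.uniform(0.0, 0.05)

def dominates(fa, fb):
    # fa dominates fb if it is no worse on every objective and better on at least one.
    return all(a <= b for a, b in zip(fa, fb)) and any(a < b for a, b in zip(fa, fb))

def pareto_front(population):
    scored = [(genome, (error_proxy(genome), param_proxy(genome))) for genome in population]
    return [genome for genome, fit in scored
            if not any(dominates(other, fit) for _, other in scored if other is not fit)]

population = [random_genome() for _ in range(12)]
for generation in range(10):
    survivors = pareto_front(population)            # keep non-dominated parents
    offspring = [mutate(random.choice(survivors))   # refill the population by mutation
                 for _ in range(len(population) - len(survivors))]
    population = survivors + offspring

for genome in pareto_front(population):
    print(genome, "-> approx. params:", param_proxy(genome))

A full system would decode each genome into an actual CNN, train it (incrementally, in the paper's case), and use NSGA-II-style sorting rather than this naive front extraction; the structure of the loop, however, mirrors the balance between accuracy and compactness described above.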
  • Q. Xuan, H. Xiao, C. Fu, et al., "Evolving convolutional neural network and its application in fine-grained visual categorization", IEEE Access, Vol.6, pp.31110-31116, 2018.
    R. Collobert and J. Weston, "A unified architecture for natural language processing:Deep neural networks with multitask learning", Proceedings of the 25th International Conference on Machine Learning, Helsinki, Finland, pp.160-167, 2008.
    Y. Liu, Y. Fan and J. Chen, "Flame images for oxygen content prediction of combustion systems using DBN", Energy and Fuels, Vol.31, No.8, pp.8876-8783, 2017.
    Q. Xuan, B. Fang, Y. Liu, et al., "Automatic pearl classification machine based on multistream convolutional neural network", IEEE Transactions on Industrial Electronics, Vol.65, No.8, pp.6538-6547, 2018.
    Y. Liu, C. Yang, Z. Gao, et al., "Ensemble deep kernel learning with application to quality prediction in industrial polymerization processes", Chemometrics and Intelligent Laboratory Systems, Vol.174, pp.15-21, 2018.
    J. Li, J. Zhang, D. Chang, et al., "Computer-assisted detection of colonic polyps using improved faster R-CNN", Chinese Journal of Electronics, Vol.28, No.4, pp.718-724, 2019.
    O. Levy and Y. Goldberg, "Neural word embedding as implicit matrix factorization", Advances in Neural Information Processing Systems, Vol.3, pp.2177-2185, 2014.
    A. Krizhevsky, I. Sutskever and G. E. Hinton, "Imagenet classification with deep convolutional neural networks", Advances in Neural Information Processing Systems, Lake Tahoe, USA, pp.1097-1105, 2012.
    K. Simonyan and A. Zisserman, "Very deep convolutional networks for large-scale image recognition", arXiv preprint arXiv:1409.1556, 2014.
    K. He, X. Zhang, S. Ren, et al., "Deep residual learning for image recognition", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, USA, pp.770-778, 2016.
    G. Huang, Z. Liu, L. V. D. Maaten, et al., "Densely connected convolutional networks", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Hawaii, USA, pp.4700-4708, 2017.
    C. Szegedy, W. Liu, Y. Jia, et al., "Going deeper with convolutions", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, USA, pp.1-9, 2015.
    F. Gao, M. Wang, J. Wang, et al., "A novel separability objective function in CNN for feature extraction of SAR images", Chinese Journal of Electronics, Vol.28, No.2, pp.423-429, 2019.
    M. Wang, B. Liu and H. Foroosh, "Design of efficient convolutional layers using single intra-channel convolution, topological subdivisioning and spatial "bottleneck" structure", arXiv preprint arXiv:1608.04337, 2016.
    A. G. Howard, M. Zhu, B. Chen, et al., "Mobilenets:Efficient convolutional neural networks for mobile vision applications", arXiv preprint arXiv:1704.04861, 2017.
    S. Lin, R. Ji, C. Yan, et al., "Towards optimal structured cnn pruning via generative adversarial learning", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Long Beach, USA, pp.2790-2799, 2019.
    S. Xie, A. Kirillov, R. Girshick, et al., "Exploring randomly wired neural networks for image recognition", Proceedings of the IEEE International Conference on Computer Vision, Seoul, Korea, pp.2790-2799, 2019.
    J. Bergstra and Y. Bengio, "Algorithms for hyper-parameter optimization", Advances in Neural Information Processing Systems, Granada, Spain, pp.2546-2554, 2011.
    J. Bergstra, D. L. K. Yamins and D. D. Cox, "Making a science of model search:Hyperparameter optimization in hundreds of dimensions for vision architectures", JMLR, pp.115-123, 2013.
    X. Dai, H. Yin and N. K. Jha, "NeST:A neural network synthesis tool based on a grow-and-prune paradigm", IEEE Transactions on Computers, Vol.68, No.10, pp.1487-1497, 2019.
    M. Suganuma, S. Shirakawa and T. Nagao, "A genetic programming approach to designing convolutional neural network architectures", Proceedings of the Genetic and Evolutionary Computation Conference, Berlin, Germany, pp.497-504, 2017.
    E. Dufourq and B. A. Bassett, "Eden:Evolutionary deep networks for efficient machine learning", 2017 Pattern Recognition Association of South Africa and Robotics and Mechatronics, Bloemfontein, South Africa, pp.110-115, 2017.
    A. Chung, M. J. Shafiee, P. Fieguth, et al., "The mating rituals of deep neural networks:Learning compact feature representations through sexual evolutionary synthesis", Proceedings of the IEEE International Conference on Computer Vision Workshops, Venice, Italy, pp.1220-1227, 2017.
    P. R. Lorenzo, J. Nalepa and L. S. Ramos, "Hyperparameter selection in deep neural networks using parallel particle swarm optimization", Proceedings of the Genetic and Evolutionary Computation Conference Companion, Berlin, Germany, pp.1864-1871, 2017.
    L. Xie and A. Yuille, "Genetic CNN", Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, pp.1379-1388, 2017.
    E. Real, A. Aggarwal, Y. Huang, et al., "Regularized evolution for image classifier architecture search", Proceedings of the Association for the Advance of Artificial Intelligence, Hawaii, USA, pp.4780-4789, 2019.
    H. Liu, K. Simonyan, O. Vinyals, et al., "Hierarchical representations for efficient architecture search", arXiv preprint arXiv:1711.00436, 2017.
    M. Gong, J. Liu, H. Li, et al., "A multiobjective sparse feature learning model for deep neural networks", IEEE Transactions on Neural Networks and Learning Systems, Vol.26, No.12, pp.3263-3277, 2015.
    K. Deb, A. Pratap, S. Agarwal, et al., "A fast and elitist multiobjective genetic algorithm: NSGA-II", IEEE Transactions on Evolutionary Computation, Vol.6, No.2, pp.182-197, 2002.
    A. Krizhevsky and G. Hinton, Learning Multiple Layers of Features from Tiny Images, Citeseer, Princeton, USA, http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.222.9220&rep=rep1&type=pdf, 2009.
    H. Xiao, K. Rasul and R. Vollgraf, "Fashion-mnist:A novel image dataset for benchmarking machine learning algorithms", arXiv preprint arXiv:1708.07747, 2017.
    Y. Netzer, T. Wang, A. Coates, et al., "Reading digits in natural images with unsupervised feature learning", NIPS Workshop on Deep Learning and Unsupervised Feature Learning, Vol.2011, No.2, p.5, 2011.
    B. Zoph and Q. V. Le, "Neural architecture search with reinforcement learning", arXiv preprint arXiv:1611.01578, 2016.
    M. Lin, Q. Chen and S. Yan, "Network in network", arXiv preprint arXiv:1312.4400, 2013.
