Citation: Ling LIU, Maoxiang CHU, Rongfen GONG, et al., “Weighted Linear Loss Large Margin Distribution Machine for Pattern Classification,” Chinese Journal of Electronics, vol. 33, no. 3, pp. 1–13, 2024, doi: 10.23919/cje.2022.00.156

Weighted Linear Loss Large Margin Distribution Machine for Pattern Classification

doi: 10.23919/cje.2022.00.156
More Information
  • Author Bios:

    Ling LIU was born in 1998. She received the B.S. degree in measurement and control technology and instruments from University of Science and Technology Liaoning, China, in 2021. She is currently working towards the M.S. degree at University of Science and Technology Liaoning, China. Her current research interests include pattern recognition and machine learning. (Email: ll15566271785@163.com)

    Maoxiang CHU was born in 1978. He is with the School of Electronic and Information Engineering, University of Science and Technology Liaoning, Anshan, China. He received the Ph.D. degree in pattern recognition and intelligent systems from Northeastern University, Shenyang, China, in 2015. His current research interests include pattern recognition, machine learning, and image processing, especially pattern classification. (Email: chu522004@163.com)

    Rongfen GONG was born in 1979. She is with the School of Electronic and Information Engineering, University of Science and Technology Liaoning, Anshan, China. She received the Ph.D. degree in pattern recognition and intelligent systems from Northeastern University, Shenyang, China, in 2020. Her current research interests include pattern recognition and machine learning. (Email: fx_gong@hotmail.com)

    Liming LIU was born in 1994. She received the M.S. degree in control science and engineering from University of Science and Technology Liaoning in 2019. She is currently a Ph.D. candidate at the School of Electronic and Information Engineering, University of Science and Technology Liaoning. Her current research interests include pattern recognition and machine learning. (Email: llm06101021@hotmail.com)

    Yonghui YANG was born in 1971. He received the Ph.D. degree from University of Science and Technology Liaoning in 2018. He is currently an Associate Professor and doctoral supervisor at the School of Electronic and Information Engineering, University of Science and Technology Liaoning. His research interests include intelligent control, pattern recognition, and machine learning. (Email: yangyh2636688@163.com)

  • Corresponding author: Rongfen GONG, Email: fx_gong@hotmail.com
  • Received Date: 2022-05-31
  • Accepted Date: 2023-08-07
  • Available Online: 2023-11-20
  • Abstract: Compared with the support vector machine (SVM), the large margin distribution machine (LDM) achieves better generalization performance. The central idea of LDM is to maximize the margin mean and minimize the margin variance simultaneously, but its computational complexity is high. To reduce this complexity, a weighted linear loss LDM (WLLDM) is proposed. WLLDM is built on the LDM framework, with the hinge loss replaced by a weighted linear loss. This modification transforms the quadratic programming problem into a simple system of linear equations, which lowers the computational complexity and gives WLLDM the potential to handle large-scale datasets. Like LDM, WLLDM optimizes the margin distribution and thus achieves good generalization performance. Experiments comparing WLLDM with other models on different datasets show that the proposed WLLDM offers better generalization performance and faster training speed.
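
To make the linear-equation idea in the abstract concrete, below is a minimal sketch, in Python with NumPy, of how a WLLDM-style objective (margin-mean and margin-variance regularization plus a weighted linear loss) can be minimized by solving a single linear system instead of a quadratic program. This is an illustration under stated assumptions, not the paper's exact formulation: the function name train_wlldm_sketch, the parameters lam1, lam2, and C, the uniform default for the per-sample weights, and the linear (non-kernel) setting are all illustrative choices.

import numpy as np

def train_wlldm_sketch(X, y, lam1=1.0, lam2=1.0, C=1.0, weights=None):
    """Minimal WLLDM-style sketch: a quadratic objective solved as a linear system.

    Assumed objective in the augmented vector u = [w; b]:
        0.5*||w||^2 + lam1*Var(margins) - lam2*Mean(margins) + C*sum_i v_i*(1 - margin_i),
    where margin_i = y_i*(w @ x_i + b).  Every term is linear or quadratic in u,
    so the minimizer satisfies one linear system rather than a QP.
    """
    n, d = X.shape
    v = np.ones(n) if weights is None else np.asarray(weights, dtype=float)

    Xa = np.hstack([X, np.ones((n, 1))])          # augment each sample with a bias term
    G = y.reshape(-1, 1) * Xa                     # row i is y_i * [x_i, 1]; margins = G @ u

    m = G.mean(axis=0, keepdims=True).T           # gradient of the margin mean, shape (d+1, 1)

    I = np.eye(d + 1)
    I[-1, -1] = 0.0                               # do not regularize the bias term
    # Hessian: regularizer + 2*lam1*(covariance of the rows of G), from the variance term
    H = I + 2.0 * lam1 * (G.T @ G / n - m @ m.T)

    # Setting the gradient to zero gives H u = lam2*m + C*G^T v
    rhs = lam2 * m + C * (G.T @ v).reshape(-1, 1)
    u = np.linalg.solve(H + 1e-8 * np.eye(d + 1), rhs)   # small ridge for numerical stability
    return u[:-1, 0], u[-1, 0]                    # (w, b)

def predict(X, w, b):
    return np.sign(X @ w + b)

Under these assumptions, training reduces to forming G.T @ G and solving a (d+1)-by-(d+1) system, e.g. w, b = train_wlldm_sketch(X_train, y_train) with labels in {-1, +1}, which illustrates why replacing the hinge loss with a linear loss removes the need for a QP solver.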
  • [1]
    V. N. Vapnik, “An overview of statistical learning theory,” IEEE Transactions on Neural Networks, vol. 10, no. 5, pp. 988–999, 1999. doi: 10.1109/72.788640
    [2]
    A. Dixit, A. Mani, and R. Bansal, “CoV2-Detect-Net: Design of COVID-19 prediction model based on hybrid DE-PSO with SVM using chest X-ray images,” Information Sciences, vol. 571, pp. 676–692, 2021. doi: 10.1016/j.ins.2021.03.062
    [3]
    P. Negri, S. Cumani, and A. Bottino, “Tackling age-invariant face recognition with non-linear PLDA and pairwise SVM,” IEEE Access, vol. 9, pp. 40649–40664, 2021. doi: 10.1109/ACCESS.2021.3063819
    [4]
    M. B. Abidine and B. Fergani, “Activity recognition from smartphone data using weighted learning methods,” Intelligenza Artificiale, vol. 15, no. 1, pp. 1–15, 2021. doi: 10.3233/IA-200059
    [5]
    S. Mehrkanoon, X. L. Huang, and J. A. K. Suykens, “Non-parallel support vector classifiers with different loss functions,” Neurocomputing, vol. 143, pp. 294–301, 2014. doi: 10.1016/j.neucom.2014.05.063
    [6]
    C. F. Lin and S. D. Wang, “Fuzzy support vector machines,” IEEE Transactions on Neural Networks, vol. 13, no. 2, pp. 464–471, 2002. doi: 10.1109/72.991432
    [7]
    L. Tang, Y. J. Tian, and P. M. Pardalos, “A novel perspective on multiclass classification: Regular simplex support vector machine,” Information Sciences, vol. 480, pp. 324–338, 2019. doi: 10.1016/j.ins.2018.12.026
    [8]
    X. L. Yang, Q. Song, and Y. Wang, “A weighted support vector machine for data classification,” International Journal of Pattern Recognition and Artificial Intelligence, vol. 21, no. 5, pp. 961–976, 2007. doi: 10.1142/S0218001407005703
    [9]
    L. Reyzin and R. E. Schapire, “How boosting the margin can also boost classifier complexity,” in Proceedings of the 23rd International Conference on Machine Learning, Pittsburgh, PA, USA, pp. 753–760, 2006.
    [10]
    W. Gao and Z. H. Zhou, “On the doubt about margin explanation of boosting,” Artificial Intelligence, vol. 203, pp. 1–18, 2013. doi: 10.1016/j.artint.2013.07.002
    [11]
    Z. H. Zhou, “Large margin distribution learning,” in Proceedings of the 6th IAPR Workshop on Artificial Neural Networks in Pattern Recognition, Montreal, Canada, pp. 1–11, 2014.
    [12]
    T. Zhang and Z. H. Zhou, “Large margin distribution machine,” in Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, NY, USA, pp. 313–322, 2014.
    [13]
    L. M. Liu, M. X. Chu, R. F. Gong, et al., “Nonparallel support vector machine with large margin distribution for pattern classification,” Pattern Recognition, vol. 106, pp. 107374, 2020. doi: 10.1016/j.patcog.2020.107374
    [14]
    F. Y. Cheng, J. Zhang, Z. Y. Li, et al., “Double distribution support vector machine,” Pattern Recognition Letters, vol. 88, pp. 20–25, 2017. doi: 10.1016/j.patrec.2017.01.010
    [15]
    U. Gupta and D. Gupta, “Least squares large margin distribution machine for regression,” Applied Intelligence, vol. 51, no. 10, pp. 7058–7093, 2021. doi: 10.1007/s10489-020-02166-5
    [16]
    J. A. K. Suykens and J. Vandewalle, “Least squares support vector machine classifiers,” Neural Processing Letters, vol. 9, no. 3, pp. 293–300, 1999. doi: 10.1023/A:1018628609742
    [17]
    Jayadeva, R. Khemchandani, and S. Chandra, “Twin support vector machines for pattern classification,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 5, pp. 905–910, 2007. doi: 10.1109/TPAMI.2007.1068
    [18]
    Y. H. Shao, Z. Wang, Z. M. Yang, et al., “Weighted linear loss support vector machine for large scale problems,” Procedia Computer Science, vol. 31, pp. 639–647, 2014. doi: 10.1016/j.procs.2014.05.311
    [19]
    Y. H. Shao, W. J. Chen, Z. Wang, et al., “Weighted linear loss twin support vector machine for large-scale classification,” Knowledge-Based Systems, vol. 73, pp. 276–288, 2015. doi: 10.1016/j.knosys.2014.10.011
    [20]
    S. Abe, “Unconstrained large margin distribution machines,” Pattern Recognition Letters, vol. 98, pp. 96–102, 2017. doi: 10.1016/j.patrec.2017.09.005
    [21]
    B. Schölkopf and A. J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, Cambridge, MA, USA, 2002.
    [22]
    T. Zhang and Z. H. Zhou, “Optimal margin distribution machine,” IEEE Transactions on Knowledge and Data Engineering, vol. 32, no. 6, pp. 1143–1156, 2020. doi: 10.1109/TKDE.2019.2897662
    [23]
    D. Dua and C. Graff, “UCI machine learning repository,” Available at: http://archive. ics. uci. edu/ml.
    [24]
    D. R. Musicant, “NDC: Normally distributed clustered datasets,” Available at: http://www.cs.wisc.edu/dmi/svm/ ndc.
    [25]
    Y. Q. Bao, K. C. Song, J. Liu, et al., “Triplet-graph reasoning network for few-shot metal generic surface defect segmentation,” IEEE Transactions on Instrumentation and Measurement, vol. 70, pp. 5011111, 2021. doi: 10.1109/TIM.2021.3083561
    [26]
    Y. He, K. C. Song, Q. G. Meng, et al., “An end-to-end steel surface defect detection approach via fusing multiple hierarchical features,” IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 4, pp. 1493–1504, 2020. doi: 10.1109/TIM.2019.2915404
    [27]
    B. Fei and J. B. Liu, “Binary tree of SVM: A new fast multiclass training and classification algorithm,” IEEE Transactions on Neural Networks, vol. 17, no. 3, pp. 696–704, 2006. doi: 10.1109/TNN.2006.872343
    [28]
    J. Liu and Y. B. Xu, “T-friedman test: A new statistical test for multiple comparison with an adjustable conservativeness measure,” International Journal of Computational Intelligence Systems, vol. 15, no. 1, pp. 29, 2022. doi: 10.1007/s44196-022-00083-8
    [29]
    A. Benavoli, G. Corani, and F. Mangili, “Should we really use post-hoc tests based on mean-ranks?,” The Journal of Machine Learning Research, vol. 17, no. 1, pp. 152–161, 2016.
    [30]
    C. H. Lin, J. S. Tsai, and C. T. Chiu, “Switching bilateral filter with a texture/noise detector for universal noise removal,” IEEE Transactions on Image Processing, vol. 19, no. 9, pp. 2307–2320, 2010. doi: 10.1109/TIP.2010.2047906
    [31]
    Z. Y. He and L. N. Sun, “Surface defect detection method for glass substrate using improved Otsu segmentation,” Applied Optics, vol. 54, no. 33, pp. 9823–9830, 2015. doi: 10.1364/AO.54.009823
    [32]
    M. X. Chu, R. F. Gong, S. Gao, et al., “Steel surface defects recognition based on multi-type statistical features and enhanced twin support vector machine,” Chemometrics and Intelligent Laboratory Systems, vol. 171, pp. 140–150, 2017. doi: 10.1016/j.chemolab.2017.10.020
