WANG Tinghua, ZHAO Dongyan, LIU Fulai, “An Efficient Kernel Evaluation Criterion for Multiclass Classification,” Chinese Journal of Electronics, vol. 22, no. 2, pp. 219-224, 2013.

An Efficient Kernel Evaluation Criterion for Multiclass Classification

Funds:  This work is supported by the National High Technology Research and Development Program of China (863 Program) (No.2012AA011101), the Natural Science Foundation of Jiangxi Province of China (No.20114BAB211021) and the Science and Technology Program Foundation of Jiangxi Education Committee of China (No.GJJ12580).
  • Received Date: 2011-12-01
  • Rev Recd Date: 2012-04-01
  • Publish Date: 2013-04-25
  • The kernel is a key component of support vector machines (SVMs) and other kernel methods. Based on the distributions of the classes in the feature space, we propose a kernel selection criterion, named Kernel distance-based class separability (KDCS), to evaluate the goodness of a kernel in the multiclass classification scenario. KDCS is differentiable with respect to the kernel parameters, so gradient-based optimization techniques can be used to find the best model efficiently. In addition, it does not require setting part of the training samples aside for validation, and thus makes full use of all the available training data. The relationship between this criterion and kernel polarization is also explored. Compared with the 10-fold cross-validation technique, which is often regarded as a benchmark, this criterion is found to yield about the same performance as exhaustive parameter search.
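The abstract describes KDCS only at a high level. As a rough illustration of the underlying idea (measuring class separability in the kernel-induced feature space directly from Gram-matrix entries, using all training samples and no held-out validation set), the Python sketch below scores candidate Gaussian kernel widths with a simple between-class to within-class ratio. This is an assumed sketch, not the paper's exact KDCS definition: the RBF kernel, the ratio-of-scatters score, and all function and variable names are illustrative, and a small parameter grid is evaluated here instead of the gradient-based optimization the criterion also admits.

```python
# Illustrative sketch only (assumed, not the paper's exact KDCS formula):
# a generic kernel-induced class-separability score computed purely from
# Gram-matrix entries, used to rank candidate RBF kernel widths on the
# full training set, with no held-out validation data.
import numpy as np


def rbf_kernel(X, gamma):
    """Gram matrix of the Gaussian kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))


def class_separability(K, y):
    """Sum of squared distances between class means in the kernel-induced
    feature space, divided by the total within-class scatter; every term
    is obtained from Gram-matrix entries only."""
    classes = np.unique(y)
    idx_by_class = {c: np.where(y == c)[0] for c in classes}

    # Within-class scatter: for each class, E[k(x, x)] - E[k(x, x')] equals
    # the average squared distance of its samples to the class mean.
    within = 0.0
    for c in classes:
        idx = idx_by_class[c]
        within += K[idx, idx].mean() - K[np.ix_(idx, idx)].mean()

    # Between-class term: ||mu_i - mu_j||^2 expanded with kernel evaluations.
    between = 0.0
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            ii, jj = idx_by_class[ci], idx_by_class[cj]
            between += (K[np.ix_(ii, ii)].mean()
                        + K[np.ix_(jj, jj)].mean()
                        - 2.0 * K[np.ix_(ii, jj)].mean())

    return between / (within + 1e-12)


if __name__ == "__main__":
    # Three well-separated 2-D Gaussian classes as toy multiclass data.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.5, size=(30, 2)) for m in (0.0, 2.0, 4.0)])
    y = np.repeat([0, 1, 2], 30)

    # Select the RBF width that maximizes the separability score.
    for gamma in (0.01, 0.1, 1.0, 10.0):
        score = class_separability(rbf_kernel(X, gamma), y)
        print(f"gamma={gamma:<5}: separability={score:.4f}")
```

Because a score of this form is differentiable in the kernel parameters, it could in principle also be maximized by gradient ascent rather than by scanning a grid, which is the efficiency argument the abstract makes for KDCS.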