

Journal of Information Science and Engineering, Vol. 34, No. 1, pp. 103-121


A Fast and Automatic Kernel-based Classification Scheme: GDA+SVM or KNWFE+SVM


CHENG-HSUAN LI1, PEI-JYUN HSIEN1 AND LI-HUI LIN2,3
1Graduate Institute of Educational Information and Measurement
National Taichung University of Education
Taichung, 40306 Taiwan

2College of Mathematics and Computer Science  
Wuyi University
Wuyishan, 354300 P.R. China  

3The Key Laboratory of Cognitive Computing and Intelligent Information Processing
Fujian Education Institutions
Wuyishan, 354300 P.R. China
E-mail: {chenghsuanli; pp110349}@gmail.com; 6390984@qq.com 


    For high-dimensional data classification, such as hyperspectral image classification, feature extraction is a crucial preprocessing step for avoiding the Hughes phenomenon. Feature extraction methods such as linear discriminant analysis (LDA), nonparametric weighted feature extraction (NWFE), and their kernel versions, generalized discriminant analysis (GDA) and kernel nonparametric weighted feature extraction (KNWFE), have been shown to improve classification performance. However, two challenges affect the classification performance of GDA and KNWFE. The first is solving the generalized eigenvalue problem formed by the “implicit” within- and between-class scatter matrices. The second is the appropriate selection of the kernel parameter(s). As a result, researchers rarely apply these methods to high-dimensional data classification. Recently, an automatic kernel parameter selection method (APS) was proposed to predetermine a suitable RBF kernel for the support vector machine (SVM) instead of the traditional cross-validation method. In this study, a theoretical procedure for solving the implicit generalized eigenvalue problem is proposed. Moreover, APS is applied to find a suitable RBF kernel parameter for GDA and KNWFE. Combined with the kernel-based classifier SVM, a fast and automatic kernel-based classification scheme, GDA+SVM or KNWFE+SVM, is also presented. Experimental results on real data sets show that GDA+SVM and KNWFE+SVM outperform SVM with all original features, especially in the small sample size problem. Most importantly, readers can extend the procedure to any feature extraction method based on within- and between-class scatter matrices, and researchers can apply these methods directly without tuning the kernel parameter.
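
    The following is a minimal, illustrative sketch (not the authors' implementation) of the GDA+SVM pipeline described above: an RBF Gram matrix is built on the training samples, a kernel discriminant projection is obtained by solving a generalized eigenvalue problem between kernel-space between- and within-class scatter terms, and the projected features are classified with an RBF-kernel SVM. The RBF parameter here is set by a simple median-distance heuristic as a stand-in for APS; the function names and the number of retained components are illustrative assumptions.

import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import pdist
from sklearn.svm import SVC

def rbf_kernel(X, Y, gamma):
    # Pairwise squared distances, then the Gaussian (RBF) kernel.
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def gda_fit(X, y, gamma, n_components):
    # Kernel discriminant projection (GDA-style sketch): solve M a = lambda N a,
    # where M and N are between- and within-class scatter terms expressed
    # through the Gram matrix K.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    M = np.zeros((n, n))
    N = np.zeros((n, n))
    m_all = K.mean(axis=1, keepdims=True)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                                   # columns for samples in class c
        m_c = Kc.mean(axis=1, keepdims=True)
        M += len(idx) * (m_c - m_all) @ (m_c - m_all).T  # between-class term
        H = np.eye(len(idx)) - np.ones((len(idx), len(idx))) / len(idx)
        N += Kc @ H @ Kc.T                               # within-class term
    N += 1e-6 * np.eye(n)                                # regularize for numerical stability
    _, A = eigh(M, N)                                    # eigenvalues in ascending order
    return A[:, ::-1][:, :n_components]                  # leading discriminant directions

def gda_svm(X_tr, y_tr, X_te, n_components=10):
    gamma = 1.0 / np.median(pdist(X_tr)) ** 2            # heuristic stand-in for APS
    A = gda_fit(X_tr, y_tr, gamma, n_components)
    Z_tr = rbf_kernel(X_tr, X_tr, gamma) @ A             # extracted training features
    Z_te = rbf_kernel(X_te, X_tr, gamma) @ A             # extracted test features
    clf = SVC(kernel="rbf", gamma="scale").fit(Z_tr, y_tr)
    return clf.predict(Z_te)

    In this sketch the feature extraction and the classifier both use an RBF kernel, mirroring the GDA+SVM scheme; substituting a KNWFE-style scatter construction for M and N would give the corresponding KNWFE+SVM variant.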


Keywords: kernel method, feature extraction, variable selection, GDA, KNWFE
