

Journal of Information Science and Engineering, Vol. 31 No. 2, pp. 627-640


A Novel Twin Support Vector Regression


YITIAN XU 
College of Science 
China Agricultural University 
Beijing, 100083 P.R. China 
E-mail: xytshuxue@126.com


    Twin support vector regression (TSVR), an effective regression machine, solves a pair of smaller-sized quadratic programming problems (QPPs) rather than the single large one of classical support vector regression (SVR), which makes the learning speed of TSVR approximately four times faster than that of the conventional SVR. However, TSVR implements the empirical risk minimization principle, which limits its generalization ability to a certain extent. In order to improve the prediction accuracy and stability of the algorithm, we propose a novel TSVR for the regression problem by introducing a regularization term into the objective function, which ensures that the new algorithm implements the structural risk minimization principle instead of the empirical risk minimization principle. Moreover, the up- and down-bound functions obtained by our algorithm are as parallel as possible. This ensures, in theory, that our proposed algorithm yields a lower prediction error and a lower standard deviation. The experimental results on one artificial dataset and six benchmark datasets indicate the feasibility and validity of our novel TSVR.
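    As a rough illustration of the idea sketched in the abstract (not the paper's exact formulation), the first TSVR subproblem, which fits the down-bound function f_1(x) = w_1^T x + b_1, can be augmented with a Tikhonov-style regularization term. The training matrix A, target vector Y, vector of ones e, insensitivity parameter \varepsilon_1, and trade-off parameters C_1 and \lambda follow the notation commonly used in the TSVR literature and are assumptions here:

    \begin{aligned}
    \min_{w_1,\, b_1,\, \xi}\quad & \frac{\lambda}{2}\left(\|w_1\|^2 + b_1^2\right)
      + \frac{1}{2}\,\big\|Y - e\varepsilon_1 - (A w_1 + e b_1)\big\|^2
      + C_1\, e^{\top}\xi \\
    \text{s.t.}\quad & Y - (A w_1 + e b_1) \;\ge\; e\varepsilon_1 - \xi, \qquad \xi \ge 0 .
    \end{aligned}

    In this sketch, the added term \frac{\lambda}{2}(\|w_1\|^2 + b_1^2) is what shifts the problem from empirical to structural risk minimization; the second (up-bound) subproblem would be regularized symmetrically. The actual objective proposed in the paper may differ in detail.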


Keywords: SVR, TSVR, novel TSVR, up- and down-bound functions, structural risk minimization
