Communications on Pure & Applied Analysis, volume 19, issue 8, pages 4069-4083
Quantitative convergence analysis of kernel based large-margin unified machines
Jun Fan, Dao-Hong Xiang
Publication type: Journal Article
Publication date: 2020-06-01
scimago Q2
SJR: 0.744
CiteScore: 1.9
Impact factor: 1
ISSN: 15340392, 15535258
General Medicine
Applied Mathematics
Analysis
Abstract
High-dimensional binary classification has been intensively studied in the machine learning community over the last few decades. The support vector machine (SVM), one of the most popular classifiers, depends only on a portion of the training samples called support vectors, a dependence that leads to suboptimal performance in the high-dimension, low-sample-size (HDLSS) setting. Large-margin unified machines (LUMs) are a family of margin-based classifiers proposed to solve the so-called 'data piling' problem, which is inherent in SVM under HDLSS settings. In this paper we study the binary classification algorithms associated with LUM loss functions in the framework of reproducing kernel Hilbert spaces. Quantitative convergence analysis is carried out for these algorithms by means of a novel application of projection operators, which overcomes the main technical difficulty. The rates are derived explicitly under a priori conditions on the approximation ability and capacity of the reproducing kernel Hilbert space.
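As background to the abstract, the LUM loss family (introduced by Liu, Zhang and Wu, 2011) can be sketched as follows; the parameter names `a` and `c` and the default values are assumptions for illustration, not taken from this paper. For a margin value u = y·f(x), the loss is linear (hinge-like) below the knot c/(1+c) and decays smoothly above it; letting c grow recovers the SVM hinge loss.

```python
import numpy as np

def lum_loss(u, a=1.0, c=1.0):
    """LUM loss V(u) for margin u = y * f(x).

    Family of Liu, Zhang and Wu (2011) with hyperparameters
    a > 0 and c >= 0 (illustrative defaults).  The two branches
    join continuously and differentiably at u = c / (1 + c).
    """
    u = np.asarray(u, dtype=float)
    thresh = c / (1.0 + c)
    # Hinge-like linear part, active for u < c / (1 + c).
    linear = 1.0 - u
    # Smooth polynomially decaying tail, active for u >= c / (1 + c).
    smooth = (1.0 / (1.0 + c)) * (a / ((1.0 + c) * u - c + a)) ** a
    return np.where(u < thresh, linear, smooth)
```

For example, with a = c = 1 the knot sits at u = 0.5, where both branches equal 0.5, so misclassified points (u < 0) are penalized linearly while well-classified points still incur a small, smoothly vanishing loss rather than the hinge's exact zero.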