Boosting RBFNN performance in regression tasks with quantum kernel methods
Quantum and classical machine learning are fundamentally connected through kernel methods: kernels, defined as inner products of feature vectors in high-dimensional spaces, form the foundation of both. Among commonly used kernels, the Gaussian kernel plays a prominent role in radial basis function neural networks (RBFNNs) for regression tasks. However, the localized response of the Gaussian kernel, which emphasizes relationships between nearby data points, limits its capacity to model interactions among more distant points, so it may overlook broader structural dependencies within the dataset. In contrast, quantum kernels are commonly evaluated by explicitly preparing quantum states and computing their inner products, thereby exploiting additional quantum dimensions and capturing more intricate and complex data patterns. Motivated by this limitation, we develop a hybrid quantum–classical model, termed the quantum kernel-based feedforward neural network (QKFNN), which leverages quantum kernel methods (QKMs) to improve the prediction accuracy of the RBFNN. In this study, we begin with a comprehensive introduction to QKMs, after which we present the architecture of the QKFNN. To further refine model performance, an optimization strategy based on a general unitary transformation involving a rotation factor is employed to obtain an optimized quantum kernel. The effectiveness of the QKFNN is validated through experiments on synthetic and real-world datasets.
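To make the contrast concrete, the following is a minimal classical sketch of the two kernel types discussed above: the localized Gaussian kernel used in RBFNNs, and a fidelity-style quantum kernel computed as the squared inner product of encoded quantum states. The angle-encoding feature map used here is a simple illustrative assumption, not necessarily the encoding employed in this work.

```python
import numpy as np

def gaussian_kernel(x, y, gamma=1.0):
    # Classical RBF (Gaussian) kernel: a localized response that
    # decays rapidly as the distance between x and y grows.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def quantum_feature_state(x):
    # Hypothetical angle-encoding feature map: each feature x_i is
    # mapped to a single-qubit state cos(x_i)|0> + sin(x_i)|1>, and
    # the full state is the tensor product over all features.
    state = np.array([1.0])
    for xi in x:
        qubit = np.array([np.cos(xi), np.sin(xi)])
        state = np.kron(state, qubit)
    return state

def quantum_kernel(x, y):
    # Fidelity-style quantum kernel: |<phi(x)|phi(y)>|^2, the squared
    # inner product of the two encoded (normalized) quantum states.
    return np.abs(quantum_feature_state(x) @ quantum_feature_state(y)) ** 2

x = np.array([0.1, 0.5])
y = np.array([0.2, 0.4])
print("Gaussian kernel:", gaussian_kernel(x, y))
print("Quantum kernel:", quantum_kernel(x, y))
```

Both kernels evaluate to 1 when x equals y; they differ in how similarity decays and in the geometry of the induced feature space, which is the property the QKFNN aims to exploit.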