
On a nonlinear extension of the principal fitted component model

Abstract

We propose a nonlinear sufficient dimension reduction method, the kernel principal fitted component model, built on the kernel method within a reproducing kernel Hilbert space. The kernel principal fitted component model is a nonlinear extension of the principal fitted component model; it rests on the idea of mapping the low-dimensional input space into a higher-dimensional feature space so that well-developed linear methods can be applied to nonlinear datasets. We show that our method coincides with generalized sliced inverse regression under some mild assumptions and that the dimension reduction subspace extracted by the kernel principal fitted component model is contained in the central class. In numerical experiments, we show that the kernel principal fitted component model with the Gaussian kernel extracts both linear and nonlinear features well for models from forward and inverse regression settings. By applying our method to an ovarian cancer microarray dataset, we demonstrate that the kernel principal fitted component model provides competitive prediction accuracy and computational efficiency in a high-dimensional classification problem. © 2023 Elsevier B.V.
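
To give a rough sense of the idea summarized above, the Python sketch below applies a PFC-style inverse regression fit in the feature space induced by a Gaussian kernel and extracts a few nonlinear sufficient predictors. It is a minimal sketch under assumed choices (a polynomial basis for f(y), ridge regularization, and the hypothetical names gaussian_kernel and kernel_pfc_sketch), not the estimator derived in the paper.

    import numpy as np

    def gaussian_kernel(X, gamma=1.0):
        # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
        return np.exp(-gamma * d2)

    def kernel_pfc_sketch(X, y, d=2, gamma=1.0, n_basis=3, ridge=1e-6):
        """Toy kernel PFC-style reduction: regress centered kernel features on a
        basis of y, then take leading directions of the fitted part (assumed setup)."""
        n = X.shape[0]
        K = gaussian_kernel(X, gamma)
        H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
        Kc = H @ K @ H                                 # centered kernel matrix

        # Basis of fitted functions f(y); here a centered polynomial basis (an assumption)
        F = np.column_stack([y**p for p in range(1, n_basis + 1)])
        F = F - F.mean(axis=0)

        # Hat matrix of the inverse regression fit onto the column space of F
        P_F = F @ np.linalg.solve(F.T @ F + ridge * np.eye(F.shape[1]), F.T)

        # Part of the kernel variation explained by f(y)
        M = Kc @ P_F @ Kc / n

        # Ridge-regularized generalized eigenproblem: fitted vs. total kernel variation
        A = np.linalg.solve(Kc @ Kc / n + ridge * np.eye(n), M)
        eigvals, eigvecs = np.linalg.eig(A)
        order = np.argsort(-eigvals.real)[:d]
        V = eigvecs[:, order].real                     # RKHS expansion coefficients

        return Kc @ V                                  # n x d nonlinear sufficient predictors

For a regression response, the returned columns can be used as low-dimensional inputs to any downstream predictor; for classification, an indicator basis of the class labels would be a more natural choice of f(y) than the polynomial basis assumed here.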
