Graph Embedding via High Dimensional Model Representation for Hyperspectral Images


Taşkın Kaya G., Camps-Valls G.

IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, vol. 60, 2022 (Journal Indexed in SCI)

  • Publication Type: Article
  • Volume: 60
  • Publication Date: 2022
  • DOI: 10.1109/TGRS.2021.3133957
  • Keywords: Hyperspectral imaging, Manifold learning, Germanium, Manifolds, Kernel, Laplace equations, Eigenvalues and eigenfunctions, Dimensionality reduction (DR), feature extraction, hyperspectral image (HSI) classification, out-of-sample (OOS) problem, spectral embedding, DISCRIMINANT-ANALYSIS, CLASSIFICATION, REDUCTION, EXTENSIONS, EIGENMAPS, FRAMEWORK


Learning the manifold structure of remote sensing images is of paramount relevance for modeling and understanding processes, as well as for encapsulating the high dimensionality in a reduced set of informative features for subsequent classification, regression, or unmixing. Manifold learning methods have shown excellent performance in hyperspectral image (HSI) analysis, but, unless specifically designed, they cannot provide an explicit embedding map readily applicable to out-of-sample (OOS) data. A common workaround is to assume that the transformation between the high-dimensional input space and the (typically low-dimensional) latent space is linear. This is a particularly strong assumption, especially for HSIs, given the well-known nonlinear nature of the data. To address this problem, a manifold learning method based on high-dimensional model representation (HDMR) is proposed, which yields a nonlinear embedding function able to project OOS samples into the latent space. The proposed method is compared to standard manifold learning methods and their linear counterparts, and achieves promising classification accuracy on a representative set of HSIs.
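The abstract's core idea, learning an explicit nonlinear map so that OOS samples can be embedded without re-running the manifold learner, can be illustrated with a minimal sketch. This is not the authors' implementation: the spectral embedding, the first-order HDMR-style surrogate (a constant term plus a sum of univariate polynomial functions, one per spectral band), and the least-squares fit below are illustrative choices.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 5))  # stand-in for training pixels (5 bands)
X_test = rng.normal(size=(50, 5))    # stand-in for out-of-sample pixels

# Step 1: nonlinear spectral embedding of the training data.
# This gives latent coordinates only for the training samples; there is
# no explicit map to apply to new pixels.
emb = SpectralEmbedding(n_components=2, random_state=0)
Y_train = emb.fit_transform(X_train)

# Step 2: fit a first-order HDMR-style surrogate
#   g(x) = c0 + sum_i g_i(x_i),
# where each univariate component g_i is expanded in a small monomial
# basis of a single input feature (an illustrative basis choice).
DEGREE = 3

def design(X):
    """Design matrix: constant column, then x_i^1..x_i^DEGREE per feature."""
    cols = [np.ones((X.shape[0], 1))]
    for i in range(X.shape[1]):
        for d in range(1, DEGREE + 1):
            cols.append(X[:, i:i + 1] ** d)
    return np.hstack(cols)

coef, *_ = np.linalg.lstsq(design(X_train), Y_train, rcond=None)

# Step 3: explicit out-of-sample projection via the fitted surrogate.
Y_test = design(X_test) @ coef
```

The payoff is in step 3: once the surrogate is fitted, embedding new pixels is a single matrix product, whereas the spectral embedding itself offers no such map.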