An Out-of-Sample Extension to Manifold Learning via Meta-Modeling


Taşkın Kaya G., Crawford M. M.

IEEE TRANSACTIONS ON IMAGE PROCESSING, vol.28, no.10, pp.5227-5237, 2019 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 28 Issue: 10
  • Publication Date: 2019
  • DOI: 10.1109/tip.2019.2915162
  • Journal Name: IEEE TRANSACTIONS ON IMAGE PROCESSING
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.5227-5237
  • Istanbul Technical University Affiliated: Yes

Abstract

Unsupervised manifold learning has become accepted as an important tool for reducing the dimensionality of a dataset by finding a meaningful low-dimensional representation lying on an unknown nonlinear manifold. Most manifold learning methods only embed an existing dataset and do not provide an explicit mapping function for novel out-of-sample data, which limits their usefulness for classification, particularly in iterative settings such as active learning. To address this issue, out-of-sample extension methods have been introduced to generalize an existing embedding to new samples. In this paper, a novel out-of-sample extension method is introduced that uses high dimensional model representation (HDMR) as a nonlinear multivariate regression with a Tikhonov regularizer for unsupervised manifold learning algorithms. The proposed method was extensively analyzed using illustrative datasets sampled from known manifolds. Several experiments with 3D synthetic datasets and face recognition datasets were also conducted, and the performance of the proposed method was compared to several well-known out-of-sample methods. The results obtained with locally linear embedding (LLE), Laplacian Eigenmaps (LE), and t-distributed stochastic neighbor embedding (t-SNE) show that the proposed method achieves performance competitive with, and in some cases better than, the other out-of-sample methods.
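The general idea behind regression-based out-of-sample extension can be illustrated with a minimal sketch: learn an explicit map from the input space to an existing embedding, then apply it to new samples. The sketch below substitutes ordinary Tikhonov-regularized (ridge) regression for the paper's HDMR meta-model, so it shows the principle rather than the proposed method itself; the dataset, parameter values, and use of scikit-learn are assumptions for illustration only.

```python
# Sketch: out-of-sample extension of an LLE embedding via a regularized
# regression meta-model (ridge stands in for HDMR here).
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.linear_model import Ridge

# In-sample data and a held-out batch of novel samples.
X_train, _ = make_swiss_roll(n_samples=1500, random_state=0)
X_test, _ = make_swiss_roll(n_samples=200, random_state=1)

# Embed the training set; LLE itself yields only the in-sample coordinates.
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
Y_train = lle.fit_transform(X_train)

# Meta-model: Tikhonov-regularized regression from inputs to embedding coords.
oos_model = Ridge(alpha=1e-2)
oos_model.fit(X_train, Y_train)

# Out-of-sample extension: embed novel points without re-running LLE.
Y_test = oos_model.predict(X_test)
print(Y_test.shape)  # (200, 2)
```

In this setup the regression model serves as the explicit mapping function that the original embedding lacks, which is what makes iterative workflows such as active learning practical on top of a fixed embedding.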