Evaluating the performance of deep supervised auto encoder in single sample face recognition problem using Kullback-Leibler Divergence sparsity regularizer

Otniel Y. Viktorisa, Ito Wasito, Arida F. Syafiandini

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Recent developments in supervised auto encoder research offer promising solutions to the single sample face recognition problem. In this research, a Kullback-Leibler Divergence (KLD) approach is proposed to obtain the sparsity constraint penalty for the deep auto encoder learning process. The approach is tested on two datasets, Extended Yale B (cropped version) and LFWcrop. For comparison, Log and εL1 are also employed as sparsity regularizers. Experimental results confirm that KLD achieves better image classification performance than Log and εL1 on both datasets.
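The KLD sparsity regularizer described in the abstract is conventionally computed by comparing each hidden unit's mean activation against a target sparsity level. A minimal sketch of this standard formulation is below; the function name, the target value `rho=0.05`, and the clipping constant are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def kl_sparsity_penalty(hidden_activations, rho=0.05):
    """KL-divergence sparsity penalty for one autoencoder hidden layer.

    hidden_activations: (n_samples, n_hidden) array of sigmoid activations.
    rho: target mean activation (sparsity level); 0.05 is a common
         default in the literature, not a value from the paper.
    """
    # Mean activation of each hidden unit over the batch, clipped to
    # avoid log(0) at the boundaries.
    rho_hat = np.clip(hidden_activations.mean(axis=0), 1e-8, 1 - 1e-8)
    # KL(rho || rho_hat) for each unit, summed over all hidden units.
    kl = (rho * np.log(rho / rho_hat)
          + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))
    return kl.sum()
```

The penalty is zero when the mean activations match the target `rho` exactly and grows as units become more (or less) active than the target, which is what drives the learned representation toward sparsity.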

Original language: English
Pages (from-to): 255-258
Number of pages: 4
Journal: Journal of Theoretical and Applied Information Technology
Volume: 87
Issue number: 2
Publication status: Published - May 2016

Keywords

  • Deep auto encoder
  • Kullback-Leibler Divergence
  • Single sample face recognition
  • Sparsity
  • Sparsity regularizer
