Simulated Annealing Algorithm for Deep Learning

Research output: Contribution to journal › Conference article › peer-review

50 Citations (Scopus)

Abstract

Deep learning (DL) is a new area of research in machine learning whose objective is to move us closer to the goal of artificial intelligence. It learns multiple levels of abstraction and representation to make sense of data such as text, sound, and images. Although DL is useful for a variety of tasks, it is hard to train. Several methods for training deep networks toward an optimum have been proposed, including Stochastic Gradient Descent, Conjugate Gradient, Hessian-free optimization, and Krylov Subspace Descent. In this paper, we propose Simulated Annealing (SA), a metaheuristic algorithm from modern optimization, as an alternative approach to improve the performance of the Convolutional Neural Network (CNN). The MNIST dataset is used to assess the accuracy and efficiency of the proposed method, and we also compare the proposed method with the original CNN. Although computation time increases, the experimental results show that the proposed method improves the performance of the original CNN.
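The core of SA as referenced in the abstract is its acceptance rule: a candidate move that worsens the cost can still be accepted with probability exp(-Δ/T), where the temperature T decreases over time, allowing the search to escape local minima. The sketch below shows this general SA loop on a toy one-dimensional problem; the function names, cooling schedule, and parameter values are illustrative assumptions, not details taken from the paper (which applies SA to CNN parameters).

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.95,
                        steps=200, seed=0):
    """Generic simulated annealing loop (illustrative sketch).

    A worse candidate is accepted with probability exp(-delta / T),
    so the search can escape local minima early on; as T cools,
    the algorithm behaves more and more greedily.
    """
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        x_new = neighbor(x, rng)
        f_new = cost(x_new)
        delta = f_new - fx
        # Always accept improvements; accept worse moves probabilistically.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, fx = x_new, f_new
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric (exponential) cooling schedule
    return best, fbest

# Toy usage: minimize f(x) = (x - 3)^2 starting from x = 0.
best, fbest = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    neighbor=lambda x, rng: x + rng.uniform(-0.5, 0.5),
    x0=0.0,
)
```

In the paper's setting, the "solution" would be a vector of CNN weights and the cost a training error measure; the repeated cost evaluations explain the increase in computation time noted in the abstract.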

Original language: English
Pages (from-to): 137-144
Number of pages: 8
Journal: Procedia Computer Science
Volume: 72
DOIs
Publication status: Published - 1 Jan 2015
Event: 3rd Information Systems International Conference, 2015 - Shenzhen, China
Duration: 16 Apr 2015 - 18 Apr 2015

Keywords

  • Deep Learning
  • Optimization
  • Simulated Annealing algorithm
