TY - JOUR
T1 - Simulated Annealing Algorithm for Deep Learning
AU - Rere, L. M. Rasdi
AU - Fanany, Mohamad Ivan
AU - Arymurthy, Aniati Murni
N1 - Funding Information:
This work is supported by Higher Education Center of Excellence Research Grant funded by Indonesia Ministry of Research, Technology and Higher Education. Contract No. 0475/UN2.R12/HKP.05.00/2015.
Publisher Copyright:
© 2015 The Authors.
PY - 2015
Y1 - 2015
N2 - Deep learning (DL) is a new area of research in machine learning whose objective is to move us closer to the goal of artificial intelligence. This method can learn many levels of abstraction and representation to make sense of data such as text, sound, and images. Although DL is useful for a variety of tasks, it is hard to train. Several methods for training deep learning models optimally have been proposed, including Stochastic Gradient Descent, Conjugate Gradient, Hessian-free optimization, and Krylov Subspace Descent. In this paper, we propose Simulated Annealing (SA) to improve the performance of the Convolutional Neural Network (CNN), as an alternative approach to optimal DL using a modern optimization technique, i.e. a metaheuristic algorithm. The MNIST dataset is used to assess the accuracy and efficiency of the proposed method. Moreover, we also compare the proposed method with the original CNN. Although there is an increase in computation time, the experimental results show that the proposed method can improve the performance of the original CNN.
AB - Deep learning (DL) is a new area of research in machine learning whose objective is to move us closer to the goal of artificial intelligence. This method can learn many levels of abstraction and representation to make sense of data such as text, sound, and images. Although DL is useful for a variety of tasks, it is hard to train. Several methods for training deep learning models optimally have been proposed, including Stochastic Gradient Descent, Conjugate Gradient, Hessian-free optimization, and Krylov Subspace Descent. In this paper, we propose Simulated Annealing (SA) to improve the performance of the Convolutional Neural Network (CNN), as an alternative approach to optimal DL using a modern optimization technique, i.e. a metaheuristic algorithm. The MNIST dataset is used to assess the accuracy and efficiency of the proposed method. Moreover, we also compare the proposed method with the original CNN. Although there is an increase in computation time, the experimental results show that the proposed method can improve the performance of the original CNN.
KW - Deep Learning
KW - Optimization
KW - Simulated Annealing algorithm
UR - http://www.scopus.com/inward/record.url?scp=84964022165&partnerID=8YFLogxK
U2 - 10.1016/j.procs.2015.12.114
DO - 10.1016/j.procs.2015.12.114
M3 - Conference article
AN - SCOPUS:84964022165
SN - 1877-0509
VL - 72
SP - 137
EP - 144
JO - Procedia Computer Science
JF - Procedia Computer Science
T2 - 3rd Information Systems International Conference, 2015
Y2 - 16 April 2015 through 18 April 2015
ER -