Object based convolutional neural network for cloud classification in very high-resolution hyperspectral imagery

R. Rizkiyanto, B. Rabbani, D. Y. Perwira, A. M. Arymurthy

Research output: Contribution to journal › Conference article › peer-review

Abstract

Remote sensing plays a critical role in spatial-data-based information systems and in monitoring the earth's surface. The presence of clouds in optical remote sensing satellite images is a frequent problem for many remote sensing applications, so the proper detection and classification of clouds remains a challenging task. This study aims to classify cloud objects in remote sensing satellite imagery, using Pleiades very high-resolution satellite images as the data source. The dataset comprises 1299 samples, and cloud objects are categorized into three classes: thick cloud, thin cloud, and clear. A deep learning algorithm, the Convolutional Neural Network (CNN), is used to classify the cloud objects; the CNN model is LeNet, with its architecture and parameters modified to suit the needs of the study. Classification with the LeNet model shows increasing accuracy at each epoch during training and takes 1150.355 seconds for 200 epochs, with a best accuracy of 97.50%. LeNet outperforms the VGG16 model used for comparison, which achieves a best accuracy of 96.50% with 600 input samples.
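As an illustration of the approach described in the abstract, the sketch below builds a LeNet-style CNN for three-class cloud patch classification (thick cloud, thin cloud, clear). The abstract does not specify the framework, input patch size, or layer widths; Keras/TensorFlow, 32×32 RGB patches, and classic LeNet-5 layer sizes are assumptions here, not the authors' exact modified architecture.

```python
# Minimal LeNet-style CNN for three-class cloud patch classification.
# Assumptions not stated in the abstract: Keras/TensorFlow backend,
# 32x32 RGB input patches, and the classic LeNet-5 layer widths.
import tensorflow as tf
from tensorflow.keras import layers, models


def build_lenet(input_shape=(32, 32, 3), num_classes=3):
    model = models.Sequential([
        # Two convolution + pooling stages, as in LeNet-5
        layers.Conv2D(6, kernel_size=5, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(pool_size=2),
        layers.Conv2D(16, kernel_size=5, activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        # Fully connected classifier head
        layers.Flatten(),
        layers.Dense(120, activation="relu"),
        layers.Dense(84, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),  # thick cloud, thin cloud, clear
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


model = build_lenet()
model.summary()
# Training for a fixed number of epochs, e.g.:
# model.fit(x_train, y_train, epochs=200, validation_data=(x_val, y_val))
```

Any real reproduction would need the authors' specific architectural modifications and the labeled Pleiades patches, neither of which is given in this abstract.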

Original language: English
Article number: 012059
Journal: IOP Conference Series: Earth and Environmental Science
Volume: 500
Issue number: 1
DOIs
Publication status: Published - 3 Jul 2020
Event: 5th International Conference of the Indonesian Society for Remote Sensing, ICOIRS 2019, and Indonesian Society for Remote Sensing Congress - Bandung, West Java, Indonesia
Duration: 17 Sept 2019 to 20 Sept 2019
