Hierarchical attention network with XGBoost for recognizing insufficiently supported argument

Derwin Suhartono, Aryo Pradipta Gema, Suhendro Winton, Theodorus David, Mohamad Ivan Fanany, Aniati Murni Arymurthy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

In this paper, we present an empirical analysis of the Hierarchical Attention Network (HAN) as a feature extractor working conjointly with eXtreme Gradient Boosting (XGBoost) as the classifier to recognize insufficiently supported arguments in a publicly available dataset. Besides HAN + XGBoost, we performed experiments with several other deep learning models, such as Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), bidirectional LSTM, and bidirectional GRU. All results with the best hyper-parameters are presented. We report three key findings: (1) Shallow models work significantly better than deep models when only a small dataset is available. (2) The attention mechanism improves the deep models' results: on average, it raises the Area Under the Receiver Operating Characteristic Curve (ROC-AUC) score of the Recurrent Neural Network (RNN) by a margin of 18.94%, and the hierarchical attention network scores 2.25% higher in ROC-AUC than the non-hierarchical one. (3) Replacing the last fully connected layer with XGBoost improves the macro F1 score by 5.26%. Overall, our best setting achieves a 1.88% improvement over the state-of-the-art result.
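The attention pooling at the heart of HAN can be illustrated with a minimal NumPy sketch of the standard word-level attention layer (the same form is reused at the sentence level): each word annotation is projected through a one-layer MLP, scored against a learned context vector, and the annotations are combined by a softmax-weighted sum. All dimensions, weights, and names below are illustrative placeholders, not the paper's actual parameters.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, b, u_w):
    """Attention pooling over word annotations H (T x d):
    u_t = tanh(W h_t + b); alpha = softmax(u_t . u_w); s = sum_t alpha_t h_t."""
    U = np.tanh(H @ W.T + b)   # hidden representation, shape (T, a)
    alpha = softmax(U @ u_w)   # attention weights, shape (T,)
    return alpha @ H, alpha    # pooled sentence vector (d,) and the weights

# Toy example: 6 words, annotation dim 8, attention dim 4 (random placeholders).
rng = np.random.default_rng(0)
T, d, a = 6, 8, 4
H = rng.standard_normal((T, d))
W = rng.standard_normal((a, d))
b = rng.standard_normal(a)
u_w = rng.standard_normal(a)
s, alpha = attention_pool(H, W, b, u_w)
```

In the paper's best setting, the pooled document representation produced by layers like this is then fed to XGBoost instead of a final fully connected softmax layer.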

Original language: English
Title of host publication: Multi-disciplinary Trends in Artificial Intelligence - 11th International Workshop, MIWAI 2017, Proceedings
Editors: Somnuk Phon-Amnuaisuk, Swee-Peng Ang, Soo-Young Lee
Publisher: Springer Verlag
Pages: 174-188
Number of pages: 15
ISBN (Print): 9783319694559
DOIs
Publication status: Published - 1 Jan 2017
Event: 11th Multi-disciplinary International Workshop on Artificial Intelligence, MIWAI 2017 - Gadong, Brunei Darussalam
Duration: 20 Nov 2017 - 22 Nov 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10607 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 11th Multi-disciplinary International Workshop on Artificial Intelligence, MIWAI 2017
Country/Territory: Brunei Darussalam
City: Gadong
Period: 20/11/17 - 22/11/17

Keywords

  • Deep learning
  • Hierarchical Attention Network
  • Insufficiently supported argument
  • Shallow learning
  • XGBoost
