Abstract
Pantun is a traditional Malay poem consisting of four lines: two introductory lines followed by two lines that carry the message. The line-ending words of a pantun follow an ABAB rhyme scheme. In this work, we compare the performance of Sequence Generative Adversarial Nets (SeqGAN) and Generative Pre-trained Transformer 2 (GPT-2) in automatically generating Indonesian pantun. We also created the first publicly available Indonesian pantun dataset, consisting of 7.8K pantun. We evaluated the generated pantun in terms of lexical richness and well-formedness, and we introduce an evaluation of well-formedness with two aspects: structure and rhyme. GPT-2 outperforms SeqGAN by a margin of 29.40% in forming the correct structure and by 35.20% in producing rhyming patterns, and gives its generated pantun a richer vocabulary by a 0.04 margin.
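To make the two well-formedness aspects concrete, here is a minimal Python sketch of how such checks might look. It assumes, hypothetically, that "structure" means exactly four non-empty lines and that two line endings rhyme when their final two characters match, with type-token ratio standing in for lexical richness; the paper's actual criteria and metrics are not specified in this abstract.

```python
# Hedged sketch of pantun evaluation: structure, ABAB rhyme, lexical richness.
# The rhyme rule (last two characters) and the type-token ratio are assumptions
# made for illustration, not the paper's exact definitions.


def is_four_lines(pantun: str) -> bool:
    """Structure check: a pantun should consist of exactly four non-empty lines."""
    lines = [line.strip() for line in pantun.strip().splitlines() if line.strip()]
    return len(lines) == 4


def has_abab_rhyme(pantun: str) -> bool:
    """Rhyme check: endings of lines 1/3 and 2/4 should share a final sound,
    approximated here by the last two characters of the final word."""
    lines = [line.strip() for line in pantun.strip().splitlines() if line.strip()]
    if len(lines) != 4:
        return False
    endings = [line.split()[-1].lower() for line in lines]
    tail = lambda word: word[-2:]  # crude stand-in for the final syllable
    return tail(endings[0]) == tail(endings[2]) and tail(endings[1]) == tail(endings[3])


def type_token_ratio(pantun: str) -> float:
    """One common lexical-richness proxy: unique words divided by total words."""
    tokens = pantun.lower().split()
    return len(set(tokens)) / len(tokens) if tokens else 0.0


if __name__ == "__main__":
    # A well-known traditional pantun used as a quick sanity check.
    sample = (
        "Berakit-rakit ke hulu\n"
        "Berenang-renang ke tepian\n"
        "Bersakit-sakit dahulu\n"
        "Bersenang-senang kemudian\n"
    )
    print(is_four_lines(sample), has_abab_rhyme(sample), round(type_token_ratio(sample), 2))
```

Running the sketch on the sample prints `True True 0.9`: the endings "hulu"/"dahulu" and "tepian"/"kemudian" satisfy the assumed ABAB rule, and 9 of the 10 tokens are unique.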
| Original language | English |
| --- | --- |
| Pages (from-to) | 59-67 |
| Journal | Jurnal Ilmu Komputer dan Informasi |
| Volume | 16 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2023 |
Keywords
- poetry generation
- pantun
- text generation
- SeqGAN
- GPT-2