AN ENHANCED BIDIRECTIONAL ENCODER TRANSFORMERS WITH RELATIVE POSITION FOR INDONESIAN SKILL RECOGNITION

Tentua, Meilany Nonsi and Suprapto, Suprapto and Afiahayati, Afiahayati (2024) AN ENHANCED BIDIRECTIONAL ENCODER TRANSFORMERS WITH RELATIVE POSITION FOR INDONESIAN SKILL RECOGNITION. ICIC Express Letters, 18 (4). pp. 325-332. ISSN 1881-803X


Abstract

This paper presents an approach to improving Indonesian skill recognition using an enhanced bidirectional encoder transformer with relative position (EBERT-RP). The proposed method aims to overcome the challenges of recognizing Indonesian skills that arise from the complexity of the Indonesian language and the scarcity of annotated data. The EBERT-RP model incorporates relative position embeddings, which allow the model to capture the relative positions of tokens in a sentence, and a novel attention mechanism that improves the model's ability to attend to critical information. To evaluate the performance of the EBERT-RP model, we conducted experiments on a dataset for the Indonesian skill recognition task. Our results show that the EBERT-RP model outperforms other state-of-the-art models, achieving an F1-score of 90.2% on the test set. Furthermore, we conducted an ablation study to analyze the contributions of the relative position embeddings and the attention mechanism to the model's performance. The results show that both the relative position embeddings and the attention mechanism are crucial for high performance.
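The full text is restricted, so the exact EBERT-RP formulation is not reproduced here. As a rough illustration of the general idea behind relative position information in attention (a common formulation following Shaw et al., not the authors' implementation), the sketch below adds a learned bias indexed by the clipped token-to-token distance to the usual scaled dot-product attention scores. All names (`relative_attention`, `rel_bias`, `max_dist`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relative_attention(q, k, v, rel_bias, max_dist=4):
    """Single-head attention with a relative-position bias (illustrative sketch).

    q, k, v:   (seq_len, d) query/key/value matrices
    rel_bias:  (2*max_dist + 1,) learned bias, one scalar per clipped
               signed distance between query and key positions
    """
    seq_len, d = q.shape
    scores = q @ k.T / np.sqrt(d)                 # content-based scores
    idx = np.arange(seq_len)
    # Signed distance key_pos - query_pos, clipped to [-max_dist, max_dist],
    # shifted to index into rel_bias.
    dist = np.clip(idx[None, :] - idx[:, None], -max_dist, max_dist) + max_dist
    scores = scores + rel_bias[dist]              # position-dependent bias
    return softmax(scores, axis=-1) @ v

# Usage: attention output keeps the shape of v.
rng = np.random.default_rng(0)
q = rng.normal(size=(5, 8))
k = rng.normal(size=(5, 8))
v = rng.normal(size=(5, 8))
bias = rng.normal(size=(9,))                      # 2 * max_dist + 1 entries
out = relative_attention(q, k, v, bias)
```

Because the bias depends only on the distance between positions (not their absolute indices), the same pattern generalizes across sentence positions, which is the motivation for relative over absolute position encodings.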

Item Type: Article
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Faculty of Mathematics and Natural Sciences > Computer Science & Electronics Department
Depositing User: Ismu WIDARTO
Date Deposited: 07 Jul 2025 07:23
Last Modified: 07 Jul 2025 07:23
URI: https://ir.lib.ugm.ac.id/id/eprint/19669
