Ismi, Dewi Pramudi and Pulungan, Reza and Afiahayati, Afiahayati (2024) Self-attention and asymmetric multi-layer perceptron-gated recurrent unit blocks for protein secondary structure prediction. Applied Soft Computing, 159: 111604. ISSN 1568-4946
124 self-attention.pdf - Published Version (3MB; restricted to registered users only)
Abstract
Protein secondary structure prediction (PSSP) is one of the most prominent and widely studied tasks in bioinformatics. Deep neural networks have become the primary method for building PSSP models in the last decade due to their potential to enhance PSSP performance. However, there is still room for improvement, as previous studies have yet to reach the theoretical limit of PSSP model performance. In this work, we propose a PSSP model called SADGRU-SS, built on a novel deep learning architecture that combines self-attention, asymmetric multi-layer perceptron (MLP)-gated recurrent unit (GRU) blocks, and a dense block to solve the PSSP problem. Our experimental results show that using self-attention in the SADGRU-SS architecture improves its performance, and that placing self-attention at the frontmost position of the network yields better performance than placing it elsewhere. Using the asymmetric configuration in the MLP-GRU blocks also results in better performance than the symmetric one. The model is trained on the standard CB6133-filtered dataset and evaluated on the standard CB513 test dataset. Our experiments show that the model outperforms other PSSP models on 8-state PSSP, achieving 70.74% and 82.78% prediction accuracy in 8-state and 3-state PSSP, respectively.
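The architecture outlined in the abstract can be pictured as a front self-attention layer feeding stacked asymmetric MLP-GRU blocks and a final dense block. Below is a minimal, hypothetical PyTorch sketch of that layout; the layer widths, block count, attention head count, and the 57-feature-per-residue input are illustrative assumptions, not the authors' published configuration.

```python
# Hypothetical sketch of a SADGRU-SS-style architecture: self-attention at the
# front, asymmetric MLP-GRU blocks, and a dense block for 8-state prediction.
# All sizes below are assumptions for illustration only.
import torch
import torch.nn as nn


class AsymmetricMLPGRUBlock(nn.Module):
    """An MLP branch and a bidirectional GRU branch run in parallel.

    'Asymmetric' is interpreted here as the two branches having different
    hidden widths (an assumption based on the abstract's wording).
    """

    def __init__(self, in_dim: int, mlp_dim: int, gru_dim: int):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, mlp_dim), nn.ReLU())
        self.gru = nn.GRU(in_dim, gru_dim, batch_first=True, bidirectional=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, sequence_length, in_dim)
        mlp_out = self.mlp(x)        # (batch, L, mlp_dim)
        gru_out, _ = self.gru(x)     # (batch, L, 2 * gru_dim)
        return torch.cat([mlp_out, gru_out], dim=-1)


class SADGRUSSSketch(nn.Module):
    """Self-attention at the frontmost position, followed by asymmetric
    MLP-GRU blocks and a dense (fully connected) output block."""

    def __init__(self, feature_dim: int = 57, num_classes: int = 8):
        super().__init__()
        # Frontmost self-attention over the residue sequence.
        self.attention = nn.MultiheadAttention(
            embed_dim=feature_dim, num_heads=1, batch_first=True
        )
        # Two stacked asymmetric MLP-GRU blocks (widths are assumptions).
        self.block1 = AsymmetricMLPGRUBlock(feature_dim, mlp_dim=128, gru_dim=256)
        self.block2 = AsymmetricMLPGRUBlock(128 + 2 * 256, mlp_dim=64, gru_dim=128)
        # Dense block mapping per-residue features to 8 secondary-structure states.
        self.dense = nn.Sequential(
            nn.Linear(64 + 2 * 128, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, sequence_length, feature_dim) per-residue input features.
        attended, _ = self.attention(x, x, x)
        h = self.block1(attended)
        h = self.block2(h)
        return self.dense(h)  # per-residue logits over the 8 states


if __name__ == "__main__":
    model = SADGRUSSSketch()
    dummy = torch.randn(2, 700, 57)  # 2 sequences of length 700 (assumed shapes)
    print(model(dummy).shape)        # torch.Size([2, 700, 8])
```

For 3-state evaluation, the 8-state outputs would typically be mapped down to helix, strand, and coil classes; the sketch above only covers the 8-state head.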
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Asymmetric MLP-GRU block; Deep learning; Prediction accuracy; Protein secondary structure prediction; Self-attention |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Divisions: | Faculty of Mathematics and Natural Sciences > Computer Science & Electronics Department |
| Depositing User: | Wiyarsih Wiyarsih |
| Date Deposited: | 06 May 2025 07:59 |
| Last Modified: | 06 May 2025 07:59 |
| URI: | https://ir.lib.ugm.ac.id/id/eprint/17091 |