JTIE - Journal of Technology Informatics and Engineering - Vol. 3 Issue. 3 (2024)

Transformers in Cybersecurity: Advancing Threat Detection and Response through Machine Learning Architectures

Joseph Teguh Santoso, Budi Hartono, Fujiama Diapoldo Silalahi, Moh Muthohir



Abstract

The increasing sophistication of cyber threats has outpaced the capabilities of traditional detection and response systems, necessitating the adoption of advanced machine learning architectures. This study investigates the application of Transformer-based models in cybersecurity, focusing on their ability to enhance threat detection and response. Leveraging publicly available datasets, including CICIDS 2017 and UNSW-NB15, the research employs a systematic methodology encompassing data preprocessing, model optimization, and comparative performance evaluation. The Transformer model, tailored for cybersecurity, integrates self-attention mechanisms and positional encoding to capture complex dependencies in network traffic data. The experimental results reveal that the proposed model achieves an accuracy of 97.8%, outperforming conventional methods such as Random Forest (92.3%) and deep learning approaches like CNN (94.1%) and LSTM (95.6%). Additionally, the Transformer demonstrates high detection rates across diverse attack types, with rates exceeding 98% for Denial of Service and Brute Force attacks. Attention heatmaps provide valuable insights into feature importance, enhancing the interpretability of the model’s decisions. Scalability tests confirm the model’s ability to handle large datasets efficiently, positioning it as a robust solution for dynamic cybersecurity environments. This research contributes to the field by demonstrating the feasibility and advantages of employing Transformer architectures for complex threat detection tasks. The findings have significant implications for developing scalable, interpretable, and adaptive cybersecurity systems. Future studies should explore lightweight Transformer variants and evaluate the model in operational environments to address practical deployment challenges.
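The abstract attributes the model's gains to two Transformer ingredients: self-attention for capturing dependencies across network traffic records, and positional encoding for preserving their order. The minimal NumPy sketch below illustrates those two mechanisms on a toy window of flow features; all dimensions and variable names are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch (NumPy) of sinusoidal positional encoding and
# single-head scaled dot-product self-attention over a sequence of
# network-flow feature vectors. Hypothetical toy example, not the
# authors' model.
import numpy as np

def positional_encoding(seq_len, d_model):
    """Standard sinusoidal positional encoding."""
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model)[None, :]              # (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    # even feature indices get sin, odd indices get cos
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def self_attention(x):
    """Scaled dot-product self-attention (projection weights omitted)."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                # (seq, seq) pairwise dependencies
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ x, weights                  # context vectors, attention map

# Toy "traffic window": 8 flow records, each summarized by 16 features.
rng = np.random.default_rng(0)
flows = rng.normal(size=(8, 16))
x = flows + positional_encoding(8, 16)           # inject record order
context, attn = self_attention(x)
print(context.shape, attn.shape)                 # (8, 16) (8, 8)
```

The attention map `attn` is the kind of quantity that, aggregated over many inputs, yields the attention heatmaps the abstract describes for interpreting feature importance.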







Publisher :

Universitas Sains dan Teknologi Komputer

DOI :


Citations :

0

PISSN :

2961-9068

EISSN :

2961-8215

Date.Create Crossref:

26-Dec-2024

Date.Issue :

26-Dec-2024

Date.Publish :

26-Dec-2024

Date.PublishOnline :

26-Dec-2024



PDF File :

Resource :

Open

License :

https://creativecommons.org/licenses/by-sa/4.0