- Volume: 4,
Issue: 1,
Citations: 0
Abstract:
The persistent challenges of polysemy and ambiguity continue to hinder the semantic accuracy of Neural Machine Translation (NMT), particularly for language pairs with distinct syntactic structures. While transformer-based models such as BERT and GPT have made notable progress in capturing contextual word meanings, they still fall short in modeling explicit semantic roles. This study addresses this limitation by integrating Semantic Role Labeling (SRL) into a Transformer-based NMT framework to enhance semantic comprehension and reduce translation errors. Using a parallel corpus of 100,000 English-Indonesian and English-Japanese sentence pairs, the proposed SRL-enhanced NMT model was trained and evaluated against a baseline Transformer NMT. SRL integration annotated each sentence with semantic roles, such as agent, patient, and instrument, which were fused with encoder representations through semantic-aware attention mechanisms. Experimental results show that the SRL-integrated model significantly outperformed the standard NMT model, improving BLEU by 6.2 points (from 32.5 to 38.7) and METEOR by 6.3 points (from 58.5 to 64.8), and reducing TER by 5.8 points (from 45.1 to 39.3). These improvements were statistically validated with a paired t-test (p < 0.05). Qualitative analyses further confirmed SRL's effectiveness in resolving lexical ambiguities and syntactic uncertainties. Although SRL integration increased inference time by 12%, this trade-off was deemed acceptable for applications requiring higher semantic fidelity. The novelty of this research lies in the architectural fusion of SRL with transformer-based attention layers in NMT, a combination seldom explored in prior studies. Moreover, the model performs robustly across linguistically divergent language pairs, suggesting broader applicability.
This work contributes to the advancement of semantically aware translation systems and paves the way for future research in unsupervised SRL integration and multilingual scalability.
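The fusion mechanism summarized in the abstract, SRL role annotations combined with encoder representations under a semantic-aware attention layer, can be illustrated with a minimal sketch. The role inventory, embedding values, additive fusion, and function names below are illustrative assumptions for exposition, not the paper's actual architecture or parameters:

```python
import math

# Toy role-embedding table (labels and values are assumed, not from the paper)
ROLE_EMBED = {
    "agent":      [0.9, 0.1, 0.0, 0.0],
    "patient":    [0.1, 0.9, 0.0, 0.0],
    "instrument": [0.0, 0.1, 0.9, 0.0],
    "none":       [0.0, 0.0, 0.0, 0.1],
}

def fuse_roles(encoder_states, roles):
    """Fuse an SRL role embedding into each token's encoder state (additive fusion assumed)."""
    return [
        [h + r for h, r in zip(state, ROLE_EMBED[role])]
        for state, role in zip(encoder_states, roles)
    ]

def semantic_attention(query, fused_states):
    """Scaled dot-product attention over role-fused encoder states."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, state)) / math.sqrt(d)
              for state in fused_states]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted sum of the fused states, fed to the decoder
    context = [sum(w * state[i] for w, state in zip(weights, fused_states))
               for i in range(d)]
    return weights, context
```

In this sketch a decoder query attends over encoder states that already carry role information, so tokens whose semantic role matches the query receive higher attention weight; a production model would learn the role embeddings and use multi-head attention rather than this single-head, hand-set version.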