International Journal of Technology and Applied Science
E-ISSN: 2230-9004
Impact Factor: 9.914
Structured Analysis of the Transformer Architecture Within the Context of Machine Translation (MT)
| Author(s) | Dr. S. Thilagavathi, Dr. P. Balamuthukumar |
|---|---|
| Country | India |
| Abstract | The introduction of the Transformer architecture in 2017 revolutionized the field of Machine Translation by shifting the paradigm from recurrent, sequential processing to a parallelizable, attention-based framework. By utilizing the self-attention mechanism, the Transformer effectively captures long-range dependencies in text, overcoming the limitations of previous architectures like Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) units. This report analyzes the architecture, its comparative advantages, technical challenges, and its role as the foundation for modern Natural Language Processing (NLP). |
| Keywords | Machine Translation (MT), Neural Machine Translation, Parallelization, Long-Range Dependency Modeling, Global Context |
| Field | Computer > Artificial Intelligence / Simulation / Virtual Reality |
| Published In | Volume 17, Issue 5, May 2026 |
| Published On | 2026-05-10 |
| Cite This | Structured Analysis of the Transformer Architecture Within the Context of Machine Translation (MT) - Dr. S. Thilagavathi, Dr. P. Balamuthukumar - IJTAS Volume 17, Issue 5, May 2026. |
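The abstract credits the Transformer's ability to capture long-range dependencies to the self-attention mechanism. A minimal sketch of scaled dot-product attention, the core of that mechanism, is shown below in NumPy; the shapes and variable names are illustrative and not drawn from the paper itself.

```python
# Minimal sketch of scaled dot-product attention:
#   Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
# Every token attends to every other token in one matrix product,
# which is what allows parallel processing and global context.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise token similarities
    # Row-wise softmax turns similarities into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights          # weighted mix of values

# Toy example: 3 tokens, model dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)        # (3, 4): one context vector per token
print(w.sum(axis=-1))   # each row of attention weights sums to 1
```

Unlike an RNN or LSTM, no recurrence is involved: the distance between any two tokens is a single attention step, which is why long-range dependencies are modeled directly.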
A CrossRef DOI is assigned to each research paper published in this journal. The IJTAS DOI prefix is 10.71097/IJTAS.
All research papers published on this website are licensed under Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.