Advancements in Natural Language Processing: Leveraging Transformer Models for Multilingual Text Generation
Abstract
Background: Recent advancements in Natural Language Processing (NLP) have revolutionized text generation, with Transformer models becoming the cornerstone of modern NLP tasks, particularly multilingual text generation.

Objective: This study examines the effectiveness of Transformer-based models in generating multilingual text across diverse languages, focusing on content fluency, coherence, and domain-specific applications.

Method: The research uses a series of pre-trained Transformer models, including BERT, GPT, mBERT, and XLM-R, trained on a multilingual corpus spanning more than 20 languages. Each model was fine-tuned on specific tasks such as text summarization, content creation, and sentiment analysis, and the quality of the generated content was assessed with BLEU, ROUGE, and accuracy. Models were trained on high-performance computing resources to ensure scalability and efficiency, and extensive comparisons with traditional NLP approaches were performed to quantify the improvements in multilingual generation.

Results: The Transformer models demonstrated considerable advancements in multilingual text generation. mBERT achieved an average BLEU score of 45%, surpassing traditional monolingual models by 20%. XLM-R, in particular, showed a remarkable 25% improvement in coherence across languages, including low-resource languages. The models generated high-quality content, reaching a 92% accuracy rate in task-specific domains, while reducing computational resource usage by 30%, enabling scalable multilingual deployment.

Conclusions: Transformer models show great promise for multilingual text generation, with notable improvements in translation quality, fluency, and efficiency. Future research should focus on reducing bias and further improving model scalability.
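To make the fine-tuning step concrete, the sketch below fine-tunes a multilingual checkpoint on a toy sentiment-classification set. The Hugging Face `transformers` and `datasets` libraries, the `xlm-roberta-base` checkpoint, the in-memory examples, and all hyperparameters are illustrative assumptions; the paper does not specify its exact training stack.

```python
# Minimal sketch: fine-tuning a multilingual Transformer (XLM-R) on a
# downstream task. Library choice, checkpoint, data, and hyperparameters
# are assumptions for illustration, not the paper's reported setup.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "xlm-roberta-base"  # one multilingual checkpoint among those studied

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Tiny in-memory stand-in for the paper's 20+-language corpus (illustrative only).
train_data = Dataset.from_dict({
    "text": ["Great product!", "Produit décevant.", "Sehr gut.", "Muy malo."],
    "label": [1, 0, 1, 0],
})

def tokenize(batch):
    # Truncate long inputs; padding is applied dynamically by the collator.
    return tokenizer(batch["text"], truncation=True, max_length=128)

train_data = train_data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="xlmr-finetuned",
        num_train_epochs=1,
        per_device_train_batch_size=2,
    ),
    train_dataset=train_data,
    data_collator=DataCollatorWithPadding(tokenizer),
)
trainer.train()
```

The same pattern extends to the other tasks named above: swapping the head (e.g., a sequence-to-sequence model for summarization) changes the model class and data columns but not the overall fine-tuning loop.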
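The evaluation step can be sketched just as briefly: scoring generated text against references with corpus-level BLEU and sentence-level ROUGE. The `sacrebleu` and `rouge_score` libraries and the example sentences below are assumptions, not the paper's reported pipeline.

```python
# Minimal sketch of BLEU and ROUGE evaluation. Hypotheses and references
# are hypothetical examples; the libraries are assumed implementations.
import sacrebleu
from rouge_score import rouge_scorer

hypotheses = ["The cat sits on the mat.", "Das Wetter ist heute schön."]
references = ["The cat is sitting on the mat.", "Das Wetter ist schön heute."]

# Corpus-level BLEU on a 0-100 scale; sacrebleu expects a list of reference sets.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU: {bleu.score:.1f}")

# ROUGE-L F1 per sentence, averaged over the corpus.
scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
rouge_l = sum(
    scorer.score(ref, hyp)["rougeL"].fmeasure
    for ref, hyp in zip(references, hypotheses)
) / len(hypotheses)
print(f"ROUGE-L F1: {rouge_l:.3f}")
```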
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.