
Cross-Lingual NLP: Transfer Learning and Multilingual Models for Low-Resource Languages

EasyChair Preprint no. 12273

8 pages
Date: February 24, 2024

Abstract

This paper explores the role of transfer learning and multilingual models in addressing the challenges of low-resource languages, where limited data availability poses a significant obstacle to traditional NLP approaches. Transfer learning, in which knowledge gained from training on one task is applied to a different but related task, has emerged as a powerful tool in NLP. By pre-training models on high-resource languages and fine-tuning them on low-resource languages, transfer learning makes effective use of scarce data and improves performance across a range of NLP tasks. Multilingual models, which handle multiple languages within a single framework, offer another promising approach for low-resource settings.
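To make the pre-train/fine-tune workflow described above concrete, the following is a minimal sketch in Python using the Hugging Face transformers and datasets libraries. It assumes a multilingual encoder (xlm-roberta-base, pre-trained on roughly 100 languages) and a hypothetical small labeled corpus in a low-resource language (the file names lrl_train.csv and lrl_dev.csv, with "text" and "label" columns, are placeholders, not from the paper):

# Cross-lingual transfer sketch: fine-tune a multilingual pretrained
# encoder on a small low-resource-language classification dataset.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

# XLM-R's pre-training on many languages lets its representations
# transfer to languages with little labeled data.
model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical low-resource corpus: a few hundred labeled sentences,
# stored as CSV with "text" and "label" columns.
dataset = load_dataset(
    "csv",
    data_files={"train": "lrl_train.csv", "validation": "lrl_dev.csv"},
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="xlmr-lrl",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=2e-5,  # small LR to preserve pre-trained knowledge
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
)
trainer.train()

Only the lightweight classification head and the fine-tuned encoder weights adapt to the target language; the bulk of the linguistic knowledge comes from the high-resource pre-training, which is what makes the approach viable with limited data.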

Keyphrases: transfer learning, multilingual models, low-resource languages

BibTeX entry
BibTeX has no dedicated entry type for preprints. The following workaround produces the correct reference:
@Booklet{EasyChair:12273,
  author = {Kurez Oroy and Jhon Danny},
  title = {Cross-Lingual NLP: Transfer Learning and Multilingual Models for Low-Resource Languages},
  howpublished = {EasyChair Preprint no. 12273},
  year = {EasyChair, 2024}}