MaLA-500: Massive Language Adaptation of Large Language Models


MaLA-500 is a novel large language model designed to cover 534 languages. It builds on LLaMA 2 7B and combines continued pretraining with vocabulary extension, expanding the vocabulary to 260,164 tokens, and LoRA low-rank adaptation.

  • Continued Pretraining: continues training LLaMA 2 on massively multilingual text so the model adapts to a wide range of languages.
  • LoRA Low-Rank Adaptation: trains lightweight low-rank adapters instead of updating all base-model parameters.
  • Vocabulary Extension: extends the tokenizer vocabulary to 260,164 tokens.
  • Multilingual Proficiency: trained on Glot500-c, covering 534 languages.

Please refer to our paper for more details.

How to Get Started with the Model

Requirements:

transformers>=4.36.1
peft>=0.6.2

Use the code below to get started with the model.

from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

# Load the LLaMA 2 7B base model and resize its embeddings to the extended vocabulary.
base_model = AutoModelForCausalLM.from_pretrained('meta-llama/Llama-2-7b-hf')
base_model.resize_token_embeddings(260164)

# Load the extended tokenizer and apply the MaLA-500 LoRA adapter.
tokenizer = AutoTokenizer.from_pretrained('MaLA-LM/mala-500-10b-v2')
model = PeftModel.from_pretrained(base_model, 'MaLA-LM/mala-500-10b-v2')
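
The adapter-wrapped model can then be used like any causal LM. Below is a minimal generation sketch; the prompt and decoding settings are illustrative and not taken from this repository.

# Hypothetical usage example: greedy generation with the adapted model.
text = "MaLA-500 covers 534 languages."
inputs = tokenizer(text, return_tensors='pt')
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))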

Codebase

Vocabulary Extension

Code is in the ./tokenization directory.
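
As a rough sketch of what vocabulary extension involves (not the repository's actual script; the corpus file, vocabulary size, and model prefix below are assumptions), a new multilingual SentencePiece model can be trained and its pieces merged into the LLaMA 2 tokenizer before resizing the embeddings:

import sentencepiece as spm
from transformers import AutoTokenizer, AutoModelForCausalLM

# Hypothetical: train a SentencePiece model on multilingual text (file name and size are illustrative).
spm.SentencePieceTrainer.train(
    input='multilingual_corpus.txt',
    model_prefix='multilingual_sp',
    vocab_size=250000,
    model_type='bpe',
)

# Merge the new pieces into the original LLaMA 2 tokenizer.
sp = spm.SentencePieceProcessor(model_file='multilingual_sp.model')
new_pieces = [sp.id_to_piece(i) for i in range(sp.get_piece_size())]
tokenizer = AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-hf')
tokenizer.add_tokens(new_pieces)

# Resize the model's embedding matrix to match the extended vocabulary.
model = AutoModelForCausalLM.from_pretrained('meta-llama/Llama-2-7b-hf')
model.resize_token_embeddings(len(tokenizer))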

Continued Pretraining

Code is in the ./continued_pretraining directory.

Customize the run.sh script for your own cluster or workstation; the provided script targets SLURM-based systems. You may also want to use DeepSpeed; example configs are under ./continued_pretraining/config.
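
For orientation, here is a minimal sketch of how LoRA continued pretraining is typically set up with peft on top of the vocabulary-extended base model. The rank, alpha, dropout, and target modules below are assumptions for illustration, not necessarily the values used in the paper or configs.

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Base model with embeddings resized to the extended vocabulary.
base_model = AutoModelForCausalLM.from_pretrained('meta-llama/Llama-2-7b-hf')
base_model.resize_token_embeddings(260164)

# Hypothetical LoRA hyperparameters; see the paper and ./continued_pretraining/config for actual settings.
lora_config = LoraConfig(
    r=8,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=['q_proj', 'v_proj'],
    task_type='CAUSAL_LM',
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()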

Evaluation

Code is in the ./evaluation directory.

Citation

@misc{lin2024mala500,
      title={MaLA-500: Massive Language Adaptation of Large Language Models}, 
      author={Peiqin Lin and Shaoxiong Ji and Jörg Tiedemann and André F. T. Martins and Hinrich Schütze},
      year={2024},
      eprint={2401.13303},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
