{"id":813169,"projects":[218],"description":"XLM-RoBERTa model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Unsupervised Cross-lingual Representation Learning at Scale by Conneau et al. and first released in this repository.\r\n\r\nDisclaimer: The team releasing XLM-RoBERTa did not write a model card for this model so this model card has been written by the Hugging Face team.","image":"","tags":"Fill-Mask, Transformers, PyTorch, TensorFlow, JAX, ONNX, Safetensors,  94 languages, xlm-roberta, exbert, Inference Endpoints, arxiv: 1911.02116,  License: mit","type":"","title":"XLM-RoBERTa (base-sized model)","url":"https://huggingface.co/FacebookAI/xlm-roberta-base","authors":[1411],"rubrics":[146]}