
GlotCC: An Open Broad-Coverage CommonCrawl Corpus and Pipeline for Minority Languages

Published on Oct 31, 2024
· Submitted by kargaranamir on Nov 1, 2024

Abstract

The need for large text corpora has increased with the advent of pretrained language models and, in particular, the discovery of scaling laws for these models. Most available corpora have sufficient data only for languages with large dominant communities. However, there is no corpus that (i) covers a wide range of minority languages, (ii) is generated by an open-source reproducible pipeline, and (iii) is rigorously cleaned of noise, making it trustworthy to use. We present GlotCC, a clean, document-level, 2TB general-domain corpus derived from CommonCrawl, covering more than 1000 languages. We make GlotCC and the system used to generate it - including the pipeline, language identification model, and filters - available to the research community. Corpus v. 1.0 https://huggingface.co./datasets/cis-lmu/GlotCC-v1, Pipeline v. 3.0 https://github.com/cisnlp/GlotCC.

Community


GlotCC is here! 💥 (Accepted at NeurIPS 2024!)

How can we scale NLP research to 1,000 languages? We built an open-source corpus and pipeline, including a LangID model, to mine data from the web.

Paper: https://arxiv.org/abs/2410.23825
Corpus: https://huggingface.co./datasets/cis-lmu/GlotCC-V1
Our pipeline and homepage for GlotCC: https://github.com/cisnlp/GlotCC


