Differences with https://huggingface.co./datasets/wikimedia/wikipedia

#22
by ola13 - opened

Hi, I'd like to understand how this dataset differs from the Wikimedia Foundation's one here: https://huggingface.co./datasets/wikimedia/wikipedia. Any recommendation regarding which one is better to use?

This one will take hours to build (or maybe even longer for the largest languages), and multi-process building may be broken, but you can get the most recent data from the source if you want. That one is already preprocessed so it's ready to use, but you only get the 202311 (November 2023) dump.
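For reference, a minimal sketch of the ready-to-use path with the preprocessed dataset; its configs follow the `<date>.<language>` naming, e.g. the November 2023 English dump:

```python
from datasets import load_dataset

# Load the preprocessed 2023-11-01 English snapshot; no build step needed,
# but the data is pinned to that dump date.
ds = load_dataset("wikimedia/wikipedia", "20231101.en")

print(ds["train"][0]["title"])  # each row has id, url, title, text fields
```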

I see, thanks for the context @guillermogabrielli !

ola13 changed discussion status to closed

Update on my previous answer:

It's possible to use https://huggingface.co./datasets/wikimedia/wikipedia with any valid dump date by passing revision='script' to load_dataset, which selects the script branch containing the updated builder (see the sketch at the end of this post).
Check https://dumps.wikimedia.org/backup-index.html for the latest dump date, then navigate to the dump page for enwiki (e.g. https://dumps.wikimedia.org/enwiki/20240701/) or another wiki to check whether that dump is done or still in progress, and to find the previous dump date if the latest one is not yet complete.
You must also limit the download worker count to 2 to avoid issues, or use a mirror that allows more parallel downloads.
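Putting that together, a hedged sketch of the build-from-source path; the `language` and `date` keyword arguments are my assumption about the script-branch builder's config parameters, so check the script itself for the exact names:

```python
from datasets import load_dataset, DownloadConfig

# Keep parallel downloads at 2, per the advice above; a mirror that
# permits more concurrent connections could use a higher value.
dl_config = DownloadConfig(num_proc=2)

ds = load_dataset(
    "wikimedia/wikipedia",
    language="en",            # assumed builder parameter
    date="20240701",          # a completed dump date from backup-index.html
    revision="script",        # the branch with the updated builder
    trust_remote_code=True,   # the script branch runs a loading script
    download_config=dl_config,
)
```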
