GysBERT v1

GysBERT v1 is a historical language model for Dutch, developed within the MacBERTh project.

The architecture is based on BERT base uncased and was trained with the original BERT pre-training codebase. The training material comes mostly from DBNL (the Digital Library for Dutch Literature) and the Delpher newspaper dump. Details can be found in the accompanying publication: "Non-Parametric Word Sense Disambiguation for Historical Languages".

The model has been successfully evaluated on Word Sense Disambiguation tasks, as discussed in the publication referenced above.
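As a quick start, the checkpoint can be loaded with the Hugging Face transformers library. The sketch below assumes the model is published on the Hub as emanjavacas/GysBERT and that, being BERT-based, it supports masked-token prediction; the example sentence is purely illustrative.

```python
# Minimal sketch: load GysBERT for masked-token prediction.
# Assumes the Hub id "emanjavacas/GysBERT"; the test sentence is illustrative.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

tokenizer = AutoTokenizer.from_pretrained("emanjavacas/GysBERT")
model = AutoModelForMaskedLM.from_pretrained("emanjavacas/GysBERT")

fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Use the tokenizer's own mask token rather than hard-coding "[MASK]".
sentence = f"de {tokenizer.mask_token} van amsterdam is oud."
for pred in fill(sentence):
    print(pred["token_str"], round(pred["score"], 3))
```

Because the model is uncased, lowercasing the input (as the tokenizer does automatically) is expected behaviour.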

An updated version with an enlarged pre-training dataset is due soon.
