Model Card for mt5-small en-nl translation

The mt5-small en-nl translation model is a finetuned version of google/mt5-small.

It was finetuned on 237k rows of the iwslt2017 dataset and roughly 38k rows of the opus_books dataset. The model was trained for 15 epochs with a batch size of 16.
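
The exact training script is not published with this card; the sketch below is only a plausible reconstruction of the described setup using Hugging Face's Seq2SeqTrainer with the stated 15 epochs and batch size of 16. The dataset configuration names (iwslt2017-en-nl, en-nl for opus_books), the >>nl<< source prefix, the sequence length, and the preprocessing are assumptions.

from datasets import concatenate_datasets, load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# assumed en-nl configurations; keep only the shared "translation" column so
# the two datasets can be concatenated
iwslt = load_dataset("iwslt2017", "iwslt2017-en-nl", split="train").select_columns(["translation"])
books = load_dataset("opus_books", "en-nl", split="train").select_columns(["translation"])
train_raw = concatenate_datasets([iwslt, books])

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

def preprocess(batch):
    # assumption: the >>nl<< identifier is prepended to every English source
    sources = [">>nl<< " + pair["en"] for pair in batch["translation"]]
    targets = [pair["nl"] for pair in batch["translation"]]
    model_inputs = tokenizer(sources, max_length=128, truncation=True)
    labels = tokenizer(text_target=targets, max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_data = train_raw.map(preprocess, batched=True, remove_columns=["translation"])

args = Seq2SeqTrainingArguments(
    output_dir="mt5-small_en-nl_translation",
    num_train_epochs=15,             # stated in the card
    per_device_train_batch_size=16,  # stated in the card
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_data,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()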

How to use

Install dependencies

pip install transformers
pip install sentencepiece
pip install protobuf

You can use the following code for model inference. The model was finetuned with a target-language identifier (>>nl<<) that needs to be prepended to the input for the best results.

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, GenerationConfig

# load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("Michielo/mt5-small_en-nl_translation")
model = AutoModelForSeq2SeqLM.from_pretrained("Michielo/mt5-small_en-nl_translation")

# tokenize input
inputs = tokenizer(">>nl<< Your English text here", return_tensors="pt")
# calculate the output
outputs = model.generate(**inputs)
# decode and print
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
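
The snippet above uses the library's default generation settings. The imported GenerationConfig can be used to control decoding explicitly; the values below (beam search, a 128-token output limit) are illustrative and not necessarily the settings used for the reported benchmark scores.

# optional: explicit decoding settings (illustrative values)
generation_config = GenerationConfig(
    max_new_tokens=128,   # upper bound on the translated output length
    num_beams=4,          # beam search instead of greedy decoding
    early_stopping=True,
)
outputs = model.generate(**inputs, generation_config=generation_config)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))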

Benchmarks

You can replicate our benchmark scores here without writing any code yourself.

| Benchmark | Score  |
|-----------|--------|
| BLEU      | 43.63% |
| chr-F     | 62.25% |
| chr-F++   | 61.87% |
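
If you prefer to script the evaluation yourself, BLEU, chr-F, and chr-F++ are commonly computed with sacreBLEU. Below is a minimal sketch assuming that tooling (pip install sacrebleu first); the example sentences stand in for an actual test set, which is not specified here.

from sacrebleu.metrics import BLEU, CHRF

# hypotheses: model translations; references: one list of gold Dutch sentences
hypotheses = ["Dit is een voorbeeldvertaling."]
references = [["Dit is een voorbeeldvertaling."]]

print(BLEU().corpus_score(hypotheses, references))               # BLEU
print(CHRF().corpus_score(hypotheses, references))               # chr-F
print(CHRF(word_order=2).corpus_score(hypotheses, references))   # chr-F++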

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
