Description
This model is an adapted version of an mDeBERTa model already fine-tuned on SQuAD v2 (timpal0l/mdeberta-v3-base-squad2), further fine-tuned for extractive question answering on COVID-19 material in the Greek language.
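The snippet below is a minimal usage sketch, not taken from the card itself: it assumes the model is loaded from the Hugging Face Hub with the `transformers` question-answering pipeline, and the Greek question and context strings are illustrative placeholders.

```python
# Minimal usage sketch (assumed): extractive QA with the transformers pipeline.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="panosgriz/mdeberta-v3-base-squad2-covid-el-small",
)

# Illustrative Greek question/context; replace with your own text.
result = qa(
    question="Πώς μεταδίδεται ο κορονοϊός;",  # "How is the coronavirus transmitted?"
    context="Ο κορονοϊός μεταδίδεται κυρίως μέσω σταγονιδίων του αναπνευστικού.",
)
print(result["answer"], result["score"])
```

The pipeline returns the extracted answer span together with a confidence score and character offsets into the context.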
Training Details
- Training Dataset: COVID-QA-el-small
- Batch Size: 8
- Number of Epochs: 3
- Learning Rate: 3e-05
- Gradient Accumulation Steps: 2
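As a rough illustration, the hyperparameters above map onto a `transformers` `TrainingArguments` configuration as sketched below. The output directory name and the preprocessing of COVID-QA-el-small into SQuAD-style features are assumptions; the actual training script is not part of this card.

```python
# Hedged sketch: how the listed hyperparameters could be expressed with
# Hugging Face TrainingArguments. Not the original training script.
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    TrainingArguments,
)

base_model = "timpal0l/mdeberta-v3-base-squad2"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForQuestionAnswering.from_pretrained(base_model)

training_args = TrainingArguments(
    output_dir="mdeberta-v3-base-squad2-covid-el-small",  # assumed output name
    per_device_train_batch_size=8,   # Batch Size: 8
    num_train_epochs=3,              # Number of Epochs: 3
    learning_rate=3e-5,              # Learning Rate: 3e-05
    gradient_accumulation_steps=2,   # Gradient Accumulation Steps: 2
)

# A Trainer would then be built with these arguments and the COVID-QA-el-small
# examples tokenized into SQuAD-style features before calling trainer.train().
```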
Model Tree
- Model: panosgriz/mdeberta-v3-base-squad2-covid-el-small
- Base Model: timpal0l/mdeberta-v3-base-squad2