rttl-ai/SentyBert
Model Details
Model Description: This model is a fine-tuned checkpoint of bert-large-uncased, further trained on SST-2. It reaches an accuracy of 99.92% on the SST-2 dev set.
- Developed by: rttl-ai
- Model Type: Text Classification
- Language(s): English
- License: Apache-2.0
- Resources for more information:
  - The model was pre-trained with task-adaptive pre-training (TAPT), using an increased masking rate, no corruption strategy, and whole-word masking (WWM), following this paper
  - Fine-tuned on SST with subtrees
  - Fine-tuned on SST-2
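Since this is a standard text-classification checkpoint, it can presumably be loaded with the Hugging Face `pipeline` API. A minimal sketch (the label names the model emits depend on its `id2label` config, so they are not shown here):

```python
from transformers import pipeline

# Load the checkpoint with a text-classification pipeline.
# Assumes the model repo ships a compatible tokenizer and
# sequence-classification head.
classifier = pipeline("text-classification", model="rttl-ai/SentyBert")

# Each result is a dict with a "label" and a confidence "score".
result = classifier("A gorgeous, witty, seductive movie.")
print(result)
```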
Evaluation results
- F1 (macro) on SST-2 validation set (self-reported): 0.999
- Accuracy on SST-2 validation set (self-reported): 0.999