SetFit Aspect Model with sentence-transformers/paraphrase-mpnet-base-v2

This is a SetFit model that can be used for Aspect Based Sentiment Analysis (ABSA). This SetFit model uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification. In particular, this model is in charge of filtering aspect span candidates.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer, as sketched below.
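In setfit, both phases run inside a single Trainer.train() call. Below is a minimal sketch on a hypothetical two-example toy set; the data and epoch count are illustrative, not this model's actual training setup:

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical toy data in this model's "span:text" input format
train_dataset = Dataset.from_dict({
    "text": [
        "price:The tax incentive will just raise the price of Teslas.",
        "Q4:Tesla could deliver 500K+ vehicles in Q4.",
    ],
    "label": ["aspect", "no aspect"],
})

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
args = TrainingArguments(num_epochs=1)

# Phase 1 (contrastive fine-tuning of the embedding body) and
# phase 2 (fitting the classification head) both happen here
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()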

This model was trained within the context of a larger system for ABSA, which works as follows:

  1. Use a spaCy model to select possible aspect span candidates.
  2. Use this SetFit model to filter these possible aspect span candidates (a sketch of steps 1 and 2 follows this list).
  3. Use a SetFit model to classify the filtered aspect span candidates.
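A minimal sketch of steps 1 and 2, assuming spaCy noun chunks as the candidate source; note that the real pipeline is wrapped inside AbsaModel, and loading this filter checkpoint as a plain SetFitModel is an assumption made for illustration:

import spacy
from setfit import SetFitModel

nlp = spacy.load("en_core_web_sm")  # assumed spaCy pipeline
aspect_model = SetFitModel.from_pretrained(
    "NazmusAshrafi/setfit-MiniLM-mpnet-absa-tesla-tweet-aspect"
)

text = "Tesla could deliver 500K+ vehicles in Q4."
# Step 1: select aspect span candidates (noun chunks as a stand-in)
candidates = [chunk.text for chunk in nlp(text).noun_chunks]
# Step 2: filter candidates with this model, using its "span:text" input format
inputs = [f"{span}:{text}" for span in candidates]
kept = [span for span, label in zip(candidates, aspect_model.predict(inputs))
        if label == "aspect"]
print(kept)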

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2
  • Classification head: a LogisticRegression instance
  • SetFitABSA component: Aspect (filters aspect span candidates)
  • Number of Classes: 2 (aspect, no aspect)

Model Sources

  • Repository: SetFit on GitHub (https://github.com/huggingface/setfit)
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)
  • Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts (https://huggingface.co/blog/setfit)

Model Labels

Each example below pairs a candidate span with its full source text, encoded as "span:text".
no aspect
  • 'Tesla:Tesla could deliver 500K+ vehicles in Q4, increasing annual deliveries by 50%. Due to headwinds in 2022, now the manufacturer is ramping up production even harder to get as many EVs on the road as possible\n\n #Tesla $TSLA \nhttps://t.co/b2NCtZqDYn'
  • 'vehicles:Tesla could deliver 500K+ vehicles in Q4, increasing annual deliveries by 50%. Due to headwinds in 2022, now the manufacturer is ramping up production even harder to get as many EVs on the road as possible\n\n #Tesla $TSLA \nhttps://t.co/b2NCtZqDYn'
  • 'Q4:Tesla could deliver 500K+ vehicles in Q4, increasing annual deliveries by 50%. Due to headwinds in 2022, now the manufacturer is ramping up production even harder to get as many EVs on the road as possible\n\n #Tesla $TSLA \nhttps://t.co/b2NCtZqDYn'
aspect
  • "profit:I'm pretty sure, all an EV tax incentive will do, is raise the price of Teslas, at least for the next few years.\n\ni.e. just more profit for $TSLA\nAs if demand wasn't abundant enough already."
  • "price:I'm pretty sure, all an EV tax incentive will do, is raise the price of Teslas, at least for the next few years.\n\ni.e. just more profit for $TSLA\nAs if demand wasn't abundant enough already."
  • 'car:John Hennessey gets a $TSLA Plaid. \nA retired OEM executive describes Tesla as a $30k car with $70k in batteries. \nThe perfect description of a Tesla https://t.co/m5J5m3AuMJ'

Evaluation

Metrics

Label  Accuracy
all    0.9798
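For reference, a comparable accuracy figure could be recomputed for the filter in isolation with scikit-learn; the held-out candidates and gold labels below are placeholders, and loading the checkpoint as a plain SetFitModel is again an assumption:

from sklearn.metrics import accuracy_score
from setfit import SetFitModel

aspect_model = SetFitModel.from_pretrained(
    "NazmusAshrafi/setfit-MiniLM-mpnet-absa-tesla-tweet-aspect"
)
eval_texts = [
    "price:The tax incentive will just raise the price of Teslas.",  # placeholder
    "Q4:Tesla could deliver 500K+ vehicles in Q4.",                  # placeholder
]
gold_labels = ["aspect", "no aspect"]
print(accuracy_score(gold_labels, aspect_model.predict(eval_texts)))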

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit
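Because the pipeline relies on spaCy for aspect span candidate selection, a spaCy English pipeline must be available as well, for example:

python -m spacy download en_core_web_sm

(The exact spaCy model used by this checkpoint is configured in the saved model; en_core_web_sm here is an assumption.)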

Then you can load this model and run inference.

from setfit import AbsaModel

# Download from the 🤗 Hub
model = AbsaModel.from_pretrained(
    "NazmusAshrafi/setfit-MiniLM-mpnet-absa-tesla-tweet-aspect",
    "NazmusAshrafi/setfit-MiniLM-mpnet-absa-tesla-tweet-polarity",
)
# Run inference
preds = model("The food was great, but the venue is just way too busy.")
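# `preds` holds one list of {span, polarity} dicts per input sentence,
# e.g. (hypothetical output):
# [[{'span': 'food', 'polarity': 'positive'},
#   {'span': 'venue', 'polarity': 'negative'}]]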

Training Details

Training Set Metrics

Training set   Min   Median    Max
Word count     11    41.4789   57

Label       Training Sample Count
no aspect   560
aspect      33

Training Hyperparameters

  • batch_size: (16, 2)
  • num_epochs: (1, 16)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
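For reference, here is a minimal sketch of how a run with these hyperparameters could be launched through setfit's ABSA trainer. The one-row dataset is a placeholder, the spaCy model name is an assumption, and the column layout ("text", "span", "label", "ordinal") is the format AbsaTrainer expects:

from datasets import Dataset
from setfit import AbsaModel, AbsaTrainer, TrainingArguments

# Placeholder training row; real data has many rows in this format
train_dataset = Dataset.from_dict({
    "text": ["The tax incentive will just raise the price of Teslas."],
    "span": ["price"],
    "label": ["positive"],
    "ordinal": [0],
})

model = AbsaModel.from_pretrained(
    "sentence-transformers/paraphrase-mpnet-base-v2",
    spacy_model="en_core_web_sm",  # assumption
)
args = TrainingArguments(
    batch_size=(16, 2),                 # (embedding phase, classifier phase)
    num_epochs=(1, 16),
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    seed=42,
)
trainer = AbsaTrainer(model, args=args, train_dataset=train_dataset)
trainer.train()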

Training Results

Epoch Step Training Loss Validation Loss
0.0001 1 0.2511 -
0.0025 50 0.2558 -
0.0051 100 0.2147 -
0.0076 150 0.2265 -
0.0101 200 0.2474 -
0.0127 250 0.2286 -
0.0152 300 0.1717 -
0.0178 350 0.0737 -
0.0203 400 0.0231 -
0.0228 450 0.0069 -
0.0254 500 0.0032 -
0.0279 550 0.002 -
0.0304 600 0.0008 -
0.0330 650 0.0023 -
0.0355 700 0.002 -
0.0381 750 0.0008 -
0.0406 800 0.0019 -
0.0431 850 0.0003 -
0.0457 900 0.0004 -
0.0482 950 0.0005 -
0.0507 1000 0.0003 -
0.0533 1050 0.0006 -
0.0558 1100 0.0071 -
0.0584 1150 0.0001 -
0.0609 1200 0.0001 -
0.0634 1250 0.0001 -
0.0660 1300 0.0001 -
0.0685 1350 0.0004 -
0.0710 1400 0.0001 -
0.0736 1450 0.0002 -
0.0761 1500 0.0002 -
0.0787 1550 0.0002 -
0.0812 1600 0.0001 -
0.0837 1650 0.0001 -
0.0863 1700 0.0007 -
0.0888 1750 0.0001 -
0.0913 1800 0.0002 -
0.0939 1850 0.0011 -
0.0964 1900 0.0007 -
0.0990 1950 0.001 -
0.1015 2000 0.0003 -
0.1040 2050 0.0004 -
0.1066 2100 0.0006 -
0.1091 2150 0.0004 -
0.1116 2200 0.0 -
0.1142 2250 0.0 -
0.1167 2300 0.0001 -
0.1193 2350 0.0017 -
0.1218 2400 0.0007 -
0.1243 2450 0.0023 -
0.1269 2500 0.0 -
0.1294 2550 0.0 -
0.1319 2600 0.0007 -
0.1345 2650 0.0 -
0.1370 2700 0.0004 -
0.1396 2750 0.0001 -
0.1421 2800 0.0002 -
0.1446 2850 0.0019 -
0.1472 2900 0.0002 -
0.1497 2950 0.0001 -
0.1522 3000 0.0 -
0.1548 3050 0.0001 -
0.1573 3100 0.0 -
0.1598 3150 0.0001 -
0.1624 3200 0.0007 -
0.1649 3250 0.0 -
0.1675 3300 0.0002 -
0.1700 3350 0.0004 -
0.1725 3400 0.0 -
0.1751 3450 0.0 -
0.1776 3500 0.0 -
0.1801 3550 0.0 -
0.1827 3600 0.0001 -
0.1852 3650 0.0 -
0.1878 3700 0.0001 -
0.1903 3750 0.0 -
0.1928 3800 0.0 -
0.1954 3850 0.0 -
0.1979 3900 0.0 -
0.2004 3950 0.0 -
0.2030 4000 0.0 -
0.2055 4050 0.0019 -
0.2081 4100 0.0 -
0.2106 4150 0.0001 -
0.2131 4200 0.0 -
0.2157 4250 0.0 -
0.2182 4300 0.0 -
0.2207 4350 0.0 -
0.2233 4400 0.0005 -
0.2258 4450 0.0 -
0.2284 4500 0.0 -
0.2309 4550 0.0 -
0.2334 4600 0.0 -
0.2360 4650 0.0 -
0.2385 4700 0.0009 -
0.2410 4750 0.0 -
0.2436 4800 0.0 -
0.2461 4850 0.0 -
0.2487 4900 0.0002 -
0.2512 4950 0.0 -
0.2537 5000 0.0011 -
0.2563 5050 0.0 -
0.2588 5100 0.0 -
0.2613 5150 0.0 -
0.2639 5200 0.0 -
0.2664 5250 0.0 -
0.2690 5300 0.0 -
0.2715 5350 0.0026 -
0.2740 5400 0.0 -
0.2766 5450 0.0021 -
0.2791 5500 0.0 -
0.2816 5550 0.0001 -
0.2842 5600 0.0 -
0.2867 5650 0.0001 -
0.2893 5700 0.0 -
0.2918 5750 0.0 -
0.2943 5800 0.0 -
0.2969 5850 0.0 -
0.2994 5900 0.0 -
0.3019 5950 0.0 -
0.3045 6000 0.0 -
0.3070 6050 0.0 -
0.3096 6100 0.0 -
0.3121 6150 0.0003 -
0.3146 6200 0.0 -
0.3172 6250 0.0 -
0.3197 6300 0.0 -
0.3222 6350 0.0001 -
0.3248 6400 0.0009 -
0.3273 6450 0.0 -
0.3298 6500 0.0 -
0.3324 6550 0.0 -
0.3349 6600 0.0 -
0.3375 6650 0.0 -
0.3400 6700 0.0 -
0.3425 6750 0.0 -
0.3451 6800 0.0 -
0.3476 6850 0.0 -
0.3501 6900 0.0 -
0.3527 6950 0.0 -
0.3552 7000 0.0 -
0.3578 7050 0.0 -
0.3603 7100 0.0536 -
0.3628 7150 0.0 -
0.3654 7200 0.0 -
0.3679 7250 0.0 -
0.3704 7300 0.0 -
0.3730 7350 0.0 -
0.3755 7400 0.0 -
0.3781 7450 0.0 -
0.3806 7500 0.0 -
0.3831 7550 0.0 -
0.3857 7600 0.0 -
0.3882 7650 0.0 -
0.3907 7700 0.0 -
0.3933 7750 0.0019 -
0.3958 7800 0.0 -
0.3984 7850 0.0 -
0.4009 7900 0.0548 -
0.4034 7950 0.0 -
0.4060 8000 0.0053 -
0.4085 8050 0.0 -
0.4110 8100 0.0 -
0.4136 8150 0.0 -
0.4161 8200 0.0 -
0.4187 8250 0.0624 -
0.4212 8300 0.0622 -
0.4237 8350 0.0618 -
0.4263 8400 0.0001 -
0.4288 8450 0.0 -
0.4313 8500 0.0001 -
0.4339 8550 0.0 -
0.4364 8600 0.0 -
0.4390 8650 0.0 -
0.4415 8700 0.0012 -
0.4440 8750 0.0001 -
0.4466 8800 0.0005 -
0.4491 8850 0.0 -
0.4516 8900 0.0 -
0.4542 8950 0.0 -
0.4567 9000 0.0 -
0.4593 9050 0.0 -
0.4618 9100 0.0 -
0.4643 9150 0.0 -
0.4669 9200 0.0 -
0.4694 9250 0.0408 -
0.4719 9300 0.0498 -
0.4745 9350 0.0 -
0.4770 9400 0.0 -
0.4795 9450 0.0017 -
0.4821 9500 0.0 -
0.4846 9550 0.0 -
0.4872 9600 0.0 -
0.4897 9650 0.0 -
0.4922 9700 0.0 -
0.4948 9750 0.0 -
0.4973 9800 0.0589 -
0.4998 9850 0.0 -
0.5024 9900 0.0 -
0.5049 9950 0.0015 -
0.5075 10000 0.0 -
0.5100 10050 0.0 -
0.5125 10100 0.0 -
0.5151 10150 0.0 -
0.5176 10200 0.0 -
0.5201 10250 0.0 -
0.5227 10300 0.0013 -
0.5252 10350 0.0023 -
0.5278 10400 0.0 -
0.5303 10450 0.0 -
0.5328 10500 0.0 -
0.5354 10550 0.0003 -
0.5379 10600 0.0 -
0.5404 10650 0.0 -
0.5430 10700 0.0002 -
0.5455 10750 0.0 -
0.5481 10800 0.0 -
0.5506 10850 0.0005 -
0.5531 10900 0.0 -
0.5557 10950 0.0 -
0.5582 11000 0.0 -
0.5607 11050 0.0 -
0.5633 11100 0.0 -
0.5658 11150 0.0 -
0.5684 11200 0.0 -
0.5709 11250 0.0 -
0.5734 11300 0.0 -
0.5760 11350 0.0008 -
0.5785 11400 0.0 -
0.5810 11450 0.0024 -
0.5836 11500 0.0 -
0.5861 11550 0.0 -
0.5887 11600 0.0 -
0.5912 11650 0.0 -
0.5937 11700 0.001 -
0.5963 11750 0.0 -
0.5988 11800 0.0 -
0.6013 11850 0.0 -
0.6039 11900 0.0527 -
0.6064 11950 0.0021 -
0.6090 12000 0.0 -
0.6115 12050 0.0 -
0.6140 12100 0.0 -
0.6166 12150 0.0 -
0.6191 12200 0.0 -
0.6216 12250 0.0 -
0.6242 12300 0.0 -
0.6267 12350 0.0006 -
0.6292 12400 0.0 -
0.6318 12450 0.0 -
0.6343 12500 0.001 -
0.6369 12550 0.0017 -
0.6394 12600 0.0 -
0.6419 12650 0.0 -
0.6445 12700 0.0 -
0.6470 12750 0.0012 -
0.6495 12800 0.0 -
0.6521 12850 0.0 -
0.6546 12900 0.0 -
0.6572 12950 0.0434 -
0.6597 13000 0.0 -
0.6622 13050 0.0 -
0.6648 13100 0.0003 -
0.6673 13150 0.0 -
0.6698 13200 0.0 -
0.6724 13250 0.0003 -
0.6749 13300 0.0 -
0.6775 13350 0.0 -
0.6800 13400 0.0005 -
0.6825 13450 0.0 -
0.6851 13500 0.0011 -
0.6876 13550 0.0475 -
0.6901 13600 0.0 -
0.6927 13650 0.0007 -
0.6952 13700 0.0 -
0.6978 13750 0.0 -
0.7003 13800 0.0 -
0.7028 13850 0.0 -
0.7054 13900 0.0 -
0.7079 13950 0.0015 -
0.7104 14000 0.0034 -
0.7130 14050 0.0009 -
0.7155 14100 0.0 -
0.7181 14150 0.0009 -
0.7206 14200 0.0 -
0.7231 14250 0.0003 -
0.7257 14300 0.0004 -
0.7282 14350 0.0 -
0.7307 14400 0.0003 -
0.7333 14450 0.0 -
0.7358 14500 0.0 -
0.7384 14550 0.0 -
0.7409 14600 0.0 -
0.7434 14650 0.0 -
0.7460 14700 0.0018 -
0.7485 14750 0.0012 -
0.7510 14800 0.0 -
0.7536 14850 0.0 -
0.7561 14900 0.0013 -
0.7587 14950 0.0 -
0.7612 15000 0.0 -
0.7637 15050 0.0 -
0.7663 15100 0.0 -
0.7688 15150 0.0 -
0.7713 15200 0.0 -
0.7739 15250 0.0 -
0.7764 15300 0.0 -
0.7790 15350 0.0 -
0.7815 15400 0.0 -
0.7840 15450 0.0 -
0.7866 15500 0.0 -
0.7891 15550 0.0 -
0.7916 15600 0.0004 -
0.7942 15650 0.0005 -
0.7967 15700 0.0 -
0.7992 15750 0.0 -
0.8018 15800 0.0 -
0.8043 15850 0.0 -
0.8069 15900 0.0 -
0.8094 15950 0.0555 -
0.8119 16000 0.0 -
0.8145 16050 0.0 -
0.8170 16100 0.0 -
0.8195 16150 0.0 -
0.8221 16200 0.0 -
0.8246 16250 0.0007 -
0.8272 16300 0.0 -
0.8297 16350 0.0 -
0.8322 16400 0.0 -
0.8348 16450 0.0003 -
0.8373 16500 0.0 -
0.8398 16550 0.0012 -
0.8424 16600 0.0 -
0.8449 16650 0.0 -
0.8475 16700 0.0 -
0.8500 16750 0.0 -
0.8525 16800 0.0 -
0.8551 16850 0.0 -
0.8576 16900 0.0007 -
0.8601 16950 0.0 -
0.8627 17000 0.001 -
0.8652 17050 0.0 -
0.8678 17100 0.0 -
0.8703 17150 0.0 -
0.8728 17200 0.0 -
0.8754 17250 0.0 -
0.8779 17300 0.0 -
0.8804 17350 0.0 -
0.8830 17400 0.0007 -
0.8855 17450 0.0 -
0.8881 17500 0.0 -
0.8906 17550 0.0505 -
0.8931 17600 0.0 -
0.8957 17650 0.0 -
0.8982 17700 0.0008 -
0.9007 17750 0.0 -
0.9033 17800 0.0003 -
0.9058 17850 0.0 -
0.9084 17900 0.0 -
0.9109 17950 0.0009 -
0.9134 18000 0.0 -
0.9160 18050 0.0 -
0.9185 18100 0.0 -
0.9210 18150 0.0 -
0.9236 18200 0.0 -
0.9261 18250 0.0 -
0.9287 18300 0.0 -
0.9312 18350 0.0008 -
0.9337 18400 0.0 -
0.9363 18450 0.0 -
0.9388 18500 0.0 -
0.9413 18550 0.0 -
0.9439 18600 0.0 -
0.9464 18650 0.0 -
0.9489 18700 0.0 -
0.9515 18750 0.0 -
0.9540 18800 0.0 -
0.9566 18850 0.0 -
0.9591 18900 0.0 -
0.9616 18950 0.0 -
0.9642 19000 0.0 -
0.9667 19050 0.0 -
0.9692 19100 0.0 -
0.9718 19150 0.0 -
0.9743 19200 0.0 -
0.9769 19250 0.0 -
0.9794 19300 0.0005 -
0.9819 19350 0.0 -
0.9845 19400 0.0 -
0.9870 19450 0.0 -
0.9895 19500 0.0 -
0.9921 19550 0.0011 -
0.9946 19600 0.0 -
0.9972 19650 0.0 -
0.9997 19700 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 2.2.2
  • spaCy: 3.6.1
  • Transformers: 4.35.2
  • PyTorch: 2.1.0+cu121
  • Datasets: 2.16.1
  • Tokenizers: 0.15.1

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}