
SetFit Aspect Model with firqaaa/indo-setfit-absa-bert-base-restaurants-aspect

This is a SetFit model that can be used for Aspect Based Sentiment Analysis (ABSA). This SetFit model uses firqaaa/indo-setfit-absa-bert-base-restaurants-aspect as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification. In particular, this model is in charge of filtering aspect span candidates.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.

This model was trained within the context of a larger system for ABSA, which looks like so:

  1. Use a spaCy model to select possible aspect span candidates (see the sketch after this list).
  2. Use this SetFit model to filter these possible aspect span candidates.
  3. Use a SetFit model to classify the filtered aspect span candidates.
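As an illustration of step 1, aspect span candidates are typically taken from the noun chunks of a spaCy pipeline. A minimal sketch (the spaCy pipeline and the sentence are placeholders; the pipeline actually used for this model is not recorded in this card):

import spacy

# Requires: python -m spacy download en_core_web_sm
# (an English pipeline is used here purely to illustrate the mechanism)
nlp = spacy.load("en_core_web_sm")

doc = nlp("The food was great, but the venue is just way too busy.")
candidates = [chunk.text for chunk in doc.noun_chunks]
print(candidates)  # noun-chunk spans that this SetFit model then filters in step 2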

Model Details

Model Description

  • Model Type: SetFit aspect filter for ABSA (classifies candidate spans as "aspect" or "no aspect")
  • Sentence Transformer body: firqaaa/indo-setfit-absa-bert-base-restaurants-aspect
  • Classification head: LogisticRegression
  • Model size: ~124M parameters (F32, safetensors)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Each training example pairs a candidate span with the full review it came from, encoded as 'span:review text'.

Label: aspect
  • 'story:saranku developer menciptakan story menarik kehilangan player player yg bertahan repetitif monoton update size gede doang yg isinya chest itupun sampah puzzle yg rumit chest nya sampah story kebanyakan npc teyvat story utama mc dilupain gak difokusin map kalo udah kosong ya nyampah bikin size gede doang main 3 monoton perkembangan buruk'
  • 'reward:tolong ditambah reward gachanya player kesulitan primo quest eksplorasi 100 dasar developer kapitalis game monoton ramah player kekurangan bahan gacha karakter'
  • 'event:cuman saran pelit biar player gak kabur game sebelah hadiah event quest perbaiki udah nunggu event hadiah cuman gitu gitu aja sampek event selesai primogemnya 10 pull gacha gak tingakat kesulitan beda hadiah main kabur kalok pelit 1 jariang mohon perbaiki server indonya trimaksih'
Label: no aspect
  • 'saranku developer:saranku developer menciptakan story menarik kehilangan player player yg bertahan repetitif monoton update size gede doang yg isinya chest itupun sampah puzzle yg rumit chest nya sampah story kebanyakan npc teyvat story utama mc dilupain gak difokusin map kalo udah kosong ya nyampah bikin size gede doang main 3 monoton perkembangan buruk'
  • 'story:saranku developer menciptakan story menarik kehilangan player player yg bertahan repetitif monoton update size gede doang yg isinya chest itupun sampah puzzle yg rumit chest nya sampah story kebanyakan npc teyvat story utama mc dilupain gak difokusin map kalo udah kosong ya nyampah bikin size gede doang main 3 monoton perkembangan buruk'
  • 'kehilangan player player:saranku developer menciptakan story menarik kehilangan player player yg bertahan repetitif monoton update size gede doang yg isinya chest itupun sampah puzzle yg rumit chest nya sampah story kebanyakan npc teyvat story utama mc dilupain gak difokusin map kalo udah kosong ya nyampah bikin size gede doang main 3 monoton perkembangan buruk'

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit
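ABSA inference also relies on spaCy; recent SetFit releases expose this as an optional extra, so if a plain setfit install does not pull it in, try:

pip install "setfit[absa]"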

Then you can load this model and run inference.

from setfit import AbsaModel

# Download from the 🤗 Hub
model = AbsaModel.from_pretrained(
    "Funnyworld1412/ABSA_review_game_genshin_impact-aspect",
    "Funnyworld1412/ABSA_review_game_genshin_impact-polarity",
)
# Run inference (the model was trained on Indonesian Genshin Impact reviews,
# so Indonesian input works best; the sentence below is illustrative)
preds = model("storynya menarik, tapi reward event terlalu pelit.")
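preds is expected to be a list with one {'span': ..., 'polarity': ...} dict per extracted aspect, along the lines of (values are illustrative, not actual model output):

[{'span': 'story', 'polarity': 'positif'}, {'span': 'reward', 'polarity': 'negatif'}]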

Training Details

Training Set Metrics

Training set    Min    Median    Max
Word count      4      31.2629   70

Label        Training Sample Count
no aspect    1049
aspect       324

Training Hyperparameters

  • batch_size: (4, 4)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 10
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
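These values correspond to fields of SetFit's TrainingArguments. A minimal sketch of the equivalent configuration (reconstructed from the list above, not taken from the original training script, which is not part of this card):

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(4, 4),                  # (embedding fine-tuning, classifier head)
    num_epochs=(1, 1),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=10,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    warmup_proportion=0.1,
    end_to_end=False,
    use_amp=False,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)

# distance_metric and margin from the list above only apply to triplet-style
# losses, so they are left at their defaults here. These arguments would be
# passed to setfit.AbsaTrainer together with a datasets.Dataset holding the
# ABSA columns "text", "span", "label" and "ordinal"; that part of the
# original training setup is not included in this card.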

Training Results

Epoch Step Training Loss Validation Loss
0.0001 1 0.0089 -
0.0073 50 0.7206 -
0.0146 100 0.399 -
0.0218 150 0.0596 -
0.0291 200 0.3335 -
0.0364 250 0.1854 -
0.0437 300 0.0708 -
0.0510 350 0.0161 -
0.0583 400 0.3364 -
0.0655 450 0.0949 -
0.0728 500 0.1021 -
0.0801 550 0.3917 -
0.0874 600 0.0707 -
0.0947 650 0.3885 -
0.1020 700 0.046 -
0.1092 750 0.001 -
0.1165 800 0.0024 -
0.1238 850 0.2384 -
0.1311 900 0.0215 -
0.1384 950 0.2283 -
0.1457 1000 0.4564 -
0.1529 1050 0.0017 -
0.1602 1100 0.0612 -
0.1675 1150 0.2325 -
0.1748 1200 0.0568 -
0.1821 1250 0.0096 -
0.1894 1300 0.2803 -
0.1966 1350 0.0056 -
0.2039 1400 0.0107 -
0.2112 1450 0.0042 -
0.2185 1500 0.0636 -
0.2258 1550 0.0356 -
0.2331 1600 0.2264 -
0.2403 1650 0.2335 -
0.2476 1700 0.201 -
0.2549 1750 0.0386 -
0.2622 1800 0.0032 -
0.2695 1850 0.0023 -
0.2768 1900 0.0053 -
0.2840 1950 0.0228 -
0.2913 2000 0.0006 -
0.2986 2050 0.0003 -
0.3059 2100 0.0142 -
0.3132 2150 0.099 -
0.3205 2200 0.0144 -
0.3277 2250 0.0002 -
0.3350 2300 0.0042 -
0.3423 2350 0.0359 -
0.3496 2400 0.0004 -
0.3569 2450 0.0057 -
0.3642 2500 0.0046 -
0.3714 2550 0.0015 -
0.3787 2600 0.0023 -
0.3860 2650 0.0004 -
0.3933 2700 0.0002 -
0.4006 2750 0.0002 -
0.4079 2800 0.0267 -
0.4151 2850 0.0001 -
0.4224 2900 0.0003 -
0.4297 2950 0.0037 -
0.4370 3000 0.0005 -
0.4443 3050 0.0049 -
0.4516 3100 0.2431 -
0.4588 3150 0.2577 -
0.4661 3200 0.1556 -
0.4734 3250 0.1983 -
0.4807 3300 0.0884 -
0.4880 3350 0.0003 -
0.4953 3400 0.2302 -
0.5025 3450 0.0007 -
0.5098 3500 0.0002 -
0.5171 3550 0.0001 -
0.5244 3600 0.0845 -
0.5317 3650 0.0003 -
0.5390 3700 0.0001 -
0.5462 3750 0.0001 -
0.5535 3800 0.0 -
0.5608 3850 0.0001 -
0.5681 3900 0.001 -
0.5754 3950 0.0008 -
0.5827 4000 0.002 -
0.5899 4050 0.0002 -
0.5972 4100 0.1071 -
0.6045 4150 0.0001 -
0.6118 4200 0.0001 -
0.6191 4250 0.0001 -
0.6264 4300 0.0002 -
0.6336 4350 0.0001 -
0.6409 4400 0.0 -
0.6482 4450 0.2478 -
0.6555 4500 0.0 -
0.6628 4550 0.0003 -
0.6701 4600 0.0 -
0.6773 4650 0.0002 -
0.6846 4700 0.003 -
0.6919 4750 0.0007 -
0.6992 4800 0.0006 -
0.7065 4850 0.001 -
0.7138 4900 0.0106 -
0.7210 4950 0.0001 -
0.7283 5000 0.0002 -
0.7356 5050 0.0004 -
0.7429 5100 0.0008 -
0.7502 5150 0.0508 -
0.7575 5200 0.001 -
0.7647 5250 0.0 -
0.7720 5300 0.0249 -
0.7793 5350 0.0001 -
0.7866 5400 0.1026 -
0.7939 5450 0.0 -
0.8012 5500 0.0001 -
0.8084 5550 0.0028 -
0.8157 5600 0.0008 -
0.8230 5650 0.0002 -
0.8303 5700 0.0001 -
0.8376 5750 0.0 -
0.8449 5800 0.0001 -
0.8521 5850 0.0001 -
0.8594 5900 0.0094 -
0.8667 5950 0.0001 -
0.8740 6000 0.0 -
0.8813 6050 0.0 -
0.8886 6100 0.0 -
0.8958 6150 0.0001 -
0.9031 6200 0.0002 -
0.9104 6250 0.0026 -
0.9177 6300 0.1005 -
0.9250 6350 0.0002 -
0.9323 6400 0.0004 -
0.9395 6450 0.2456 -
0.9468 6500 0.0228 -
0.9541 6550 0.022 -
0.9614 6600 0.025 -
0.9687 6650 0.0002 -
0.9760 6700 0.0003 -
0.9832 6750 0.0001 -
0.9905 6800 0.0 -
0.9978 6850 0.1145 -
1.0 6865 - 0.1868

Framework Versions

  • Python: 3.10.13
  • SetFit: 1.0.3
  • Sentence Transformers: 3.0.1
  • spaCy: 3.7.5
  • Transformers: 4.36.2
  • PyTorch: 2.1.2
  • Datasets: 2.19.2
  • Tokenizers: 0.15.2
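To approximate the training environment, the versions above can be pinned directly (a command assembled from this list, not copied from the original setup):

pip install setfit==1.0.3 sentence-transformers==3.0.1 spacy==3.7.5 transformers==4.36.2 torch==2.1.2 datasets==2.19.2 tokenizers==0.15.2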

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}