BatterySciBERT-uncased for Battery Abstract Multi-label Classification

This model is a fine-tuned version of BatterySciBERT-uncased for multi-label classification, trained on a few-sample dataset of 1,140 paper abstracts.
The model is uncased.

Hyperparameters

batch_size = 4
n_epochs = 16
base_LM_model = "batteryscibert-uncased"
learning_rate = 3e-5
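
A minimal training sketch with these hyperparameters, for orientation only: the base checkpoint id, the toy dataset class, and the example labels are assumptions, since the actual training data and script are not published with this card.

import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "batterydata/batteryscibert-uncased"  # assumed Hub id of the base LM
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model,
    num_labels=5,                               # the five labels in the table below
    problem_type="multi_label_classification",  # sigmoid outputs + BCE loss
)

class AbstractDataset(torch.utils.data.Dataset):
    """Hypothetical stand-in for the unpublished abstract dataset."""
    def __init__(self, texts, labels):
        self.enc = tokenizer(texts, truncation=True, padding=True)
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[i], dtype=torch.float)
        return item

train_ds = AbstractDataset(["An example battery abstract ..."], [[0, 0, 0, 1, 1]])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="batteryscibert-uncased-abstract-mtc",
        per_device_train_batch_size=4,  # batch_size = 4
        num_train_epochs=16,            # n_epochs = 16
        learning_rate=3e-5,             # learning_rate = 3e-5
    ),
    train_dataset=train_ds,
)
trainer.train()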

Performance

"Validation Micro F1-score": 94.54,
"Test Micro F1-score": 93.42,

Details on the test set

Predicted Label              | Precision | Recall  | F1 score
---------------------------- | --------- | ------- | --------
Coating                      | 95.83%    | 76.67%  | 85.19%
Computation                  | 86.96%    | 90.90%  | 88.89%
Doping                       | 96.30%    | 100.00% | 98.11%
Experiment                   | 98.02%    | 93.40%  | 95.65%
Sodium layered oxide cathode | 93.75%    | 91.84%  | 92.78%
Micro average (aggregate)    | 95.51%    | 91.42%  | 93.42%
Macro average (aggregate)    | 94.17%    | 90.56%  | 92.12%
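
For reference, the aggregate rows follow the standard definitions: micro-averaging pools true/false positives and negatives across all labels before computing precision, recall, and F1, while macro-averaging takes the unweighted mean of the per-label scores. A minimal sketch with scikit-learn (the arrays are hypothetical multi-hot labels, not the actual test set):

import numpy as np
from sklearn.metrics import f1_score

# Hypothetical multi-hot ground truth and predictions (5 labels per sample)
y_true = np.array([[1, 0, 0, 1, 0],
                   [0, 1, 0, 1, 1]])
y_pred = np.array([[1, 0, 0, 1, 0],
                   [0, 1, 1, 1, 1]])

print(f1_score(y_true, y_pred, average="micro"))  # pooled over all labels
print(f1_score(y_true, y_pred, average="macro"))  # mean of per-label F1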

Use in Transformers

from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

model_name = "NoWayBack/batteryscibert-uncased-abstract-mtc"

# Get predictions; top_k=5 returns scores for all five labels
nlp = pipeline('text-classification', model=model_name, tokenizer=model_name, top_k=5)
input_string = "Sodium-ion batteries are among the most promising alternatives to lithium-based " \
               "technologies for grid and other energy storage applications due to their cost benefits " \
               "and sustainable resource supply. For the cathode—the component that largely determines the " \
               "energy density of a sodium-ion battery cell—one major category of materials is P2-type layered oxides."
res = nlp(input_string)

# Load model & tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
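
The pipeline call above already returns per-label scores. With the model and tokenizer loaded directly, an equivalent manual forward pass looks like the sketch below, assuming the checkpoint was saved with problem_type="multi_label_classification" (so scores come from a sigmoid rather than a softmax); the 0.5 decision threshold is an assumption for illustration, not a documented choice of this card.

import torch

inputs = tokenizer(input_string, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.sigmoid(logits)[0]       # multi-label head: sigmoid, not softmax
predicted = [model.config.id2label[i]  # assumed 0.5 threshold
             for i, p in enumerate(probs) if p > 0.5]
print(predicted)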