RoBERTo
Collection
BERT-based detection models: the RoBERTo series, the first models and first text classifier models from Phase Technologies.
RoBERTo-Physics-v1-Finetuned is a text classification model fine-tuned on physics-related corpora. Built on the RoBERTa architecture, it classifies physics-related text into predefined categories with high accuracy and efficiency.
| Metric | Score |
|---|---|
| Accuracy | 85% |
| Precision | 0.82 |
| Recall | 0.88 |
| F1 Score | 0.85 |
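As a quick sanity check, the reported F1 score is consistent with the listed precision and recall, since F1 is their harmonic mean:

```python
# F1 is the harmonic mean of precision and recall
precision = 0.82
recall = 0.88

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # → 0.85, matching the reported score
```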
To use this model, install the required dependencies:
```bash
pip install transformers torch
```
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "PhaseTechnologies/RoBERT-physics-v1-finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
```
For a demo, visit PhaseTechnologies/RoBERT-physics-v1-finetuned.
```bash
!pip install transformers torch
!pip install datasets
```
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

model_name = "PhaseTechnologies/RoBERTo-physics-v1-finetuned"

# Load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

def predict(text):
    inputs = tokenizer(text, return_tensors="pt")  # Convert text to model inputs
    with torch.no_grad():  # Inference only; no gradients needed
        outputs = model(**inputs)  # Forward pass
    return outputs.logits  # Raw, unnormalized class scores

# Example physics-related input
sample_text = "Newton's second law states that force equals mass times acceleration."
logits = predict(sample_text)
print(logits)
```
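The logits returned above are unnormalized scores. To turn them into class probabilities and a predicted label, apply a softmax followed by an argmax. A minimal sketch using a hypothetical logits tensor (the actual number of labels and their names depend on the model's config):

```python
import torch

# Hypothetical logits as returned by predict(); shape (batch, num_labels)
logits = torch.tensor([[2.3, -1.1, 0.4]])

probs = torch.softmax(logits, dim=-1)  # Normalize scores to probabilities
pred_id = int(probs.argmax(dim=-1))    # Index of the highest-scoring class

print(probs)    # Probabilities summing to 1
print(pred_id)  # Predicted class index; map to a name via model.config.id2label
```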
```python
from transformers import pipeline

# Load the model as a text-classification pipeline
classifier = pipeline("text-classification", model="PhaseTechnologies/RoBERTo-physics-v1-finetuned")

# Perform inference
text = "Newton's second law states that force equals mass times acceleration."
result = classifier(text)
print(result)
```
This model is the final text classification release from Phase Technologies! 🎉 Thank you to all contributors and researchers who made this possible.
For more details, visit Phase Technologies on Hugging Face!
Base model
PhaseTechnologies/RoBERTo