## Model summary

google/flan-t5-large fine-tuned on the Alpaca instruction dataset with LoRA.
## Training
- torch==2.0.0+cu117
- transformers==4.28.0.dev0
- 8 x V100 32G
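
The training script itself is not included in this card; the sketch below shows how a LoRA adapter is typically attached to flan-T5-large with peft before fine-tuning on Alpaca. The hyperparameters (`r`, `lora_alpha`, `lora_dropout`, `target_modules`) are illustrative assumptions, not values reported for this checkpoint.

```python
from transformers import AutoModelForSeq2SeqLM
from peft import LoraConfig, TaskType, get_peft_model

# Base model to be adapted
base_model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-large")

# LoRA configuration; the values below are illustrative, the card does not
# state the hyperparameters actually used for this checkpoint.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["q", "v"],  # T5 attention query/value projections
)

# Wrap the base model so only the LoRA parameters are trained
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```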
## How to use
```python
import transformers
from peft import PeftModel

# Load the tokenizer and the base model
tokenizer = transformers.AutoTokenizer.from_pretrained("google/flan-t5-large")
base_model = transformers.AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-large")

# Attach the LoRA adapter weights on top of the base model
peft_model = PeftModel.from_pretrained(base_model, "zirui3/flan-t5-large-alpaca")

inputs = tokenizer("Any instruction that you like.", return_tensors="pt")
outputs = peft_model.generate(**inputs, max_length=128, do_sample=True)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```
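
For repeated inference, the adapter can optionally be folded into the base weights; a minimal sketch, assuming a peft version that provides `merge_and_unload`:

```python
# Optional: merge the LoRA weights into the base model so no adapter
# indirection is needed at generation time.
merged_model = peft_model.merge_and_unload()
```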