
Nayana_base_combined_v1

Usage with transformers (the repository contains custom code, so trust_remote_code=True is required):

from transformers import AutoModel, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained('v1v1d/Nayana_base_combined', trust_remote_code=True)
model = AutoModel.from_pretrained(
    'v1v1d/Nayana_base_combined',
    trust_remote_code=True,
    low_cpu_mem_usage=True,
    device_map='cuda',
    use_safetensors=True,
    pad_token_id=tokenizer.eos_token_id,
    torch_dtype=torch.float16,
)
model = model.eval()

# Run OCR on an image; stream_flag=True streams tokens during generation.
image_file = 'hindi.png'
res = model.chat(tokenizer, image_file, ocr_type='ocr', render=True, stream_flag=True)

print(res)
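
If the repository's custom chat() follows the GOT-OCR2.0-style interface that the snippet above suggests, the same model and tokenizer can also produce formatted output rendered to an HTML file. This is a minimal sketch: ocr_type='format' and save_render_file are assumptions taken from that interface and are not confirmed by this card.

# Sketch reusing `model` and `tokenizer` from the snippet above.
# Assumption: chat() accepts ocr_type='format' and save_render_file,
# as in the GOT-OCR2.0 interface this repo appears to follow.
res = model.chat(
    tokenizer,
    'hindi.png',
    ocr_type='format',
    render=True,
    save_render_file='./ocr_result.html',
)
print(res)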
Model size: 561M params · Tensor type: FP16 · Format: Safetensors