# MobileSAM model

This repository contains the weights of MobileSAM, a lightweight version of the Segment Anything Model (SAM).
## Installation

First, install the MobileSAM package from the `add_mixin` branch:
```bash
git clone -b add_mixin https://github.com/NielsRogge/MobileSAM.git
cd MobileSAM
```
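Cloning alone does not install the package. A minimal sketch of installing it in editable mode, assuming the fork keeps upstream MobileSAM's standard `setup.py`:

```bash
# install the cloned package so that `mobile_sam` becomes importable
pip install -e .
```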
## Usage

The model can then be used as follows:
```python
import torch
from mobile_sam import MobileSAM, SamPredictor

# load the model weights from the Hugging Face Hub
model = MobileSAM.from_pretrained("nielsr/mobilesam")

# move the model to GPU if available
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device=device)

# perform inference
predictor = SamPredictor(model)
predictor.set_image(<your_image>)
masks, _, _ = predictor.predict(<input_prompts>)
```
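For reference, here is a minimal sketch of how the placeholders could be filled in, assuming the fork keeps the upstream SAM predictor API (`set_image` expects an RGB NumPy array, and `predict` accepts `point_coords` and `point_labels`); the image path and point coordinates are only illustrative:

```python
import numpy as np
from PIL import Image

# load an example image as an RGB NumPy array (hypothetical path)
image = np.array(Image.open("example.jpg").convert("RGB"))
predictor.set_image(image)

# a single foreground point prompt at pixel (x=500, y=375)
input_point = np.array([[500, 375]])
input_label = np.array([1])

masks, scores, logits = predictor.predict(
    point_coords=input_point,
    point_labels=input_label,
    multimask_output=True,  # return several candidate masks ranked by score
)
```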