For more details, please refer to the original GitHub repo: https://github.com/FlagOpen/FlagEmbedding

BGE-M3 (paper, code)

This is the original model exported to ONNX and quantized, with O2 graph optimization applied after quantization.
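
The card does not state which tooling was used to produce the artifact. As a minimal sketch, assuming the ONNX export, dynamic quantization, and O2 graph optimization were done with Hugging Face Optimum (the quantization settings, output directories, and file names below are assumptions), the pipeline could look like this:

```python
# Hypothetical reproduction sketch using Hugging Face Optimum (ONNX Runtime backend).
# The exact export, quantization settings, and file names used for this repo are assumptions.
from optimum.onnxruntime import (
    ORTModelForFeatureExtraction,
    ORTOptimizer,
    ORTQuantizer,
)
from optimum.onnxruntime.configuration import AutoQuantizationConfig, OptimizationConfig

# 1. Export the base model to ONNX
model = ORTModelForFeatureExtraction.from_pretrained("BAAI/bge-m3", export=True)
model.save_pretrained("bge-m3-onnx")

# 2. Dynamic quantization (assumed settings)
quantizer = ORTQuantizer.from_pretrained(model)
qconfig = AutoQuantizationConfig.avx512_vnni(is_static=False, per_channel=False)
quantizer.quantize(save_dir="bge-m3-onnx-quant", quantization_config=qconfig)

# 3. O2 graph optimization applied after quantization, as described above
quantized = ORTModelForFeatureExtraction.from_pretrained(
    "bge-m3-onnx-quant", file_name="model_quantized.onnx"
)
optimizer = ORTOptimizer.from_pretrained(quantized)
optimizer.optimize(
    save_dir="bge-m3-onnx-quant-opt",
    optimization_config=OptimizationConfig(optimization_level=2),  # O2
)
```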

Base model: BAAI/bge-m3. This repository (juampahc/bge-m3-baai-quant-opt) is a quantized and optimized version of that base model.
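
A hedged usage sketch with Optimum's ONNX Runtime integration (the ONNX file name inside this repo is an assumption; if the default graph is not found, pass `file_name="<name>.onnx"` explicitly):

```python
import torch
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForFeatureExtraction

model_id = "juampahc/bge-m3-baai-quant-opt"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# If loading fails to locate the ONNX graph, pass file_name="<name>.onnx" explicitly.
model = ORTModelForFeatureExtraction.from_pretrained(model_id)

sentences = ["BGE-M3 produces multilingual dense embeddings."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
outputs = model(**inputs)

# CLS pooling followed by L2 normalization, the usual recipe for BGE dense retrieval
dense = torch.nn.functional.normalize(outputs.last_hidden_state[:, 0], p=2, dim=-1)
print(dense.shape)  # (batch_size, hidden_size)
```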