# Model Card for qa-expert-7B-V1.0-GGUF
This repo contains the GGUF format model files for khaimaitien/qa-expert-7B-V1.0.
More information on how to use and train the model is available in the GitHub repo: https://github.com/khaimt/qa_expert
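If you only need a single quantized file, the sketch below shows one way to fetch it with `huggingface_hub`. It assumes this repo's id is `khaimaitien/qa-expert-7B-V1.0-GGUF` and uses the q4_0 file name referenced in the example further down.

```python
# Minimal sketch: download one GGUF file from the Hub.
# Assumptions: the repo id is "khaimaitien/qa-expert-7B-V1.0-GGUF" and the
# file name matches the one used in the inference example below.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="khaimaitien/qa-expert-7B-V1.0-GGUF",
    filename="qa-expert-7B-V1.0.q4_0.gguf",
)
print(local_path)  # local path to the downloaded GGUF file
```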
## Model Sources
- Repository: https://github.com/khaimt/qa_expert
## How to Get Started with the Model
First, clone the repo: https://github.com/khaimt/qa_expert

Then install the requirements:

```shell
pip install -r requirements.txt
```

Then install `llama-cpp-python` (for example with `pip install llama-cpp-python`), which is required for running GGUF models.

Here is example code:
```python
from qa_expert import get_inference_model, InferenceType

def retrieve(query: str) -> str:
    # You need to implement this retrieval function: the input is a query and
    # the output is a string containing the retrieved context.
    # It plays the same role as a function to call in OpenAI function calling.
    context = ...  # e.g. look up relevant passages in your knowledge base
    return context

model_inference = get_inference_model(InferenceType.llama_cpp, "qa-expert-7B-V1.0.q4_0.gguf")
question = "your question here"
answer, messages = model_inference.generate_answer(question, retrieve)
```
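As a concrete usage example, a toy `retrieve` implementation over a hard-coded list of passages might look like the sketch below. The passages, question, and keyword-overlap scoring are made-up placeholders for illustration only; in practice you would query a search index or vector store. The `get_inference_model` and `generate_answer` calls are the same ones shown above.

```python
from qa_expert import get_inference_model, InferenceType

# Toy knowledge base; replace with your own retrieval backend.
DOCUMENTS = [
    "Paris is the capital of France.",
    "The Eiffel Tower is located in Paris and was completed in 1889.",
    "The Louvre is the world's largest art museum.",
]

def retrieve(query: str) -> str:
    # Naive keyword-overlap retrieval, just to illustrate the expected
    # interface: take a query string, return a context string.
    def score(doc: str) -> int:
        return len(set(query.lower().split()) & set(doc.lower().split()))
    return max(DOCUMENTS, key=score)

model_inference = get_inference_model(InferenceType.llama_cpp, "qa-expert-7B-V1.0.q4_0.gguf")
answer, messages = model_inference.generate_answer("Where is the Eiffel Tower?", retrieve)
print(answer)
```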