Built with Axolotl

An instruction-tuned fine-tune of migtissera/Tess-34B-v1.4.

It works well with long system prompts.

It is not a general-purpose model: it is intended for reasoning and text comprehension rather than, for example, storytelling.

This model was trained on a private dataset. The high GSM8K score is NOT due to the MetaMath dataset.

Prompt Format:

SYSTEM: <ANY SYSTEM CONTEXT>
USER: 
ASSISTANT:
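
The format above can be assembled with a small helper. This is a minimal sketch, not code from the model card; the function name and newline placement between turns are assumptions for illustration:

```python
def build_prompt(user_message: str, system_context: str = "") -> str:
    """Assemble a single-turn prompt in the SYSTEM/USER/ASSISTANT format.

    The model is expected to continue generating after "ASSISTANT:".
    """
    return (
        f"SYSTEM: {system_context}\n"
        f"USER: {user_message}\n"
        f"ASSISTANT:"
    )

# Example: a long system prompt is fine, per the notes above.
prompt = build_prompt(
    "Summarize the key claim of the passage in one sentence.",
    system_context="You are a careful reading-comprehension assistant.",
)
print(prompt)
```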

Quants:

TheBloke/Pallas-0.5-GGUF

TheBloke/Pallas-0.5-AWQ

TheBloke/Pallas-0.5-GPTQ

LoneStriker/Pallas-0.5-3.0bpw-h6-exl2

LoneStriker/Pallas-0.5-4.0bpw-h6-exl2

LoneStriker/Pallas-0.5-4.65bpw-h6-exl2

LoneStriker/Pallas-0.5-5.0bpw-h6-exl2

LoneStriker/Pallas-0.5-6.0bpw-h6-exl2

LoneStriker/Pallas-0.5-8.0bpw-h8-exl2

Model size: 34.4B params (safetensors, BF16)
