Built with Axolotl

An instruct-based fine-tune of migtissera/Tess-34B-v1.4.

It works well with long system prompts.

It isn't a general-purpose model: it is not intended for storytelling, for example, but for reasoning and text comprehension.

This model is trained on a private dataset. The high GSM8K score is NOT because of the MetaMath dataset.

Prompt Format:

```
SYSTEM: <ANY SYSTEM CONTEXT>
USER:
ASSISTANT:
```
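As a sketch, the prompt format above can be assembled programmatically before being passed to your inference stack. The helper name below is illustrative, not part of the model's tooling:

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a prompt in the SYSTEM/USER/ASSISTANT format shown above.

    The trailing "ASSISTANT:" marks where the model's completion begins.
    """
    return f"SYSTEM: {system}\nUSER: {user}\nASSISTANT:"


prompt = build_prompt(
    "You are a careful reasoning assistant.",
    "Summarize the argument in two sentences.",
)
print(prompt)
```

The resulting string can then be fed to any completion endpoint or local runtime that serves the model.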
Model size: 34.4B params (FP16, safetensors)