Everything-COT-8B-r128-LoRA

This is a LoRA adapter extracted from a language model using mergekit.

LoRA Details

This LoRA adapter was extracted from FPHam/L3-8B-Everything-COT and uses NousResearch/Meta-Llama-3.1-8B-Instruct as a base.

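A minimal usage sketch (not part of the original card), assuming the adapter is applied on top of the listed base model with transformers and peft; the repo IDs are the ones given above:

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/Meta-Llama-3.1-8B-Instruct"
adapter_id = "DreadPoor/Everything-COT-8B-r128-LoRA"  # this repository

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)
# Attach the extracted rank-128 LoRA weights to the base model
model = PeftModel.from_pretrained(model, adapter_id)
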
Parameters

The following command was used to extract this LoRA adapter:

mergekit-extract-lora FPHam/L3-8B-Everything-COT NousResearch/Meta-Llama-3.1-8B-Instruct OUTPUT_PATH --no-lazy-unpickle --skip-undecomposable --rank=128 --verbose
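
If desired, the adapter can also be folded back into the base weights with peft to produce a standalone model. A rough sketch, assuming OUTPUT_PATH is the directory produced by the command above and "merged-model" is a hypothetical output directory:

from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("NousResearch/Meta-Llama-3.1-8B-Instruct")
# Load the extracted adapter and merge its weights into the base model
merged = PeftModel.from_pretrained(base, "OUTPUT_PATH").merge_and_unload()
merged.save_pretrained("merged-model")  # hypothetical output directory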
Model tree for DreadPoor/Everything-COT-8B-r128-LoRA: this adapter is used in 7 merges and is included in a collection.