---
base_model:
- PJMixers/LLaMa-3-CursedStock-v1.8-8B
- bluuwhale/L3-SthenoMaidBlackroot-8B-V1
- tannedbum/L3-Nymeria-8B
- v000000/L3-8B-Poppy-Sunspice
- Nitral-AI/Hathor_RP-v.01-L3-8B
library_name: transformers
tags:
- mergekit
- merge
- llama
---
### test-test-test-test request
```yaml
num_battles: 40982
num_wins: 23536
celo_rating: 1214.08
safety_score: 0.96
propriety_score: 0.7065322049358942
propriety_total_count: 19733.0
```
# Quants:
* [Q8_0 imatrix, FP16 GGUF](https://huggingface.co./v000000/L3-8B-UGI-DontPlanToEnd-test-GGUF)
# Merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method using [tannedbum/L3-Nymeria-8B](https://huggingface.co./tannedbum/L3-Nymeria-8B) as a base.
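As a rough intuition for what DARE-TIES does, here is a toy NumPy sketch (not mergekit's implementation) on flat parameter arrays: DARE randomly drops each element of every task vector (finetuned minus base) with probability `1 - density` and rescales the survivors by `1 / density`, then TIES elects a majority sign per parameter and combines only the contributions that agree with it. The function name and the tie-handling details are illustrative assumptions.

```python
import numpy as np

def dare_ties_merge(base, deltas, weights, densities, seed=0):
    """Toy DARE-TIES sketch on flat numpy arrays.

    deltas[i] = finetuned_model_i - base (the task vectors).
    DARE: drop each delta element with prob (1 - density), rescale
    survivors by 1/density so the expected delta is preserved.
    TIES: elect a majority sign per parameter from the weighted sum,
    then merge only the contributions agreeing with that sign.
    """
    rng = np.random.default_rng(seed)
    sparsified = []
    for delta, density in zip(deltas, densities):
        keep = rng.random(delta.shape) < density            # keep w.p. density
        sparsified.append(np.where(keep, delta / density, 0.0))
    # Sign election: sign of the weighted sum of sparsified deltas.
    elected = np.sign(sum(w * d for w, d in zip(weights, sparsified)))
    merged = np.zeros_like(base, dtype=float)
    norm = np.zeros_like(base, dtype=float)
    for w, d in zip(weights, sparsified):
        agree = np.sign(d) == elected                       # sign-consistent entries only
        merged += np.where(agree, w * d, 0.0)
        norm += np.where(agree, w, 0.0)
    return base + merged / np.maximum(norm, 1e-12)
```

With a single model, `weight=1.0`, and `density=1.0`, nothing is dropped and the function simply returns `base + delta`; lower densities sparsify each task vector before the sign vote, which is what the per-model `density` values in the config below control.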
### Models Merged
The following models were included in the merge:
* [PJMixers/LLaMa-3-CursedStock-v1.8-8B](https://huggingface.co./PJMixers/LLaMa-3-CursedStock-v1.8-8B)
* [bluuwhale/L3-SthenoMaidBlackroot-8B-V1](https://huggingface.co./bluuwhale/L3-SthenoMaidBlackroot-8B-V1)
* [v000000/L3-8B-Poppy-Sunspice](https://huggingface.co./v000000/L3-8B-Poppy-Sunspice)
* [Nitral-AI/Hathor_RP-v.01-L3-8B](https://huggingface.co./Nitral-AI/Hathor_RP-v.01-L3-8B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: PJMixers/LLaMa-3-CursedStock-v1.8-8B
parameters:
weight: 0.15
density: 0.57
- model: v000000/L3-8B-Poppy-Sunspice
parameters:
weight: 0.20
density: 0.69
- model: Nitral-AI/Hathor_RP-v.01-L3-8B
parameters:
weight: 0.30
density: 0.8
- model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1
parameters:
weight: 0.35
density: 0.85
merge_method: dare_ties
base_model: tannedbum/L3-Nymeria-8B
parameters:
int8_mask: true
dtype: bfloat16
```
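As a quick sanity check on the config above: the four per-model weights sum to 1.0, so the sparsified task vectors enter as a convex combination on top of the Nymeria base. A minimal sketch with the values copied from the YAML (the weight-averaged density is just an illustrative summary statistic, not something mergekit computes):

```python
# (model, weight, density) triples copied from the mergekit config above
merge_plan = [
    ("PJMixers/LLaMa-3-CursedStock-v1.8-8B",   0.15, 0.57),
    ("v000000/L3-8B-Poppy-Sunspice",           0.20, 0.69),
    ("Nitral-AI/Hathor_RP-v.01-L3-8B",         0.30, 0.80),
    ("bluuwhale/L3-SthenoMaidBlackroot-8B-V1", 0.35, 0.85),
]

total_weight = sum(weight for _, weight, _ in merge_plan)   # ≈ 1.0
# density is the fraction of each task vector that DARE keeps;
# weighting by merge weight gives a rough view of overall sparsity:
avg_density = sum(w * d for _, w, d in merge_plan) / total_weight
print(total_weight, avg_density)
```

Note that the heavier weights go to the denser models (SthenoMaidBlackroot at 0.35/0.85, Hathor at 0.30/0.8), so their task vectors dominate both the sign election and the merged delta.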