merge
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the TIES merge method, with MrRobotoAI/10 as the base model.
Configuration
The following YAML configuration was used to produce the intermediate model MrRobotoAI/Triangle104-Hermes3-L3.1-BigTalker-8B:
models:
  - model: NousResearch/Hermes-3-Llama-3.1-8B
    parameters:
      density: 1
      weight: 1
  - model: DavidAU/L3.1-RP-Hero-BigTalker-8B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: NousResearch/Hermes-3-Llama-3.1-8B
parameters:
  normalize: true
  int8_mask: true
dtype: float16
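For reference, the sketch below shows how a config like the one above can be run through mergekit's Python API. It assumes mergekit is installed (pip install mergekit) and that the config has been saved to a local file; the file name and output path are placeholders, not part of this model card.

import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the merge recipe from a local YAML file (placeholder path).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the resulting weights to the output directory.
run_merge(
    merge_config,
    out_path="./merged-model",   # placeholder output directory
    options=MergeOptions(
        cuda=False,              # set True to merge on GPU
        copy_tokenizer=True,     # copy the base model's tokenizer to the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)

The same recipe can also be run from the command line with mergekit's mergekit-yaml entry point, passing the config path and an output directory.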
The following YAML configuration was used to produce the intermediate model MrRobotoAI/Triangle104-Hermes3-L3.1-DirtyHarry-8B:
models:
  - model: NousResearch/Hermes-3-Llama-3.1-8B
    parameters:
      density: 1
      weight: 1
  - model: DavidAU/L3.1-RP-Hero-Dirty_Harry-8B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: NousResearch/Hermes-3-Llama-3.1-8B
parameters:
  normalize: true
  int8_mask: true
dtype: float16
The following YAML configuration was used to produce the intermediate model MrRobotoAI/Triangle104-Hermes3-L3.1-DarkPlanetSF-8B:
models:
  - model: NousResearch/Hermes-3-Llama-3.1-8B
    parameters:
      density: 1
      weight: 1
  - model: DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: NousResearch/Hermes-3-Llama-3.1-8B
parameters:
  normalize: true
  int8_mask: true
dtype: float16
Models Merged
The following models were included in the merge:
- DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B
- MrRobotoAI/Triangle104-Hermes3-L3.1-DarkPlanetSF-8B
- DavidAU/L3.1-Instruct-Guru-8B
- MrRobotoAI/Triangle104-Hermes3-L3.1-DirtyHarry-8B
- DavidAU/L3-Dark-Planet-8B-V2-Eight-Orbs-Of-Power
- MrRobotoAI/Triangle104-Hermes3-L3.1-BigTalker-8B
Configuration
The following YAML configuration was used to produce this model, MrRobotoAI/DavidAU-Dark-Planet-of-Davids-8B-64k:
models:
  - model: DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B
    parameters:
      density: 0.25
      weight: 0.7
  - model: DavidAU/L3-Dark-Planet-8B-V2-Eight-Orbs-Of-Power
    parameters:
      density: 0.25
      weight: 0.7
  - model: DavidAU/L3.1-Instruct-Guru-8B
    parameters:
      density: 0.25
      weight: 0.7
  - model: MrRobotoAI/Triangle104-Hermes3-L3.1-BigTalker-8B
    parameters:
      density: 0.25
      weight: 0.7
  - model: MrRobotoAI/Triangle104-Hermes3-L3.1-DirtyHarry-8B
    parameters:
      density: 0.25
      weight: 0.7
  - model: MrRobotoAI/Triangle104-Hermes3-L3.1-DarkPlanetSF-8B
    parameters:
      density: 0.25
      weight: 0.7
merge_method: ties
base_model: MrRobotoAI/10
parameters:
  normalize: true
  int8_mask: true
dtype: float16
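The merged model can be loaded like any other Llama-architecture checkpoint via transformers. A minimal usage sketch follows; the prompt and sampling parameters are illustrative placeholders, not tuned recommendations from this card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "MrRobotoAI/DavidAU-Dark-Planet-of-Davids-8B-64k"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the dtype used for the merge
    device_map="auto",
)

# Illustrative prompt and sampling settings (placeholders).
inputs = tokenizer("Tell me a short story about a dark planet.", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))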