---
base_model: []
library_name: transformers
tags:
  - mergekit
  - merge
---

# LuminumMistral-123B

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the della_linear merge method, with mistralai/Mistral-Large-Instruct-2407 as the base.
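
della_linear combines DELLA's magnitude-aware stochastic pruning of task vectors (each model's delta from the base) with a plain weighted linear sum, skipping full della's sign-election step. The NumPy sketch below illustrates the idea under simplifying assumptions (per-tensor rather than per-row magnitude ranking, toy matrices); it is not mergekit's implementation, and `della_linear_merge` is an illustrative name.

```python
import numpy as np

def della_linear_merge(base, models, weights, densities, epsilon, lam, rng):
    """Sketch of a della_linear-style merge (not mergekit's code).

    Each fine-tuned model contributes a delta relative to the base. Deltas
    are pruned stochastically, with keep probability growing with parameter
    magnitude, rescaled to stay unbiased, then combined as a weighted linear
    sum scaled by lambda.
    """
    merged = base.copy()
    for model, weight, density in zip(models, weights, densities):
        delta = model - base
        mag = np.abs(delta).ravel()
        # Rank parameters by magnitude: 0 = smallest, n-1 = largest.
        ranks = mag.argsort().argsort()
        n = mag.size
        # Keep probabilities spread linearly over
        # [density - epsilon, density + epsilon]; larger-magnitude
        # parameters are more likely to survive (assumed schedule).
        p_keep = (density - epsilon) + 2.0 * epsilon * ranks / max(n - 1, 1)
        mask = rng.random(n) < p_keep
        # Rescale survivors by 1/p so the expected delta is unchanged.
        pruned = np.where(mask, delta.ravel() / p_keep, 0.0).reshape(delta.shape)
        merged += lam * weight * pruned
    return merged

# Toy usage mirroring this card's weights, densities, epsilon, and lambda.
rng = np.random.default_rng(0)
base = rng.normal(size=(4, 4))
tuned_a = base + rng.normal(scale=0.1, size=(4, 4))
tuned_b = base + rng.normal(scale=0.1, size=(4, 4))
merged = della_linear_merge(
    base, [tuned_a, tuned_b],
    weights=[0.19, 0.34], densities=[0.5, 0.8],
    epsilon=0.05, lam=1.0, rng=rng,
)
```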

### Models Merged

The following models were included in the merge:

* NeverSleep/Lumimaid-v0.2-123B
* anthracite-org/magnum-v2-123b

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: anthracite-org/magnum-v2-123b
    parameters:
      weight: 0.19
      density: 0.5
  - model: NeverSleep/Lumimaid-v0.2-123B
    parameters:
      weight: 0.34
      density: 0.8
merge_method: della_linear
base_model: mistralai/Mistral-Large-Instruct-2407
parameters:
  epsilon: 0.05
  lambda: 1
  int8_mask: true
dtype: bfloat16
```
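
In this configuration, `weight` sets each model's coefficient in the linear sum of deltas, `density` the expected fraction of each delta retained after pruning, `epsilon` the half-width of the magnitude-based keep-probability window around that density, and `lambda` a global scale on the summed deltas; `int8_mask: true` stores intermediate masks in 8-bit to save memory. Assuming the standard mergekit CLI, saving this configuration as, e.g., `luminum.yml` (filename arbitrary) and running `mergekit-yaml luminum.yml ./output-model-directory` should reproduce the merge, given enough disk space and memory for three 123B checkpoints.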