---
base_model:
- mergekit-community/mergekit-model_stock-olgorhm
- Cran-May/SCE-2-24B
- mergekit-community/mergekit-model_stock-nrrhivg
- yentinglin/Mistral-Small-24B-Instruct-2501-S1-SFT
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, using [Cran-May/SCE-2-24B](https://huggingface.co./Cran-May/SCE-2-24B) as the base model.

### Models Merged

The following models were included in the merge:

* [mergekit-community/mergekit-model_stock-olgorhm](https://huggingface.co./mergekit-community/mergekit-model_stock-olgorhm)
* [mergekit-community/mergekit-model_stock-nrrhivg](https://huggingface.co./mergekit-community/mergekit-model_stock-nrrhivg)
* [yentinglin/Mistral-Small-24B-Instruct-2501-S1-SFT](https://huggingface.co./yentinglin/Mistral-Small-24B-Instruct-2501-S1-SFT)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Cran-May/SCE-2-24B
  - model: mergekit-community/mergekit-model_stock-olgorhm
    parameters:
      density: 0.9
      weight: 0.5
  - model: mergekit-community/mergekit-model_stock-nrrhivg
    parameters:
      density: 0.9
      weight: 0.5
  - model: yentinglin/Mistral-Small-24B-Instruct-2501-S1-SFT
    parameters:
      density: 0.8
      weight: 0.3
merge_method: model_stock
base_model: Cran-May/SCE-2-24B
parameters:
  int8_mask: true
  normalize: true
dtype: float16
```
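For intuition, the Model Stock paper interpolates between the base model's weights and the average of the fine-tuned models, with an interpolation ratio derived from the angle between the fine-tuned models' task vectors (their deltas from the base). The following is a minimal toy sketch of that idea on flat weight vectors in pure Python; it is not mergekit's implementation, and the function name and flat-list representation are illustrative assumptions.

```python
import math


def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def _norm(a):
    return math.sqrt(_dot(a, a))


def model_stock(base, tuned):
    """Toy Model Stock merge over flat weight vectors.

    base:  list of floats (base model weights)
    tuned: list of k lists of floats (fine-tuned model weights)
    """
    k = len(tuned)
    # Task vectors: each fine-tuned model's delta from the base.
    deltas = [[w - b for w, b in zip(t, base)] for t in tuned]

    # Average pairwise cosine between the task vectors.
    cosines = []
    for i in range(k):
        for j in range(i + 1, k):
            cosines.append(
                _dot(deltas[i], deltas[j]) / (_norm(deltas[i]) * _norm(deltas[j]))
            )
    cos_theta = sum(cosines) / len(cosines)

    # Interpolation ratio from the paper: t = k·cosθ / (1 + (k−1)·cosθ).
    t = k * cos_theta / (1 + (k - 1) * cos_theta)

    # Merge: move from the base toward the average of the fine-tuned models.
    avg = [sum(t_w[d] for t_w in tuned) / k for d in range(len(base))]
    return [t * a + (1 - t) * b for a, b in zip(avg, base)]
```

With orthogonal task vectors (cosθ = 0) the ratio t collapses to 0 and the merge returns the base weights; with identical task vectors (cosθ = 1) it returns the fine-tuned average. In practice, a merge like the one above is run with mergekit's CLI, e.g. `mergekit-yaml config.yaml ./output-model-directory`.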