---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- TURKCELL/Turkcell-LLM-7b-v1
- Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0
---

# berke-tr-slerp-merge-7B

berke-tr-slerp-merge-7B is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [TURKCELL/Turkcell-LLM-7b-v1](https://huggingface.co./TURKCELL/Turkcell-LLM-7b-v1)
* [Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0](https://huggingface.co./Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0)

## Configuration

```yaml
slices:
  - sources:
      - model: TURKCELL/Turkcell-LLM-7b-v1
        layer_range: [0, 32]
      - model: Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0
        layer_range: [0, 32]
merge_method: slerp
base_model: mistralai/Mistral-7B-Instruct-v0.2
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.7, 0.7]
    - value: 0.7
dtype: bfloat16
```
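The `slerp` merge method interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, with the interpolation factor `t` varied per layer group as listed above (e.g. `self_attn` tensors blend from mostly Turkcell at the early layers toward mostly Trendyol at the late ones). The sketch below illustrates the underlying spherical linear interpolation on a plain vector; it is an illustrative implementation, not mergekit's actual code:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    great-circle arc between the (normalized) directions of v0 and v1.
    """
    # Normalize both vectors to measure the angle between them
    n0 = math.sqrt(sum(x * x for x in v0)) + eps
    n1 = math.sqrt(sum(x * x for x in v1)) + eps
    dot = sum((a / n0) * (b / n1) for a, b in zip(v0, v1))
    dot = max(-1.0, min(1.0, dot))

    # Near-parallel vectors: fall back to plain linear interpolation
    if abs(dot) > 0.9995:
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]

    theta = math.acos(dot)
    s = math.sin(theta)
    w0 = math.sin((1 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return [w0 * a + w1 * b for a, b in zip(v0, v1)]
```

Unlike a straight linear merge, slerp preserves the geometric character of the interpolation between the two weight directions, which is why mergekit applies it tensor by tensor with the per-filter `t` schedule from the configuration.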