# Maginum-Cydoms-24B
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Methods

This model was produced in three stages: two intermediate merges using the TIES and DELLA merge methods, followed by a final SLERP merge of the two intermediates.
### Models Merged

The following models were included in the merge:

- anthracite-core/Mistral-Small-3.2-24B-Instruct-2506-Text-Only
- TheDrummer/Magidonia-24B-v4.3
- TheDrummer/Precog-24B-v1
- zerofata/MS3.2-PaintedFantasy-v3-24B
- TheDrummer/Cydonia-24B-v4.3
- ReadyArt/4.2.0-Broken-Tutu-24b
- zerofata/MS3.2-PaintedFantasy-v2-24B
### Configuration

The following YAML configurations were used to produce this model:
**Maginum-Cydoms-S001:**

```yaml
models:
  - model: anthracite-core/Mistral-Small-3.2-24B-Instruct-2506-Text-Only
  - model: TheDrummer/Magidonia-24B-v4.3
    parameters:
      density: 1.0
      weight: 1.0
  - model: TheDrummer/Precog-24B-v1
    parameters:
      density: 0.4
      weight: 0.6
  - model: zerofata/MS3.2-PaintedFantasy-v3-24B
    parameters:
      density: 0.4
      weight: 0.4
merge_method: ties
base_model: anthracite-core/Mistral-Small-3.2-24B-Instruct-2506-Text-Only
parameters:
  normalize: false
  int8_mask: false
dtype: float32
tokenizer:
  source: union
```
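For readers unfamiliar with TIES, the stage above can be sketched in a few lines of numpy. This is a minimal illustration of the general TIES idea (trim each task vector by `density`, elect a majority sign per parameter, then sum the agreeing `weight`-scaled contributions), not mergekit's actual implementation; the function name and array-level simplification are my own.

```python
import numpy as np

def ties_merge(base, tuned, densities, weights, normalize=False):
    """Toy TIES sketch over flat numpy arrays (not mergekit's code).

    1. Form each task vector (fine-tune minus base).
    2. Trim it to its top-`density` fraction of entries by magnitude.
    3. Elect a majority sign per parameter across models.
    4. Sum the weighted entries that agree with the elected sign.
    """
    deltas = []
    for t, d, w in zip(tuned, densities, weights):
        delta = t - base                          # task vector
        k = max(1, int(round(d * delta.size)))    # entries to keep
        thresh = np.sort(np.abs(delta).ravel())[-k]
        trimmed = np.where(np.abs(delta) >= thresh, delta, 0.0)
        deltas.append(w * trimmed)
    stacked = np.stack(deltas)
    sign = np.sign(stacked.sum(axis=0))           # elected sign per parameter
    agree = np.where(np.sign(stacked) == sign, stacked, 0.0)
    merged = agree.sum(axis=0)
    if normalize:                                 # this config uses normalize: false
        counts = (np.sign(stacked) == sign).sum(axis=0).clip(min=1)
        merged = merged / counts
    return base + merged
```

With `density: 1.0, weight: 1.0` (as for Magidonia above), a model's task vector passes through trimming untouched and only sign election can zero its entries, which is why those settings keep that model dominant.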
**Maginum-Cydoms-S002:**

```yaml
models:
  - model: anthracite-core/Mistral-Small-3.2-24B-Instruct-2506-Text-Only
  - model: TheDrummer/Cydonia-24B-v4.3
    parameters:
      density: 1.0
      weight: 1.0
      epsilon: 0.0
  - model: ReadyArt/4.2.0-Broken-Tutu-24b
    parameters:
      density: 0.4
      weight: 0.6
      epsilon: 0.2
  - model: zerofata/MS3.2-PaintedFantasy-v2-24B
    parameters:
      density: 0.4
      weight: 0.4
      epsilon: 0.2
merge_method: della
base_model: anthracite-core/Mistral-Small-3.2-24B-Instruct-2506-Text-Only
parameters:
  normalize: false
  int8_mask: false
dtype: float32
tokenizer:
  source: union
```
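The `epsilon` parameter here is what distinguishes DELLA from TIES: instead of a hard magnitude cutoff, each entry of a task vector is kept stochastically, with its keep probability centred on `density` and spread by up to `epsilon` according to its magnitude rank. The sketch below is an assumption based on the DELLA paper's MAGPRUNE step, not mergekit's exact code; the function name and rank-based spread are my own simplification.

```python
import numpy as np

def magprune(delta, density, epsilon, rng):
    """Toy sketch of DELLA-style magnitude-aware pruning (an assumption,
    not mergekit's implementation).

    Larger-magnitude entries get a keep probability up to epsilon/2 above
    `density`; smaller ones up to epsilon/2 below. Survivors are rescaled
    by 1/p so the pruned vector matches the original in expectation.
    """
    n = delta.size
    ranks = np.argsort(np.argsort(np.abs(delta).ravel()))  # 0 = smallest magnitude
    if n > 1:
        shift = epsilon * (ranks / (n - 1) - 0.5)
    else:
        shift = np.zeros(1)
    keep_p = np.clip(density + shift, 0.0, 1.0).reshape(delta.shape)
    mask = rng.random(delta.shape) < keep_p
    return np.where(mask, delta / np.maximum(keep_p, 1e-12), 0.0)
```

Note how the Cydonia entry above sets `density: 1.0, epsilon: 0.0`: under this reading, every keep probability is exactly 1, so that model's task vector survives pruning intact, while the two `density: 0.4, epsilon: 0.2` models are thinned to roughly 30–50% of their entries depending on magnitude.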
**Maginum-Cydoms-24B (final):**

```yaml
models:
  - model: Maginum-Cydoms-S001
  - model: Maginum-Cydoms-S002
merge_method: slerp
base_model: Maginum-Cydoms-S001
parameters:
  t:
    - filter: self_attn
      value: [0.3, 0.4, 0.6, 0.4, 0.3, 0.4, 0.6, 0.4, 0.3]
    - filter: mlp
      value: [0.7, 0.6, 0.4, 0.6, 0.7, 0.6, 0.4, 0.6, 0.7]
    - value: 0.5
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: union
```
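In the final stage, `t` controls how far each tensor moves from S001 (`t = 0`) toward S002 (`t = 1`), with separate per-layer schedules for attention and MLP weights and `0.5` as the fallback for everything else. Per-tensor spherical interpolation can be sketched as below; this is the general SLERP formula, not mergekit's exact implementation, and the lerp fallback for near-parallel tensors is a common convention I am assuming here.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Interpolates along the arc between the two (normalized) tensors,
    falling back to plain linear interpolation when they are
    near-parallel and the arc is degenerate.
    """
    a = v0.ravel() / (np.linalg.norm(v0) + eps)
    b = v1.ravel() / (np.linalg.norm(v1) + eps)
    theta = np.arccos(np.clip(a @ b, -1.0, 1.0))  # angle between tensors
    if np.sin(theta) < eps:                       # near-parallel: plain lerp
        return (1 - t) * v0 + t * v1
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0.ravel() + s1 * v1.ravel()).reshape(v0.shape)
```

The alternating schedules above thus pull attention weights mostly toward the TIES intermediate (values below 0.5) and MLP weights mostly toward the DELLA intermediate (values above 0.5), varying by depth.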