Model Details
- Base Model: meta-llama/Llama-3.1-8B-Instruct
- SFT Datasets:
  - 128st customized set (combined synthetic/human writing)
  - 1K math set from the continuation of llama345
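As a minimal sketch, assuming the model is used with the transformers and peft libraries: the base model above can be loaded and an adapter attached as shown below. The adapter repo id "your-org/your-adapter" is a placeholder, not a repository published with this card.

```python
# Minimal sketch: load the base model and attach one of the source adapters.
# "your-org/your-adapter" is a placeholder repo id, not part of this card.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-3.1-8B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")

# Wrap the base model with a LoRA adapter checkpoint (placeholder id).
model = PeftModel.from_pretrained(model, "your-org/your-adapter")
```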
Source Adapters
All source adapters share the following configuration (a code sketch follows the list):
- Rank (r): 16
- Alpha: 16
- Target Modules:
  - q_proj (Query projection)
  - k_proj (Key projection)
  - v_proj (Value projection)
  - o_proj (Attention output projection)
  - up_proj (MLP up projection)
  - down_proj (MLP down projection)
  - gate_proj (MLP gate projection)
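A minimal sketch of this shared configuration, assuming the adapters were trained with the PEFT library; task_type and lora_dropout are assumptions, as only the rank, alpha, and target modules are stated above.

```python
# Minimal sketch of the shared LoRA configuration above, using the PEFT library.
# task_type and lora_dropout are assumptions; only r, alpha, and target modules
# come from this card.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,               # Rank (r)
    lora_alpha=16,      # Alpha
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",  # attention projections
        "up_proj", "down_proj", "gate_proj",     # MLP projections
    ],
    lora_dropout=0.0,       # assumption: not specified in the card
    task_type="CAUSAL_LM",  # assumption: causal language modeling
)
```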