Model Training Configuration
This document outlines the configuration parameters used for training the model.
Training Parameters
- Epochs: 2
- Base Model: LLAMA3-8B Instruct
LoRA Configuration
- lora_alpha: 8
- lora_dropout: 0.05
- lora_rank: 8
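As a sketch, the LoRA hyperparameters above can be collected into a plain dict, shaped so they could be unpacked into a config object such as the Hugging Face `peft` library's `LoraConfig` (the `peft` usage and the `r` key name are assumptions; only the values come from this document):

```python
# LoRA hyperparameters from this model card, gathered in one place.
# In peft's LoraConfig the rank parameter is named `r`; using that key here
# is an assumption so the dict could be unpacked as LoraConfig(**lora_params).
lora_params = {
    "r": 8,             # lora_rank
    "lora_alpha": 8,    # scaling numerator; effective scale = alpha / r = 1.0
    "lora_dropout": 0.05,
}

# Effective LoRA scaling factor implied by these values.
lora_scale = lora_params["lora_alpha"] / lora_params["r"]
print(lora_scale)  # → 1.0
```

With alpha equal to the rank, the adapter updates are applied at a scale of 1.0, a common neutral choice.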
General Configuration
- seed: 42
- adam_epsilon: 1e-08
- lr_scheduler_type: "linear"
- logging_steps: 100
- save_steps: 1000
- save_strategy: "steps"
- evaluation_strategy: "steps"
- eval_steps: 100
- max_seq_length: 8192
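The general settings above can likewise be sketched as a dict. Most keys mirror fields of `transformers.TrainingArguments`; `max_seq_length` would instead be handled by the tokenizer or an SFT-style trainer config (the library mapping is an assumption, only the values are from this document):

```python
# General training settings from this model card.
training_params = {
    "seed": 42,                       # reproducibility seed
    "adam_epsilon": 1e-08,            # optimizer numerical-stability term
    "lr_scheduler_type": "linear",    # linear LR decay over training
    "logging_steps": 100,             # log metrics every 100 steps
    "save_steps": 1000,               # checkpoint every 1000 steps
    "save_strategy": "steps",         # save by step count, not by epoch
    "evaluation_strategy": "steps",   # evaluate by step count
    "eval_steps": 100,                # evaluate every 100 steps
}
max_seq_length = 8192                 # tokenizer / trainer-level setting

# Sanity check: with step-based saving and evaluation, checkpoints land on
# evaluation boundaries when save_steps is a multiple of eval_steps.
aligned = training_params["save_steps"] % training_params["eval_steps"] == 0
print(aligned)  # → True
```

Keeping `save_steps` a multiple of `eval_steps` means every saved checkpoint has a matching evaluation record, which simplifies picking the best checkpoint.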