Fine-tune Mixtral 8x7b on AWS SageMaker and Deploy to RunPod
Mlearning.ai
DECEMBER 22, 2023
Overall metrics and parameters:

- Dataset size: 10k examples
- Language: pt-BR
- Domain: Healthcare
- Number of epochs: 1
- Batch size: 1
- Learning rate: 2e-4
- LoRA config: r = 256, alpha = 128
- bf16: False

Training time: 30,500 seconds (≈ 8.47 hours). The ml.g5.24xlarge instance we used costs $18.18 per hour for on-demand training jobs.
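From the figures above, the total training cost can be estimated with a quick back-of-the-envelope calculation (a sketch; the exact bill depends on your region's on-demand pricing and any billing granularity SageMaker applies):

```python
# Estimate training cost from the run above.
TRAIN_SECONDS = 30_500          # training time reported above
HOURLY_RATE_USD = 18.18         # ml.g5.24xlarge on-demand rate reported above

hours = TRAIN_SECONDS / 3600    # convert seconds to hours
cost = hours * HOURLY_RATE_USD  # billed per hour of instance time

print(f"{hours:.2f} h -> ${cost:.2f}")  # ≈ 8.47 h -> ≈ $154.03
```

So a single epoch over the 10k-example dataset comes out to roughly $154 of instance time.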