Replies: 3 comments
-
Which version of the package are you using? Older versions of the package might be falling back to DataParallel when running on a multi-GPU setup, which dramatically slows down fine-tuning. You can either set
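If it helps while checking this, here is a minimal sketch that reads the installed package version and pins the process to a single GPU so that a DataParallel fallback cannot split work across both A100s. The distribution name `chronos-forecasting` and the version-comparison helper are my assumptions; substitute the actual package name you installed.

```python
import os

# Pin to one GPU *before* importing torch / the training code, so a
# DataParallel fallback cannot spread the run across both A100s.
os.environ.setdefault("CUDA_VISIBLE_DEVICES", "0")

from importlib.metadata import version, PackageNotFoundError

def is_at_least(installed: str, required: str) -> bool:
    """Compare dotted version strings numerically, e.g. '1.10.0' >= '1.2.0'."""
    to_tuple = lambda v: tuple(int(p) for p in v.split(".")[:3])
    return to_tuple(installed) >= to_tuple(required)

try:
    v = version("chronos-forecasting")  # distribution name is an assumption
    print(f"chronos-forecasting {v}")
except PackageNotFoundError:
    print("chronos-forecasting is not installed in this environment")
```

Upgrading with `pip install -U <package>` and re-running under a single visible device is usually the quickest way to tell whether the slowdown comes from the DataParallel fallback.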
-
@Claire-bx can you install
-
I'd like to ask: how can I estimate the num_steps needed for fine-tuning based on my data?
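Not official Chronos guidance, but a common back-of-the-envelope estimate is one optimizer step per batch, times the number of passes you want over your training windows. The helper below is a sketch under that assumption; the sample numbers are illustrative only.

```python
import math

def estimate_num_steps(num_series: int, windows_per_series: int,
                       batch_size: int, epochs: int) -> int:
    """Rough estimate: steps = epochs * ceil(total_windows / batch_size).
    Treat `epochs` as a starting point and tune it against validation loss."""
    total_windows = num_series * windows_per_series
    return epochs * math.ceil(total_windows / batch_size)

# Illustrative numbers: 500 series, 20 training windows each,
# batch size 32, 3 passes over the data.
print(estimate_num_steps(500, 20, 32, 3))  # → 939
```

In practice it is safer to set num_steps generously and rely on early stopping or checkpoint selection on a validation split, rather than trusting the arithmetic alone.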
-
Hello Chronos team,
I am fine-tuning Chronos-2 with LoRA and noticed that GPU utilization remains at 0% throughout training on two A100 80GB GPUs. Could you confirm whether this behavior is expected, or whether I am missing something?
Model Loading
Fine-tuning Code
Observed GPU Behavior (nvidia-smi)
GPU utilization stays at 0% during the entire training process
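For anyone debugging the same symptom, here is a stdlib-only sketch for snapshotting what `nvidia-smi` reports during a training run. The query fields are standard `nvidia-smi` options; the helper names are mine.

```python
import subprocess

# One CSV row per GPU: "index, utilization %, memory used (MiB)"
QUERY = ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"]

def parse_smi(csv_text: str) -> list:
    """Turn nvidia-smi CSV rows like '0, 0, 1024' into (index, util, mem) tuples."""
    rows = []
    for line in csv_text.strip().splitlines():
        idx, util, mem = (int(x) for x in line.split(","))
        rows.append((idx, util, mem))
    return rows

def snapshot() -> list:
    """Query the driver once; call this periodically from a monitor process."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    return parse_smi(out)

# The pattern reported in the question: both A100s fully idle.
print(parse_smi("0, 0, 0\n1, 0, 0"))  # → [(0, 0, 0), (1, 0, 0)]
```

The memory column is the useful diagnostic: 0% utilization with gigabytes allocated suggests the model is resident on the GPU but starved (often a data-loading or CPU bottleneck), while 0% utilization with near-zero memory suggests the model never left the CPU in the first place.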