Suggestions for an Out-Of-Memory Error (with 4 GPUs) #1051
KaiserWhoLearns asked this question in Q&A · Unanswered · 0 replies
I am trying to fine-tune T5-small on a custom dataset (in tsv format) with 4 GPUs, but I keep getting an Out-Of-Memory error. I find this very surprising, as my GPUs should have more than enough memory to fine-tune the small model, so there is probably something wrong with the command I used:
Any suggestions on what I can modify to resolve the issue?
(It looks like it runs out of memory on the first GPU, so maybe I used an invalid command for multi-GPU training?)
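Since the question hinges on whether T5-small should fit, a back-of-the-envelope memory estimate may help. This is only a sketch: the ~60M parameter count is T5-small's commonly cited size, plain fp32 Adam is assumed, and activation/batch memory is deliberately excluded, so it is a lower bound on per-GPU usage.

```python
# Rough lower bound on per-GPU training memory for T5-small,
# assuming fp32 weights and a standard Adam optimizer.

def adam_fp32_bytes_per_param() -> int:
    # 4 B weights + 4 B gradients + 8 B Adam moment estimates (m and v)
    return 4 + 4 + 8

def model_state_gib(num_params: int) -> float:
    # Convert total model-state bytes to GiB (activations excluded).
    return num_params * adam_fp32_bytes_per_param() / 2**30

T5_SMALL_PARAMS = 60_000_000  # approximate published size of T5-small

print(f"{model_state_gib(T5_SMALL_PARAMS):.2f} GiB")  # → 0.89 GiB
```

Model states alone come to under 1 GiB, so an OOM on the first GPU usually points at activation memory (batch size or sequence length) or at a launch command that places every data-parallel replica, or the entire batch, on GPU 0 instead of sharding across the four devices. Reducing the per-device batch size or double-checking the multi-GPU launcher invocation is the usual first step.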