Any plans to support tensor parallelism ? #17932
Unanswered
vikigenius
asked this question in DDP / multi-GPU / multi-node
The strategy API seems limited, and I am not sure it is worth using PyTorch Lightning if you want tensor parallelism. Are there any plans to support tensor parallelism?
https://pytorch.org/docs/stable/distributed.tensor.parallel.html
https://huggingface.co/transformers/v4.9.0/parallelism.html#tensor-parallelism

Replies: 1 comment
- Yes, this would be helpful to understand!
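For context, the column-parallel scheme both links describe can be sketched in a single process, with plain Python lists standing in for device shards. This is only an illustration of the math, not how `torch.distributed.tensor.parallel` is implemented; real tensor parallelism shards the weight across GPUs and gathers the partial outputs with a collective.

```python
# Column-parallel linear layer, single-process sketch:
# the weight matrix is split column-wise into shards, each "device"
# computes a partial output, and the partials are concatenated
# (the role an all-gather plays in a real multi-GPU setup).

def matmul(x, w):
    """Multiply row vector x (list) by matrix w (list of rows)."""
    return [sum(x[i] * w[i][j] for i in range(len(x)))
            for j in range(len(w[0]))]

def split_columns(w, shards):
    """Split weight matrix w column-wise into `shards` equal pieces."""
    step = len(w[0]) // shards
    return [[row[s * step:(s + 1) * step] for row in w]
            for s in range(shards)]

# A 2x4 weight matrix and a length-2 input.
w = [[1, 2, 3, 4],
     [5, 6, 7, 8]]
x = [1, 1]

full = matmul(x, w)  # un-sharded reference output

# Each shard computes its slice of the output independently.
partials = [matmul(x, shard) for shard in split_columns(w, 2)]
sharded = partials[0] + partials[1]  # concatenate = gather step

assert sharded == full  # sharded computation matches the reference
```

The point of the scheme is that each shard only ever holds a fraction of the weight matrix, which is what makes layers too large for one device trainable.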