Would it be possible in the future to coordinate with the torchtune project so that we can use A for xyz and B for ikj?
We've been using litgpt a lot and appreciate the nice balance between a minimal modeling implementation and key quality-of-life features for distributed LLM training through the Lightning ecosystem.
Since torchtune, lightning, and litgpt all sit at varying levels of torch-nativeness, and their tools for pretraining and finetuning tend to diverge somewhat, it would be great for users of these toolkits if there were an effort to double down on that distinction, so we get optimal utility in all cases without heavy duplication of functionality across libraries.
I already see some nods to litgpt in the torchtune codebase, but I wonder how intentional or open the channels of communication are here.
@carmocca
🤝
@rohan-varma