I'm not really sure whether that is possible, because we have our own `Tensor` class which wraps around `torch.Tensor`, and similarly all the PyTorch functions are wrapped inside RF.
However, I found out that there is TensorDict, which also wraps around PyTorch tensors, and it is explicitly stated to be compatible with `torch.compile`, so maybe it is possible. TensorDict behaves like a dict but does not inherit from `dict`; it inherits from `collections.abc.MutableMapping`, see here.
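To make the inheritance point concrete, here is a minimal sketch of the pattern TensorDict follows (the class name is hypothetical, and plain Python values stand in for tensors): implementing the five abstract methods of `collections.abc.MutableMapping` gives full dict-like behavior without subclassing `dict`:

```python
from collections.abc import MutableMapping


class WrapperDict(MutableMapping):
    """Hypothetical sketch: behaves like a dict without inheriting from dict,
    mirroring how TensorDict subclasses collections.abc.MutableMapping."""

    def __init__(self, **entries):
        self._data = dict(entries)

    # The five abstract methods; MutableMapping derives get, update,
    # keys, values, items, pop, etc. from these.
    def __getitem__(self, key):
        return self._data[key]

    def __setitem__(self, key, value):
        self._data[key] = value

    def __delitem__(self, key):
        del self._data[key]

    def __iter__(self):
        return iter(self._data)

    def __len__(self):
        return len(self._data)


d = WrapperDict(a=1, b=2)
assert isinstance(d, MutableMapping) and not isinstance(d, dict)
```

The point of not subclassing `dict` is that Dynamo (the tracer behind `torch.compile`) can treat such a class as an ordinary user-defined object rather than special-casing it as a builtin dict.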
So, let us discuss here possibilities and options to support `torch.compile` directly with the RF PyTorch backend.
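As one possible starting point for the discussion, here is a hedged sketch of the simplest option: unwrap to raw `torch.Tensor`s at the boundary and apply `torch.compile` only to the pure-PyTorch inner computation. The `RFTensor` class and all function names here are hypothetical stand-ins, not RF's actual API:

```python
import torch


class RFTensor:
    """Hypothetical stand-in for RF's Tensor wrapper (not the real class)."""

    def __init__(self, raw: torch.Tensor):
        self.raw = raw


def inner(x: torch.Tensor) -> torch.Tensor:
    # Pure torch.Tensor computation: this is what torch.compile can trace.
    return x * 2.0 + 1.0


# The "eager" debug backend skips code generation; it is used here only so
# the sketch runs anywhere Dynamo does, without a C++ compiler toolchain.
compiled_inner = torch.compile(inner, backend="eager")

def outer(t: RFTensor) -> RFTensor:
    # Unwrap at the boundary, run the compiled function, rewrap.
    return RFTensor(compiled_inner(t.raw))


result = outer(RFTensor(torch.ones(3)))
```

This sidesteps the question of whether Dynamo can trace through the wrapper itself, at the cost of a graph break at every wrap/unwrap boundary; tracing through RF's `Tensor` directly (as TensorDict apparently supports) would avoid those breaks.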
Related: `torch.compile` for PyTorch in general (not specific to RF): PyTorch `torch.compile`, scripting, tracing for faster computation (specifically training) #1436