TorchScript compatibility? #13
Hi all,

This library looks very nice :)

Is `TensorType` compatible with the TorchScript compiler? As in, are the annotations transparently converted to `torch.Tensor` as far as `torch.jit.script` is concerned, allowing annotated modules/functions to be compiled? (I'm not worried about whether the type checking is applied in TorchScript, just whether an annotated program that gets shape-checked in Python can be compiled down to TorchScript.)

Thanks!
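(For concreteness, a minimal sketch of the case being asked about; the function `f` here is hypothetical:)

```python
# A minimal sketch of the case being asked about: a function annotated
# with torchtyping's TensorType. The function f is hypothetical.
import torch
from torchtyping import TensorType

def f(x: TensorType["batch", "feature"]) -> torch.Tensor:
    return x.sum()

# The question: does torch.jit.script accept the TensorType annotation,
# treating it as a plain torch.Tensor?
scripted = torch.jit.script(f)
```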
So I've been playing with this for a bit and unfortunately can't get it to work. If you or someone else does manage to get this working, then I'd be happy to accept a PR on it. For posterity: …
Hi @patrick-kidger, thanks for the quick answer! This level of arcane tinkering with TorchScript definitely sounds familiar to me... 😁 The issue you link in the third bullet does make it look like there is nothing that can be done here until PyTorch resolves the underlying incompatibility with Python. (If I'm understanding this right you couldn't even do …)

EDIT: it looks like fixes to this may have been merged? unclear: pytorch/pytorch#29623
Haha! To answer the question: I agree it seems unclear whether or not that issue is fixed. Either way, because of that or some other issue, our end use case doesn't seem to be working at the moment.
Hi! Are there any updates on this, guys?
Not that I know of. As far as I know this is still a limitation in TorchScript itself. If this is a priority for you then you might like to bring it up with the TorchScript team; they might know more about any possibilities for making this work.
I have found a workaround. Let's say you have the following function, which you want to use in TorchScript:

```python
def f(x: TensorType["batch", "feature"]):
    return x.sum()
```

TorchScript does not like generic types in signatures, but we want to keep the dimension annotations somewhere for documentation purposes. We can work around this with a subclass:

```python
import torch
from torchtyping import TensorType

class BatchedFeatureTensor(TensorType["batch", "feature"]):
    pass

@torch.jit.script
def f(x: BatchedFeatureTensor):
    return x.sum()

print(f(torch.tensor([[-1.0, 2.0, 1.2]])))
print(f.code)
# => tensor(2.2000)
# => def f(x: Tensor) -> Tensor:
# =>     return torch.sum(x)
```
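(A possible extension of this trick, not from the original thread: assuming the same subclass annotation is also accepted on `nn.Module` methods, a module could be annotated and scripted the same way. A sketch, with the hypothetical module name `SumFeatures`:)

```python
# A hedged sketch, not verified in the thread: applying the same
# subclass trick to a module's forward method. SumFeatures is a
# hypothetical name introduced for this example.
import torch
from torchtyping import TensorType

class BatchedFeatureTensor(TensorType["batch", "feature"]):
    pass

class SumFeatures(torch.nn.Module):
    def forward(self, x: BatchedFeatureTensor) -> torch.Tensor:
        # Sum over the "feature" dimension, keeping the batch dimension.
        return x.sum(dim=-1)

scripted = torch.jit.script(SumFeatures())
print(scripted(torch.ones(2, 3)))  # expected: tensor([3., 3.])
```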
Found another way to deal with TorchScript. Just paste the code and call …
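(The comment above is cut off in this copy, so the following is only a guess at a related pattern, not necessarily what the commenter meant: gate the `TensorType` annotations behind `typing.TYPE_CHECKING`, so that `torch.jit.script` only ever sees `torch.Tensor` at runtime.)

```python
# A guess at one such pattern (the original comment is truncated):
# expose TensorType only to static type checkers, and plain torch.Tensor
# to torch.jit.script at runtime. The alias name BatchFeature is made up.
from typing import TYPE_CHECKING
import torch

if TYPE_CHECKING:
    from torchtyping import TensorType
    BatchFeature = TensorType["batch", "feature"]
else:
    BatchFeature = torch.Tensor  # what torch.jit.script actually sees

@torch.jit.script
def f(x: BatchFeature) -> torch.Tensor:
    return x.sum()

print(f(torch.ones(2, 3)))  # => tensor(6.)
```

Note that this keeps the dimension names for readers and static tooling but gives up torchtyping's runtime shape checking, since at runtime the annotation really is `torch.Tensor`.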