fastai imports are deprecated #2
Also, …
To expand on this, there are issues with the current stable version of fastai v2. For example, this bit of code from the README in the amalgam repo no longer works:
It throws this error: …
Hey, sorry, I seem to have missed this issue. We've changed the package name to …
You'll also need to change the bits of your code that go …
Thanks, @rsomani95! I tried that, and I'm still getting the same error as above. I'm using Google Colab (no GPU) with: …
My model is a cnn_learner built off resnet50.
@willjobs Can you try using the regular fastai …? Does your …?
I just tried replicating what you did in the interpret_gradcam.ipynb notebook, and I get all the way down to plot_gcam(), which throws the error: "RuntimeError: number of dims don't match in permute".
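For reference, that RuntimeError means permute was handed a different number of axis indices than the tensor has dimensions. A stdlib-only sketch of the check (the function name and shapes are illustrative, not the library's actual code):

```python
def permute_shape(shape, dims):
    """Rearrange a shape tuple the way tensor.permute rearranges axes."""
    if len(dims) != len(shape):
        # Mirrors PyTorch's "number of dims don't match in permute" error
        raise RuntimeError("number of dims don't match in permute")
    return tuple(shape[d] for d in dims)

# A conv activation is 4-D (batch, channels, height, width), so a
# 4-index permute works:
print(permute_shape((1, 512, 7, 7), (0, 2, 3, 1)))  # → (1, 7, 7, 512)

# Hooking a Linear layer instead yields a 2-D activation, and the same
# 4-index permute fails with exactly this error:
try:
    permute_shape((1, 37), (0, 2, 3, 1))
except RuntimeError as e:
    print(e)  # → number of dims don't match in permute
```

That would be consistent with the hook grabbing a 2-D head activation instead of a 4-D convolutional one.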
Yup, I just tried it and got the same error with …
Ah, that's actually deprecated; I re-implemented a different interface using the same underlying components. Try …
This is odd. It's something on the …
Gotcha. I just tried this (…)
OK, will do. I wonder if the issue is that I'm using a ResNet? It has adaptive pooling and two Linear layers after the last convolutional layer, and I'm not sure exactly how that works with gradcam. The network I'm using has this structure:
…
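For context on why the target layer matters here: Grad-CAM pools the gradients of each channel of a convolutional activation map into a per-channel weight, then takes the ReLU of the weighted sum across channels. It therefore needs an activation with spatial dimensions, which the Linear layers in the head don't produce. A toy, stdlib-only sketch (all names and numbers are illustrative):

```python
def grad_cam(activations, gradients):
    """Toy Grad-CAM over a (C, H, W) activation map given same-shaped gradients.

    Each channel's weight is the mean of its gradients (global average pool);
    the heatmap is the ReLU of the channel-weighted sum of activations.
    """
    C, H, W = len(activations), len(activations[0]), len(activations[0][0])
    # Channel weights: spatially averaged gradients
    weights = [sum(sum(row) for row in gradients[c]) / (H * W) for c in range(C)]
    # Weighted sum over channels at each spatial location, then ReLU
    return [[max(0.0, sum(weights[c] * activations[c][i][j] for c in range(C)))
             for j in range(W)] for i in range(H)]

acts  = [[[1.0, 0.0], [0.0, 2.0]],      # channel 0 activations
         [[0.0, 1.0], [1.0, 0.0]]]      # channel 1 activations
grads = [[[1.0, 1.0], [1.0, 1.0]],      # channel 0 gradients -> weight 1.0
         [[-1.0, -1.0], [-1.0, -1.0]]]  # channel 1 gradients -> weight -1.0
print(grad_cam(acts, grads))  # → [[1.0, 0.0], [0.0, 2.0]]
```

With adaptive pooling and Linear layers, there is no (H, W) grid left to weight, which is why the last convolutional layer is the usual target.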
This may be what's causing the error in:

```python
learn = cnn_learner(...)
learn.gradcam(..., target_layer=learn.model[1][0])
```

I'd tinker with the …
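As background on that indexing: cnn_learner wraps the model as a two-element Sequential of (body, head), so learn.model[1][0] selects the first layer of the head. A stdlib mock of that nesting (layer names follow the usual fastai head layout, shown from memory and purely for illustration):

```python
# Mock of the Sequential(body, head) nesting a cnn_learner produces;
# real fastai layers are replaced by strings for illustration only.
body = ["conv_stem", "res_blocks", "last_conv"]
head = ["AdaptiveConcatPool2d", "Flatten", "BatchNorm1d",
        "Dropout", "Linear", "ReLU", "BatchNorm1d", "Dropout", "Linear"]
model = [body, head]

print(model[1][0])   # first layer of the head → AdaptiveConcatPool2d
print(model[0][-1])  # last layer of the body: the usual Grad-CAM target
```

So model[1][0] lands on the pooling layer rather than a convolutional one, which fits the permute error above.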
Good idea. It works with … I also tried … Also, thank you for all your help on this! I really appreciate it.
Ahhh, that's weird. I'll have to take a deeper look at this to figure out what the issue is. I've only tested this with a …
Will do!
With the new stable version, when:

```python
from fastai2_extensions.interpret.all import *
```

it throws an error:
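Since the package has been renamed, a guarded import makes this failure explicit instead of crashing at import time. A minimal sketch — try_import is a hypothetical helper, and whether a given name resolves depends entirely on what's installed:

```python
import importlib

def try_import(name):
    """Return the module if it can be imported, else None (hypothetical helper)."""
    try:
        return importlib.import_module(name)
    except ModuleNotFoundError:
        return None

# The deprecated package name stops resolving once it is renamed upstream:
old = try_import("fastai2_extensions")
print("fastai2_extensions importable:", old is not None)
```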