
RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU. #11

Open
dnvs opened this issue Apr 8, 2024 · 2 comments

Comments


dnvs commented Apr 8, 2024

I hope Mac will be supported ...

Kiteretsu77 (Owner) commented Apr 8, 2024

Thanks for the report! I will try my best to add a feature for running inference on CPU-only devices. Meanwhile, ComfyUI-APISR provides a way to use the CPU; that repo's documentation is in Chinese, but you can take a look, and it may be adaptable to Mac.
Another option is to use the provided Colab notebook, which lets you run inference online with Google's free account credit. Thanks!


alexaex commented May 29, 2024

I have resolved this issue. You just need to add a few lines of code, as in the screenshots (summarized in the sketch below). I've tried this and can successfully run inference on a MacBook Air M2.
[Screenshots of the code changes]
