RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
#11
Thanks for the report! I'll do my best to add support for inference on CPU-only devices. In the meantime, ComfyUI-APISR provides a way to run on the CPU, although that repo's documentation is in Chinese. You can take a look; it may also be adaptable to Mac.
Another option is to use the provided Colab notebook, which lets you run inference online with a free Google account's credits. Thanks!
I've resolved this issue. You just need to add a few lines of code, as shown in the screenshots. I tried this and was able to successfully run inference on a MacBook Air M2.
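The screenshots aren't reproduced here, but the error message itself points at the fix: pass `map_location` to `torch.load` so CUDA-saved tensors are remapped onto a device that actually exists. A minimal sketch (the dummy checkpoint and the `"w"` key are placeholders for illustration, not this repo's actual loading code):

```python
import os
import tempfile

import torch

# Pick the best available device: CUDA if present, Apple's MPS backend
# on Apple Silicon, otherwise plain CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

# Save a small dummy state dict to stand in for the real checkpoint file.
path = os.path.join(tempfile.mkdtemp(), "weights.pth")
torch.save({"w": torch.zeros(2, 2)}, path)

# map_location remaps any CUDA-saved storages onto `device`, which avoids the
# "Attempting to deserialize object on a CUDA device" RuntimeError on
# CPU-only machines.
state_dict = torch.load(path, map_location=device)
print(state_dict["w"].shape)
```

After loading, remember to also move the model itself with `model.to(device)` and send input tensors to the same device before calling it.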
Hoping for Mac support...