I'm trying to run python scripts/knn2img.py, but I constantly get the following error.
Traceback (most recent call last):
  File "scripts/knn2img.py", line 314, in <module>
    model = load_model_from_config(config, f"{opt.ckpt}")
  File "scripts/knn2img.py", line 60, in load_model_from_config
    model.cuda()
  File "C:\Users\sinan\anaconda3\envs\ldm\lib\site-packages\lightning_fabric\utilities\device_dtype_mixin.py", line 72, in cuda
    device = torch.device("cuda", torch.cuda.current_device())
  File "C:\Users\sinan\anaconda3\envs\ldm\lib\site-packages\torch\cuda\__init__.py", line 778, in current_device
    _lazy_init()
  File "C:\Users\sinan\anaconda3\envs\ldm\lib\site-packages\torch\cuda\__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
Please advise me on how to fix this.
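The traceback shows that load_model_from_config calls model.cuda() unconditionally, which raises this AssertionError whenever the installed PyTorch build has no CUDA support (the common CPU-only wheel on Windows). Besides installing a CUDA-enabled PyTorch build, one workaround (a sketch, not part of the original script; the helper name choose_device is hypothetical) is to pick the device defensively before moving the model:

```python
def choose_device() -> str:
    """Return "cuda" if a CUDA-enabled torch build and a GPU are
    available, otherwise fall back to "cpu"."""
    try:
        import torch
        # is_available() returns False on CPU-only builds instead of raising.
        return "cuda" if torch.cuda.is_available() else "cpu"
    except ImportError:
        # torch not installed at all; a real script would fail later anyway.
        return "cpu"


if __name__ == "__main__":
    device = choose_device()
    print(f"selected device: {device}")
    # In load_model_from_config one could then write, hypothetically:
    #   model.to(device)
    # instead of the unconditional model.cuda() at line 60.
```

With this guard, the script would run (slowly) on CPU-only installs instead of crashing at model load time.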