Confirmation: --no DLIB_USE_CUDA for GPU support? #6
It's basically just something I copied and pasted that got dlib (and thereby face_recognition) to use my GPU. Did you run all the commands for it in the readme? What kind of system do you have?
I have copied and pasted everything as in the readme. No errors. nvidia-smi shows that the python3 process is using 69 MB, but volatile GPU usage is 1-4%. Ubuntu 16.04, CUDA 8, GTX 980 4 GB. If I compile it like this:

```shell
mkdir build
cd build
cmake .. -DDLIB_USE_CUDA=1 -DUSE_AVX_INSTRUCTIONS=1
cmake --build .
cd ..
python3 setup.py install --yes USE_AVX_INSTRUCTIONS
```

I get CUDA support. But I wonder how your build supports CUDA even though you explicitly tell the compiler not to compile with CUDA support (-DDLIB_USE_CUDA=0).
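Editor's note: independent of the compile flags, the most direct check of whether a given dlib install was built with CUDA is the flag the Python binding exposes at runtime. A minimal sketch, assuming a dlib recent enough to export `DLIB_USE_CUDA` (it degrades gracefully if dlib is absent):

```python
def cuda_status():
    """Report whether the installed dlib build was compiled with CUDA."""
    try:
        import dlib  # dlib may not be installed in every environment
    except ImportError:
        return {"installed": False, "cuda": None}
    # DLIB_USE_CUDA is a boolean the Python binding exposes on recent builds;
    # default to False if this build predates the attribute.
    return {"installed": True, "cuda": bool(getattr(dlib, "DLIB_USE_CUDA", False))}

print(cuda_status())
```

If this reports `'cuda': False` even after a rebuild, the interpreter is likely picking up an older dlib from a previous install, which a fresh virtualenv rules out.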
I just found this GitHub gist: https://gist.github.com/ageitgey/629d75c1baac34dfa5ca2a1928a7aeaf
So your instructions clearly do not compile for CUDA. EDIT: I managed to compile dlib with CUDA support. I tried a fresh virtualenv. It seems I had already managed to compile it before but was not aware CUDA was being used, as nvidia-smi showed only 300 MB of VRAM in use. Anyway, here it goes:
In this process I updated my cuDNN from 6.0.21 to 7.0.5 following this guide: http://docs.nvidia.com/deeplearning/sdk/cudnn-install/index.html. After that:
The log shows that CUDA support is compiled and checks whether the correct cuDNN library is present and working.
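Editor's note: when a cuDNN upgrade like the 6.0.21 → 7.0.5 one above is in play, it helps to confirm which version the headers on disk actually declare. A sketch that parses the version defines out of `cudnn.h` (the header path varies by install and is an assumption here):

```python
import re

def cudnn_version(header_path="/usr/include/cudnn.h"):
    """Parse CUDNN_MAJOR/MINOR/PATCHLEVEL out of cudnn.h.

    The path differs between installs (e.g. /usr/local/cuda/include/cudnn.h);
    returns None if the header is missing or does not carry the defines.
    """
    try:
        with open(header_path) as f:
            text = f.read()
    except OSError:
        return None
    parts = []
    for name in ("CUDNN_MAJOR", "CUDNN_MINOR", "CUDNN_PATCHLEVEL"):
        m = re.search(rf"#define\s+{name}\s+(\d+)", text)
        if not m:
            return None
        parts.append(int(m.group(1)))
    return tuple(parts)  # e.g. (7, 0, 5)

print(cudnn_version())
```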
I think I closed this too early, because I still think it does not utilize CUDA.
I run three instances and the GPU fan isn't even spinning; GPU utilization is between 0 and 5%.
...and it starts checking the frames and writing images.
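Editor's note: a single nvidia-smi reading is an instantaneous sample, so short bursts of GPU work can easily show 0-5%. Polling a few times gives a fairer picture. A sketch, assuming `nvidia-smi` is on the PATH:

```python
import shutil
import subprocess
import time

def sample_gpu_utilization(samples=5, interval=1.0):
    """Poll nvidia-smi and return a list of GPU utilization percentages."""
    if shutil.which("nvidia-smi") is None:
        return []  # no NVIDIA driver/tooling in this environment
    readings = []
    for _ in range(samples):
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        # one line per GPU, each a bare integer percentage
        readings.extend(int(line) for line in out.split())
        time.sleep(interval)
    return readings

print(sample_gpu_utilization(samples=2, interval=0.5))
```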
Your nvidia-smi output looks like it's saying it's using your GPU. I wish I had my Linux machine right now; I won't have it hooked up until tomorrow. The terminal won't display whether or not the GPU is being used when you run demo.py. I have a 1080 Ti, and checking and cropping the images in this script is one of the slower things I get the GPU to do in Python.
@MotorCityCobra You state in your readme the compile option
`-DDLIB_USE_CUDA=0`
and, when installing, `--no DLIB_USE_CUDA`.
Are you sure? Because when compiling with these arguments and running the process, my GPU isn't used at all. The extract process runs fine nevertheless, but the CPU is maxed out.
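Editor's note: one plausible explanation for a maxed-out CPU alongside an idle GPU is that face_recognition defaults to the HOG detector, which runs entirely on the CPU; only the CNN detector (`model="cnn"`) goes through dlib's CUDA code path. A hedged sketch, where `frame.jpg` is a hypothetical input file:

```python
def detect_faces(path, model="cnn"):
    """Detect faces in an image file.

    model="cnn" selects dlib's CNN detector (the CUDA code path when dlib
    was built with it); model="hog" is the CPU-only default.
    Returns None if face_recognition or the image file is unavailable.
    """
    try:
        import face_recognition  # may not be installed in every environment
    except ImportError:
        return None
    try:
        image = face_recognition.load_image_file(path)
    except FileNotFoundError:
        return None
    return face_recognition.face_locations(image, model=model)

print(detect_faces("frame.jpg"))  # "frame.jpg" is a hypothetical input file
```

With a CUDA-enabled dlib, running the CNN detector should produce visible utilization in nvidia-smi, whereas the HOG path leaves the GPU idle.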