Issue with autocache function #86
Turns out that Spark does not distribute the jar file to the executors automatically. I need to set the following option: |
But then another issue shows up, this one about the CUDAManager.
Again, disabling the | works around it.
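In case it helps anyone hitting the same wall, here is roughly the shape of the submit command once the jars are shipped explicitly. The master URL, class name, and jar paths below are placeholders for my actual setup, not values from this project:

```shell
# Ship the application jar (plus the GPUEnabler jar) to the executors explicitly,
# instead of relying on Spark to distribute them automatically.
# All paths and the master URL are placeholders for my real setup.
spark-submit \
  --master spark://master-host:7077 \
  --class com.example.MyApp \
  --jars /path/to/gpu-enabler_2.11.jar \
  /path/to/my-app.jar
```

The same effect can be had by setting `spark.jars` in the Spark configuration instead of passing `--jars` on the command line.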
Spark cannot find the class CacheGPUDS in cluster mode. Here is the error message:
I am using Ubuntu 17.04, JRE 1.8.0, Scala 2.11, and Spark 2.1.0. The error only shows up when I submit through a Spark cluster URL; it doesn't appear when I submit with local[*] as the master.
I can actually eliminate the error by disabling spark.gpuenabler.autocache in the Spark configuration. However, I would still like it to cache the object. Is there a better way to fix this issue?
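For reference, this is how I'm disabling the autocache as a workaround; apart from the spark.gpuenabler.autocache property itself, the command below is a placeholder sketch of my submit line:

```shell
# Workaround: disable GPUEnabler's autocache so the CacheGPUDS class
# is never needed on the executors. Everything except the property name
# is a placeholder for my real setup.
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.gpuenabler.autocache=false \
  --class com.example.MyApp \
  /path/to/my-app.jar
```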
Also, it seems the error happens on the executor side: I can print the class from driver-side code, but once execution moves from the driver to the executors, the class can no longer be found.