Incompatible version between spark-jobserver and Dependencies file #86
Comments
You need to look at spark-jobserver/spark-jobserver. -Evan
I have encountered a similar issue. The Dependencies.scala file history shows that the spark-core version was downgraded from 1.0.2. I get the following error when I try to post a job on my workstation, which is running Spark 1.0.2:
java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
Thanks,
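For context, here is a small, Spark-free sketch (all names are made up) of why a version mismatch surfaces as this particular NoSuchMethodError: Scala encodes omitted constructor defaults as synthetic methods on the companion object, so code compiled against one version of a class can call a `$lessinit$greater$default$N` method that no longer exists in the version on the classpath.

```scala
// Minimal sketch, not from this thread: the compiler turns an omitted
// default constructor argument into a call on the companion object, e.g.
// Context$.$lessinit$greater$default$2(). If the jar on the classpath was
// built from a class with a different constructor signature, that synthetic
// method is missing and the call fails at runtime with NoSuchMethodError,
// much like the log above.
class Context(val master: String,
              val settings: Map[String, String] = Map.empty)

object Demo extends App {
  // Compiled roughly as: new Context("local", Context.$lessinit$greater$default$2())
  val ctx = new Context("local")
  println(ctx.settings)
}
```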
I changed the Dependencies file, and it looks like it's working.
2014-11-04 20:03 GMT+01:00 mvenkatm [email protected]:
The problem that I was facing is resolved after updating the Spark version in Dependencies.scala to 1.0.2.
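For reference, a minimal sketch of what that change could look like in project/Dependencies.scala, assuming the dependency line quoted further down in this issue; the `sparkVersion` val is my own addition to keep the build and the cluster version in one place, and "1.0.2" should match whatever your cluster actually runs.

```scala
// Sketch of project/Dependencies.scala (structure assumed, not the project's
// actual file). Pulling the version into one val makes it harder for the
// README and the build definition to drift apart.
import sbt._

object Dependencies {
  // Assumption: Spark 1.0.2, matching the cluster discussed in this thread.
  lazy val sparkVersion = "1.0.2"

  lazy val sparkDeps = Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion % "provided" exclude("io.netty", "netty-all")
  )
}
```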
I am using spark-jobserver 0.4.0 and my Spark server is on 1.0.2. The README file looks fine, but in project/Dependencies.scala I find this line:
"org.apache.spark" %% "spark-core" % "0.9.1" % "provided" exclude("io.netty", "netty-all"),
Is it normal to have spark-core 0.9.1 with spark-jobserver 0.4.0?