On the spark-submit error java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize


Solution up front: change the Scala version to 2.11.8. The environment is Spark 2.1.0.
The problem showed up when submitting a Scala app with spark-submit: the code did nothing more than a filter or a map (a minimal sketch follows below), yet it threw a long stack trace.
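Something like the following illustrates the kind of app involved. The object name and values here are hypothetical, not the original source, but a first() on a filtered RDD matches the "first at Main.scala:17" entry in the trace.

import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("lambda-repro"))

    // The closure passed to filter is serialized and shipped to the executors.
    // Compiled with Scala 2.12 it becomes a Java 8-style lambda, and deserializing
    // it on the executor needs scala/runtime/LambdaDeserialize, which a Spark 2.1.0
    // cluster (built against Scala 2.11) does not provide.
    val firstEven = sc.parallelize(1 to 100).filter(_ % 2 == 0).first()
    println(firstEven)

    sc.stop()
  }
}

Submitting a jar built from code like this produced the error below.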

17/04/23 08:02:48 INFO DAGScheduler: ResultStage 0 (first at Main.scala:17) failed in 1.981 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 192.168.1.100, executor 0): java.io.IOException: unexpected exception type
        at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1582)
        at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1154)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2022)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
        at org.apache.spark.scheduler.Task.run(Task.scala:99)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1148)
        ... 23 more
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
        at Main$.$deserializeLambda$(Main.scala)
        ... 33 more
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize

This looks like a problem with lambda expressions.
Yet the same code runs fine in spark-shell.
According to http://blog.csdn.net/u013054888/article/details/54600229 , it is likely a Scala version mismatch. I had set the Scala version to 2.12.2 in sbt, while spark-shell shows that this Spark 2.1.0 installation uses Scala 2.11.8. After changing the version in sbt to 2.11.8, everything worked.
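As a concrete illustration, a build.sbt along these lines keeps the Scala binary version aligned with the prebuilt Spark 2.1.0 distribution. The project name is made up, and spark-core is assumed to be the dependency in use:

name := "spark-lambda-demo"   // hypothetical project name

version := "0.1"

// Match the Scala version that ships with the Spark 2.1.0 cluster.
scalaVersion := "2.11.8"

// %% makes sbt pick the _2.11 artifact, so the compiled closures match
// the Scala runtime available on the driver and executors.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"

You can confirm the expected version from the spark-shell startup banner, which prints the Scala version the distribution was built with (2.11.8 in this case).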
