Spark fails to start

Checking the logs, the error is:

Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.spark.deploy.worker.Worker
   at java.lang.Class.initializeClass(libgcj.so.10)
Caused by: java.lang.ClassNotFoundException: scala.Function1 not found in gnu.gcj.runtime.SystemClassLoader{urls=[file:./,file:/home/hxf/spark/spark-1.2.0-bin-hadoop2.4/sbin/../conf/,file:/home/hxf/spark/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar,file:/home/hxf/spark/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar,file:/home/hxf/spark/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar,file:/home/hxf/spark/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar], parent=gnu.gcj.runtime.ExtensionClassLoader{urls=[], parent=null}}
   at java.net.URLClassLoader.findClass(libgcj.so.10)
   at java.lang.ClassLoader.loadClass(libgcj.so.10)
   at java.lang.ClassLoader.loadClass(libgcj.so.10)
   at java.lang.Class.initializeClass(libgcj.so.10)

Analyzing the cause: every frame in the trace points to libgcj.so.10, which means the Worker is being launched with GCJ, the GNU Java runtime that some older Linux distributions install as the default java. GCJ's class library does not include Scala's scala.Function1, so the Worker class cannot initialize. In other words, the Java environment was not configured correctly. Adding one line to conf/spark-env.sh solved the problem: export JAVA_HOME=/usr/java/latest
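
A minimal sketch of the fix, assuming Spark is unpacked at /home/hxf/spark/spark-1.2.0-bin-hadoop2.4 (the path shown in the log) and a real JDK is installed at /usr/java/latest; adjust both paths to your machine, and use your own launch command if you do not start the cluster via sbin/start-all.sh:

cd /home/hxf/spark/spark-1.2.0-bin-hadoop2.4
# If this prints "gcj", the default JVM cannot run Spark
java -version
# Point Spark's launch scripts at a real JDK
echo 'export JAVA_HOME=/usr/java/latest' >> conf/spark-env.sh
# Restart the daemons so the new JAVA_HOME takes effect
sbin/stop-all.sh
sbin/start-all.sh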


