Error when running Oozie 4.3.0: …

Source: Internet · Editor: 程序博客网 · Date: 2024/06/05 09:38
fs://master2host:9000/user/master2/share/lib/spark/py4j-0.9.jar,hdfs://master2host:9000/user/master2/share/lib/spark/avro-ipc-1.7.7-tests.jar,hdfs://master2host:9000/user/master2/share/lib/spark/quasiquotes_2.10-2.0.0-M8.jar,hdfs://master2host:9000/user/master2/share/lib/spark/scalap-2.10.0.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-streaming-flume_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/scala-library-2.10.5.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jaxb-api-2.2.2.jar,hdfs://master2host:9000/user/master2/share/lib/spark/kafka-clients-0.8.2.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/kryo-2.22.jar,hdfs://master2host:9000/user/master2/share/lib/spark/slf4j-log4j12-1.6.6.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jodd-core-3.5.2.jar,hdfs://master2host:9000/user/master2/share/lib/spark/commons-codec-1.4.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jackson-databind-2.4.4.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jetty-6.1.14.jar,hdfs://master2host:9000/user/master2/share/lib/spark/curator-recipes-2.5.0.jar,hdfs://master2host:9000/user/master2/share/lib/spark/log4j-1.2.17.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-graphx_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/avro-1.7.7.jar,hdfs://master2host:9000/user/master2/share/lib/spark/parquet-column-1.7.0.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-streaming_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-unsafe_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-launcher_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/commons-logging-1.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jetty-util-6.1.26.jar,hdfs://master2host:9000/user/master2/share/lib/spark/tachyon-underfs-hdfs-0.8.2.jar,hdfs://master2host:9000/user/master2/share/lib/spark/parquet-hadoop-1.7.0.jar,hdfs:
//master2host:9000/user/master2/share/lib/spark/avro-ipc-1.7.7.jar,hdfs://master2host:9000/user/master2/share/lib/oozie/json-simple-1.1.jar,hdfs://master2host:9000/user/master2/share/lib/oozie/oozie-hadoop-utils-hadoop-2-4.3.0.jar,hdfs://master2host:9000/user/master2/share/lib/oozie/oozie-sharelib-oozie-4.3.0.jar
  pyFiles             file:/home/master2/hadoop_tmp/nm-local-dir/usercache/master2/appcache/application_1487340758413_0003/container_1487340758413_0003_01_000001/pyspark.zip,file:/home/master2/hadoop_tmp/nm-local-dir/usercache/master2/appcache/application_1487340758413_0003/container_1487340758413_0003_01_000001/py4j.zip
  archives             null
  mainClass            null
 primaryResource       hdfs://master2host:9000/user/master2/examples/apps/pythonApp/lib/spark1.py
  name                Spark-python
  childArgs            []
  jars                null
  packages             null
 packagesExclusions     null
  repositories         null
  verbose              true

Spark properties used, including those specified through
 --conf and those from the properties file null:
 spark.yarn.security.tokens.hive.enabled -> false
 spark.yarn.jar -> null
 spark.yarn.tags -> oozie-716cb74f8eb05f10a1382d221c0f2c90
 spark.executor.extraJavaOptions -> -Dlog4j.configuration=spark-log4j.properties
 spark.yarn.security.tokens.hbase.enabled -> false
 spark.driver.extraJavaOptions -> -Dlog4j.configuration=spark-log4j.properties
 spark.executor.extraClassPath -> $PWD/*
 spark.driver.extraClassPath -> $PWD/*

   
Error: Could not load YARN classes. This copy of Spark may not have been compiled with YARN support.
Run with --help for usage help or --verbose for debug output
Intercepting System.exit(1)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [1]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

  My Spark is the 2.11 build.
  When you hit this error, you need to update the Oozie sharelib in HDFS, especially the jars under its spark folder. Before the update, the spark folder held the libraries that Oozie had compiled in by default rather than the current 2.11 build, so the Spark-on-YARN jars could not be found.
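To see which jars the Oozie server is actually resolving from the sharelib before and after the update, the `oozie admin` CLI can list them. A minimal sketch; the `-oozie` server URL below is an assumption based on the hostnames in the log, so substitute your own:

```shell
# List the sharelib jars Oozie will put on the classpath for spark actions.
# The server URL is hypothetical -- use your Oozie server's address and port.
oozie admin -oozie http://master2host:11000/oozie -shareliblist spark
```

If the listing still shows the old default-compiled jars instead of the ones matching your installed Spark, the sharelib is the problem.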
Yet it presumptuously reports that Spark may not have been compiled with YARN support. The version I downloaded is clearly the Spark-on-Hadoop build — how could it possibly lack YARN support?!
  In fact it is Oozie's own bundled libraries that lack it. I really cannot follow Oozie's logic here: if you need someone else's libraries, just link to them directly and be done with it — why compile your own copies? And if you absolutely must upload them to HDFS, then at least copy the installed ones over wholesale with a shell command!
  This whole configuration process should be automated. A sound principle would be: use whatever is already installed on my system, and don't presume to substitute jars of some other version.
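That "copy over wholesale with a shell command" fix can be sketched as below. This is a hypothetical example, not the official Oozie procedure: the local Spark path (`/usr/local/spark/jars`), the HDFS sharelib path, and the Oozie server URL are all assumptions drawn from the paths in the log above — adjust them to your cluster:

```shell
# Replace the stale default-compiled jars in the Oozie sharelib's spark
# directory with the jars from the locally installed Spark build.
hdfs dfs -rm -r /user/master2/share/lib/spark
hdfs dfs -mkdir -p /user/master2/share/lib/spark
hdfs dfs -put /usr/local/spark/jars/*.jar /user/master2/share/lib/spark/

# Ask the running Oozie server to rescan the sharelib without a restart.
oozie admin -oozie http://master2host:11000/oozie -sharelibupdate
```

Note that older Spark 1.x installs keep their jars under `lib/` (as a single assembly jar) rather than `jars/`, so the source path depends on your Spark version.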

At the same time, the following properties must be added to the oozie-site.xml configuration file for Spark actions to run smoothly:
<property>
    <name>oozie.service.SparkConfigurationService.spark.configurations</name>
    <value>*=/usr/local/spark/conf</value>
</property>

<property>
    <name>oozie.service.WorkflowAppService.system.libpath</name>
    <value>/user/oozie/share/lib</value>
</property>

<property>
    <name>oozie.use.system.libpath</name>
    <value>true</value>
    <description>
        Default value of oozie.use.system.libpath. If the user hasn't specified
        oozie.use.system.libpath in job.properties and this value is true,
        Oozie will include the sharelib jars for the workflow.
    </description>
</property>
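Individual workflows can also opt in to the sharelib explicitly through their job.properties. A minimal sketch, assuming the namenode address and application path shown in the log above — substitute your own values:

```properties
# Hypothetical job.properties for the Python Spark workflow above.
nameNode=hdfs://master2host:9000
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/master2/examples/apps/pythonApp
```

With oozie.use.system.libpath=true (set here or server-wide as above), the launcher places the sharelib's spark jars on the action's classpath.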
