Zeppelin: spark.executor.extraClassPath and --driver-class-path configuration conflict


The reported error is as follows:


WARN [2017-06-27 15:47:59,777] ({pool-2-thread-2} Logging.scala[logWarning]:66) -
SPARK_CLASSPATH was detected (set to '/home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar:').
This is deprecated in Spark 1.0+.

Please instead use:
 - ./spark-submit with --driver-class-path to augment the driver classpath
 - spark.executor.extraClassPath to augment the executor classpath

 WARN [2017-06-27 15:47:59,778] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Setting 'spark.executor.extraClassPath' to '/home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar:' as a work-around.
ERROR [2017-06-27 15:47:59,780] ({pool-2-thread-2} Logging.scala[logError]:91) - Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:543)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:541)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:541)
at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:529)

Solution:
1. A submitted job can set the driver's classpath by adding the --driver-class-path option to spark-submit.

$ bin/spark-submit --master local[2] --driver-class-path lib/mysql-connector-java-5.1.35.jar --class spark.SparkToJDBC ./spark-test_2.10-1.0.jar
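
Because the error above is raised inside Zeppelin's Spark interpreter rather than by a manual spark-submit, the same option usually has to reach Zeppelin's own spark-submit invocation. A minimal sketch, assuming your Zeppelin version reads SPARK_SUBMIT_OPTIONS from conf/zeppelin-env.sh and appends it to spark-submit (adjust the jar path to your installation):

# conf/zeppelin-env.sh (in the Zeppelin installation)
# Appended by Zeppelin to its spark-submit command line, so this has the
# same effect as passing --driver-class-path when submitting manually.
export SPARK_SUBMIT_OPTIONS="--driver-class-path /home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar"

After changing this, restart the Spark interpreter from the Zeppelin UI so a new SparkContext picks up the setting.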


2. Alternatively, we can set the driver's classpath by configuring the SPARK_CLASSPATH environment variable in conf/spark-env.sh under the Spark installation, as follows:

export SPARK_CLASSPATH=$SPARK_CLASSPATH:/iteblog/com/mysql-connector-java-5.1.35.jar
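
Note that the warning in the log marks SPARK_CLASSPATH as deprecated since Spark 1.0. A non-deprecated equivalent, following the warning's own advice, is to set the two classpath properties in conf/spark-defaults.conf instead; a sketch, with the jar path adjusted to your installation:

# conf/spark-defaults.conf
# Deprecation-free replacement for SPARK_CLASSPATH:
spark.driver.extraClassPath    /home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar
spark.executor.extraClassPath  /home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar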


Setting SPARK_CLASSPATH this way also resolves the exception above. However, we cannot configure SPARK_CLASSPATH in conf/spark-env.sh and pass the --driver-class-path option at submission time simultaneously; doing both is exactly what produces the exception shown above.


So, simply remove one of the two configurations. Here, the Spark configuration entry was removed: export SPARK_CLASSPATH=...
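
For reference, the fixed conf/spark-env.sh simply has the offending export removed (or commented out), leaving --driver-class-path as the single source of the driver classpath:

# conf/spark-env.sh: the SPARK_CLASSPATH export is now disabled.
# export SPARK_CLASSPATH=$SPARK_CLASSPATH:/iteblog/com/mysql-connector-java-5.1.35.jar

Restart the Zeppelin Spark interpreter afterwards so the SparkContext is re-initialized without the conflicting setting.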