Spark on YARN job submission fails with "ERROR SparkContext: Error initializing SparkContext."


spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client $SPARK_HOME/examples/jars/spark-examples_2.11-2.2.0.jar 100

The error is as follows:

ERROR SparkContext: Error initializing SparkContext.

org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.

Checking the job details in the Hadoop (YARN) web UI shows:

The container's actual virtual memory usage of 2.2 GB exceeded the 2.1 GB limit, so the container was killed.
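
YARN derives this virtual-memory cap from the container's physical memory multiplied by yarn.nodemanager.vmem-pmem-ratio, which defaults to 2.1; with roughly 1 GB of physical memory allocated to the ApplicationMaster, that yields the 2.1 GB cap seen here. As an alternative to disabling the check entirely, the ratio can be raised in yarn-site.xml. This is only a sketch; the value 4 below is an illustrative choice, not a setting from the original post:

 <property>
     <!-- Ratio of allowed virtual memory to physical memory per container (default 2.1) -->
     <name>yarn.nodemanager.vmem-pmem-ratio</name>
     <value>4</value>
 </property>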

Solution: add the following property to yarn-site.xml:

 <property>
     <name>yarn.nodemanager.vmem-check-enabled</name>
     <value>false</value>
 </property>
Restart Hadoop and Spark, then resubmit the job.
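
A minimal sketch of the restart and resubmission steps, assuming a standard layout with HADOOP_HOME and SPARK_HOME set (script locations may differ in your installation):

 # Restart YARN so the new yarn-site.xml setting takes effect
 $HADOOP_HOME/sbin/stop-yarn.sh
 $HADOOP_HOME/sbin/start-yarn.sh

 # Resubmit the SparkPi example job
 spark-submit --class org.apache.spark.examples.SparkPi \
   --master yarn --deploy-mode client \
   $SPARK_HOME/examples/jars/spark-examples_2.11-2.2.0.jar 100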

