Spark SQL fails on some statements with: WARN org.apache.spark.scheduler.TaskSchedulerImpl: Initial job has not accepted any resources

Published on 程序博客网, 2024/06/18 06:52

Running certain statements in Spark SQL produces this error:

WARN org.apache.spark.scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources


Checking the cluster UI shows that another application is still in the RUNNING state and is holding the cluster's resources, so the newly submitted job stays stuck in WAITING and keeps emitting this warning.
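One reason this happens on a standalone cluster is that, by default, an application claims every available core, leaving nothing for later submissions. A minimal sketch of capping resources per application so that several can coexist (the master URL, core count, and memory values below are illustrative, not from the original post):

```shell
# Cap this application's footprint so other apps can still get resources.
# --total-executor-cores applies to standalone/Mesos masters;
# the values here are placeholders, not a recommendation.
spark-sql \
  --master spark://master:7077 \
  --total-executor-cores 4 \
  --executor-memory 2g
```

With such a cap in place, a second application can acquire executors even while the first is running, instead of waiting for it to finish.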

Killing the RUNNING application frees the resources, and the newly submitted job then executes normally.
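The kill step can be sketched as follows, assuming Spark runs on YARN; on a standalone cluster, the blocking application can instead be killed from the Master web UI (typically at port 8080). The commands are echoed here so the snippet is safe to run anywhere; on an actual cluster node, run them directly and substitute the real application ID for the placeholder.

```shell
# Locate the application that is hogging the cluster, then kill it by ID.
# <application_id> is a placeholder to be filled from the -list output.
LIST_CMD="yarn application -list -appStates RUNNING"   # find the blocker
KILL_CMD="yarn application -kill <application_id>"     # free its resources

echo "$LIST_CMD"
echo "$KILL_CMD"
```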
