java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries!

Source: Internet · Editor: 程序博客网 · Date: 2024/06/15 16:32

When starting Spark in local mode from IntelliJ IDEA on CentOS 7, the following error is reported:

16/06/27 19:36:34 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
    (the same WARN line is repeated 16 times, once per retry)
16/06/27 19:36:34 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries!
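The stack trace boils down to the driver being unable to bind a socket to the address that the local hostname resolves to. That condition can be checked outside of Spark with only the JDK; the sketch below is an assumption-free diagnostic, not Spark's own code:

```scala
import java.net.{InetAddress, InetSocketAddress, ServerSocket}

// Resolve the local hostname (via /etc/hosts or DNS), then try to bind an
// ephemeral port on that address. If the hostname still maps to a stale IP,
// bind() throws java.net.BindException: Cannot assign requested address.
object BindCheck {
  def main(args: Array[String]): Unit = {
    val host   = InetAddress.getLocalHost
    val socket = new ServerSocket() // unbound socket
    try {
      socket.bind(new InetSocketAddress(host, 0)) // port 0 = any free port
      println(s"OK: bound ${host.getHostAddress}:${socket.getLocalPort}")
    } finally socket.close()
  }
}
```

If this small program fails with the same BindException, the problem is the machine's IP/hostname mapping, not Spark.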

This happened because my client machine's IP address had changed: the hostname still resolved to the old IP, so the driver could not bind to it. After updating the IP mapping, everything worked again.

For example, pinning the driver host to localhost avoids depending on the hostname lookup at all:

val sparkConf = new SparkConf()
  .setAppName(jobName)
  .set("spark.driver.host", "localhost")
  .setMaster("local[4]")
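The same fix can also be applied without touching code, through Spark's standard SPARK_LOCAL_IP environment variable; the loopback address here is just an illustrative value:

```shell
# Tell Spark which local IP to bind to, overriding the hostname lookup.
export SPARK_LOCAL_IP=127.0.0.1
```

Set it in the shell (or in conf/spark-env.sh) before launching the application.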