Spark starts only the Master; the Workers fail to start

There are four virtual machines: one master and three slaves. Spark starts normally on the master, but the logs directory on the slaves shows the following error:

Spark Command: /usr/jdk1.8/bin/java -cp /usr/hadoop/spark-2.0.2/conf/:/usr/hadoop/spark-2.0.2/jars/*:/usr/hadoop/hadoop2.7.3/etc/hadoop/ -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077
========================================
17/01/16 16:12:22 INFO Worker: Started daemon with process name: 3001@master
17/01/16 16:12:22 INFO SignalUtils: Registered signal handler for TERM
17/01/16 16:12:22 INFO SignalUtils: Registered signal handler for HUP
17/01/16 16:12:22 INFO SignalUtils: Registered signal handler for INT
17/01/16 16:12:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/16 16:12:22 INFO SecurityManager: Changing view acls to: hadoop
17/01/16 16:12:22 INFO SecurityManager: Changing modify acls to: hadoop
17/01/16 16:12:22 INFO SecurityManager: Changing view acls groups to:
17/01/16 16:12:22 INFO SecurityManager: Changing modify acls groups to:
17/01/16 16:12:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
(the same warning is logged 16 times in total)
Exception in thread "main" java.net.BindException: 无法指定被请求的地址: Service 'sparkWorker' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkWorker' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
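The key line is the final java.net.BindException: 无法指定被请求的地址 (Cannot assign requested address). It means the Worker tried to bind its 'sparkWorker' service to an address that is not actually configured on that slave, typically because the node's hostname resolves to a wrong or stale IP. A quick way to check is to compare what the hostname resolves to against the addresses really assigned to the node; the commands below are only a diagnostic sketch, to be run on each slave:

hostname          # this node's hostname
hostname -i       # the IP address the hostname resolves to
ip addr show      # addresses actually configured on the interfaces
cat /etc/hosts    # look for stale or loopback-only entries for this host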

The Spark configuration file spark-env.sh looks like this:

export JAVA_HOME=/usr/jdk1.8
export SCALA_HOME=/usr/hadoop/scala-2.11.4
export HADOOP_HOME=/usr/hadoop/hadoop2.7.3
export HADOOP_CONF_DIR=/usr/hadoop/hadoop2.7.3/etc/hadoop
export SPARK_MASTER_IP=192.168.9.200
export SPARK_WORKER_MEMORY=1g
export SPARK_WORKER_CORES=1
export SPARK_HOME=/usr/hadoop/spark-2.0.2
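Note that spark-env.sh is read independently on every node, so a change made on the master normally has to be copied to each slave as well; a sketch, where slave1, slave2 and slave3 stand in for the actual slave hostnames:

for h in slave1 slave2 slave3; do
  scp /usr/hadoop/spark-2.0.2/conf/spark-env.sh $h:/usr/hadoop/spark-2.0.2/conf/
done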
Adding SPARK_LOCAL_IP at the end of this file pins the address the Worker binds to; after this change all three Workers started normally:

export SPARK_LOCAL_IP=127.0.0.1
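After the change, restarting the standalone cluster and checking the Java processes on a slave confirms the fix; a minimal sketch using Spark's standard sbin scripts, run from the master as the hadoop user:

/usr/hadoop/spark-2.0.2/sbin/stop-all.sh     # stop the Master and all Workers
/usr/hadoop/spark-2.0.2/sbin/start-all.sh    # start the Master and Workers again
jps                                          # on a slave, a Worker process should now be listed

Since each node reads its own spark-env.sh, the same variable could also be set per node to that node's own address rather than the loopback address, if the Workers need to be reachable over the network.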


