A Low-Level Mistake When Running the Spark Shell on a Single Machine


bin/spark-shell

Download spark-2.1.0-bin-hadoop2.7.tgz, unpack it, change into the extracted Spark root directory, and run bin/spark-shell to start the shell.
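The same steps as a shell sketch (the file name and directory come from this article; the download source is omitted):

# unpack the release and launch the interactive shell from the Spark root
tar -xzf spark-2.1.0-bin-hadoop2.7.tgz
cd spark-2.1.0-bin-hadoop2.7
bin/spark-shell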
But today it failed with a low-level error:
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.

[root@sk1 spark-2.1.0-bin-hadoop2.7]# bin/spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/04/07 22:33:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/07 22:33:38 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
(the same WARN line is printed 16 times in total)
17/04/07 22:33:38 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
(the same BindException and stack trace are then printed a second time by the REPL)
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
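As an aside, the driver's bind address can also be pinned explicitly if you just need the shell up before fixing the system configuration; a minimal sketch, assuming the loopback address is acceptable (SPARK_LOCAL_IP is Spark's documented environment variable for this, and spark.driver.bindAddress is available as of Spark 2.1):

# work around the bind failure without touching /etc/hosts
SPARK_LOCAL_IP=127.0.0.1 bin/spark-shell
# or, equivalently, as a driver property
bin/spark-shell --conf spark.driver.bindAddress=127.0.0.1

This only masks the symptom, though; the actual cause is found below.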

Root cause

[root@sk1 ~]# ifconfig
ens32: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 192.168.11.138  netmask 255.255.255.0  broadcast 192.168.11.255
        inet6 fe80::a8bd:a097:8ca9:d22a  prefixlen 64  scopeid 0x20<link>
        ether 00:0c:29:c3:3e:9a  txqueuelen 1000  (Ethernet)
        RX packets 273939  bytes 395373188 (377.0 MiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 16657  bytes 2472671 (2.3 MiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
        inet 127.0.0.1  netmask 255.0.0.0
        inet6 ::1  prefixlen 128  scopeid 0x10<host>
        loop  txqueuelen 1  (Local Loopback)
        RX packets 276  bytes 23980 (23.4 KiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 276  bytes 23980 (23.4 KiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

The machine's actual IP address is 192.168.11.138 (on interface ens32).

[root@sk1 ~]# cat /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.1.138   sk1
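A quick way to surface the mismatch (assuming standard Linux tools; getent consults the same resolver chain that the JVM's hostname lookup goes through):

# what does the hostname resolve to?
getent hosts sk1        # -> 192.168.1.138  sk1   (the stale entry)
hostname -i             # resolves the local hostname much like the driver does
# a hypothetical one-line reproduction of the symptom: binding to an address
# this machine does not own fails the same way
python -c "import socket; socket.socket().bind(('192.168.1.138', 0))"
# ... [Errno 99] Cannot assign requested address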

Clearly, the IP is configured wrong: /etc/hosts maps the hostname sk1 to 192.168.1.138, an address not assigned to any interface on this machine, so the driver's attempt to bind to it fails with "Cannot assign requested address". Correcting the entry fixes the error:

[root@sk1 ~]# vi /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.11.138  sk1
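Before relaunching, the new mapping can be verified (a quick check, assuming no hosts caching such as nscd is in the way):

getent hosts sk1        # should now print: 192.168.11.138  sk1
ping -c 1 sk1           # should reach 192.168.11.138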

Launch the shell again:

[root@sk1 spark-2.1.0-bin-hadoop2.7]# bin/spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/04/07 22:41:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/07 22:41:32 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/04/07 22:41:33 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/04/07 22:41:34 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.11.138:4040
Spark context available as 'sc' (master = local[*], app id = local-1491619281633).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
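As a final sanity check, a trivial job can be run non-interactively; a minimal sketch, relying on spark-shell reading Scala expressions from stdin:

# sum 1..100 on the local master and exit
echo 'println(sc.parallelize(1 to 100).sum)' | bin/spark-shell
# prints 5050.0 among the startup messages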