Spark: PySpark fails to run


Spark version: spark-2.2.0-bin-hadoop2.7

Java: 1.8

Running ./bin/pyspark fails with the following error:

Traceback (most recent call last):
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/shell.py", line 32, in <module>
    sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/context.py", line 123, in __init__
    self._jsc = self._jvm.JavaSparkContext(self._conf._jconf)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/java_gateway.py", line 669, in __call__
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.UnknownHostException: 138.7.100.10.in-addr.arpa: 138.7.100.10.in-addr.arpa: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1466)
    at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:355)
    at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:347)
    at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:347)
    at org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute(Utils.scala:348)
    at org.apache.spark.util.Utils$.localIpAddressHostname(Utils.scala:348)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:395)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:124)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:724)
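The key part is the java.net.UnknownHostException: the JVM asks the OS for its own hostname and then cannot resolve that name back to an IP address. The same lookup can be reproduced outside Spark with a short Python check (a minimal sketch using only the standard library):

import socket

# The hostname the OS reports for this machine
host = socket.gethostname()
print("hostname:", host)

# If this raises socket.gaierror, Java's InetAddress.getLocalHost()
# fails the same way and Spark dies with the UnknownHostException above.
print("resolves to:", socket.gethostbyname(host))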

Solution:

I had the same problem with Spark; it is caused by your laptop's hostname not resolving to an IP address.

My solution:

Edit /etc/hosts with root privileges (vi here is just one choice of editor):

sudo vi /etc/hosts

Below the line

127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4

add

127.0.0.1 LAPTOPNAME

LAPTOPNAME is your machine's hostname, whichever name you set up during installation; it appears in your terminal prompt (e.g. root@LAPTOPNAME) and can also be printed with the hostname command.
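For example, assuming the machine is called MyLaptop (a made-up name for illustration), the whole fix and its verification look like this:

$ hostname
MyLaptop
$ sudo vi /etc/hosts          # add the line: 127.0.0.1 MyLaptop
$ python -c "import socket; print(socket.gethostbyname(socket.gethostname()))"
127.0.0.1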

With this entry in place, ./bin/pyspark starts normally under Java 1.8.
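If editing /etc/hosts is not an option, note that the stack trace fails inside Utils.findLocalIpAddress, which first checks the SPARK_LOCAL_IP environment variable (documented in Spark's configuration); setting it should bypass the hostname lookup entirely. This is an alternative workaround, not part of the original answer:

SPARK_LOCAL_IP=127.0.0.1 ./bin/pyspark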

