spark pyspark fails to run
Spark version: spark-2.2.0-bin-hadoop2.7
Java: 1.8
Running ./bin/pyspark fails with the following error:
Traceback (most recent call last):
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/shell.py", line 32, in <module>
    sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/context.py", line 123, in __init__
    self._jsc = self._jvm.JavaSparkContext(self._conf._jconf)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/java_gateway.py", line 669, in __call__
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.UnknownHostException: 138.7.100.10.in-addr.arpa: 138.7.100.10.in-addr.arpa: nodename nor servname provided, or not known
	at java.net.InetAddress.getLocalHost(InetAddress.java:1466)
	at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:355)
	at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:347)
	at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:347)
	at org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute(Utils.scala:348)
	at org.apache.spark.util.Utils$.localIpAddressHostname(Utils.scala:348)
	at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
	at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
	at scala.Option.getOrElse(Option.scala:120)
	at org.apache.spark.util.Utils$.localHostName(Utils.scala:395)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:124)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:47)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
	at py4j.Gateway.invoke(Gateway.java:214)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
	at py4j.GatewayConnection.run(GatewayConnection.java:207)
	at java.lang.Thread.run(Thread.java:724)
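Before changing anything, you can reproduce the failed lookup outside of Spark. A minimal check from a macOS/Linux terminal (the hostname your machine prints will of course differ):

hostname                     # prints the local hostname, e.g. LAPTOPNAME
ping -c 1 "$(hostname)"      # "cannot resolve ..." here is the same failure as the UnknownHostException above

If ping cannot resolve the name that hostname prints, the InetAddress.getLocalHost call in the stack trace will fail the same way whenever the PySpark shell tries to create the SparkContext.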
Solution:
I had the same problem with Spark; it comes down to how your laptop's hostname/IP resolves.
My solution:
Open /etc/hosts with root privileges (e.g. sudo vim /etc/hosts).
Below the line
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
add
127.0.0.1 LAPTOPNAME
Your LAPTOPNAME is shown in your terminal prompt as root@LAPTOPNAME (it is whatever hostname you set up during installation).
It will run with Java1.
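Put together, the fix looks roughly like this. This is a sketch only: LAPTOPNAME stands for whatever hostname your machine reports, and the Spark directory is the spark-2.2.0-bin-hadoop2.7 mentioned above.

hostname                                                  # note the name it prints, e.g. LAPTOPNAME
sudo sh -c 'echo "127.0.0.1 LAPTOPNAME" >> /etc/hosts'    # or add the line with your editor of choice
ping -c 1 LAPTOPNAME                                      # should now answer from 127.0.0.1
cd spark-2.2.0-bin-hadoop2.7
./bin/pyspark                                             # the shell should now start and create sc without the Py4JJavaError

Once the /etc/hosts entry resolves, no Spark configuration change is needed; the PySpark shell picks up the now-resolvable hostname automatically.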