Fixing the "Only one SparkContext may be running in this JVM" error when running the official Spark Streaming example in spark-shell


Solution:

Do not run the block below that builds a new SparkConf and StreamingContext (between the separators). Instead, create the StreamingContext from the `sc` that spark-shell already provides:

val ssc = new StreamingContext(sc, Seconds(1))

=====================================================================

import org.apache.spark._

import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._ // not necessary since Spark 1.3


// Create a local StreamingContext with two working threads and a batch interval of 1 second.
// The master requires 2 cores to prevent a starvation scenario.


val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")

val ssc = new StreamingContext(conf, Seconds(1))

======================================================================

The line `val ssc = new StreamingContext(conf, Seconds(1))` raises the "Only one SparkContext may be running in this JVM" error because spark-shell has already created a SparkContext named `sc` at startup, and the new StreamingContext tries to build a second one from `conf`.

Use the existing `sc` directly; there is no need to create a new SparkContext.
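Once `ssc` is built from the shell's `sc`, the rest of the official NetworkWordCount example runs unchanged. A minimal sketch, assuming a text source on localhost port 9999 (e.g. started with `nc -lk 9999`); paste it into spark-shell rather than compiling it standalone:

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Reuse the SparkContext (`sc`) that spark-shell provides at startup,
// instead of constructing a new one via SparkConf.
val ssc = new StreamingContext(sc, Seconds(1))

// Read lines from a TCP socket and count words in each 1-second batch.
val lines = ssc.socketTextStream("localhost", 9999)
val words = lines.flatMap(_.split(" "))
val wordCounts = words.map(word => (word, 1)).reduceByKey(_ + _)
wordCounts.print()

ssc.start()             // start receiving and processing data
ssc.awaitTermination()  // block until the streaming job is stopped
```

Note that `awaitTermination()` blocks the shell; stop the job with `ssc.stop(stopSparkContext = false)` from another session, or interrupt it, so that `sc` stays usable afterwards.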

