Spark Q&A: Only one SparkContext may be running in this JVM


Q: A Spark job fails with the following error:

17/07/06 15:45:21 ERROR ApplicationMaster: User class threw exception: org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
XXXXX.XXXXXXXXXX$.main(XXXXX.scala:35)
XXXXX.XXXXXXXXXX.main(XXXXX.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:606)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:542)

A: The error means more than one SparkContext was created in the same JVM. A common cause is constructing a StreamingContext from a SparkConf while a SparkContext already exists: that constructor builds a second SparkContext internally. The fix is to pass the existing SparkContext to the StreamingContext constructor instead, as below:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Minutes, StreamingContext}

// Reuse the one SparkContext; this constructor does not create another.
val sc = new SparkContext(new SparkConf())
val ssc = new StreamingContext(sc, Minutes(5))
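
For contrast, here is a minimal sketch (the app name is illustrative, not from the original post) of the pattern that typically triggers the error: building the StreamingContext from a SparkConf while a SparkContext is already alive.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Minutes, StreamingContext}

val conf = new SparkConf().setAppName("demo") // hypothetical app name
val sc = new SparkContext(conf)

// StreamingContext(conf, batchDuration) constructs a *new* SparkContext
// from the conf, so with `sc` already running it throws
// "Only one SparkContext may be running in this JVM".
val ssc = new StreamingContext(conf, Minutes(5))

Note that the spark.driver.allowMultipleContexts = true setting mentioned in the log only suppresses the check rather than removing the duplicate context, and it was removed in Spark 3.x, so reusing the single SparkContext as shown above is the proper fix.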