Spark: Difference Between Repartition and Coalesce

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setMaster("local[*]").setAppName("colseTest")
val sc = new SparkContext(conf)
val rdd1 = sc.parallelize(Seq("1", "2", "3"))
// If the target number of partitions is larger than the current number, the shuffle parameter must be set to true.
// As the source code shows, repartition() is simply coalesce() with shuffle set to true.
val rdd2 = rdd1.coalesce(8, true)
val rdd3 = rdd1.repartition(8)
println("RDD2:" + rdd2.partitions.size)
println("RDD3:" + rdd3.partitions.size)
Output:
RDD2:8
RDD3:8
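To illustrate the opposite direction, here is a minimal sketch (assuming the same sc and rdd1 as above): shrinking the partition count needs no shuffle, while asking coalesce for more partitions with shuffle left at its default of false has no effect.

// Shrinking the partition count works without a shuffle.
val rdd4 = rdd1.coalesce(2)
println("RDD4:" + rdd4.partitions.size) // 2, assuming rdd1 started with at least 2 partitions

// Growing the partition count without shuffle = true does nothing:
// if rdd1 has fewer than 8 partitions, coalesce(8) keeps the original count.
val rdd5 = rdd1.coalesce(8)
println("RDD5:" + rdd5.partitions.size) // rdd1.partitions.size, not 8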
Additionally:
Among the Spark SQL functions, coalesce plays a role similar to Oracle's NVL/COALESCE, returning the first non-null argument, as its doc comment explains:
* For example, `coalesce(a, b, c)` will return a if a is not null,
* or b if a is null and b is not null, or c if both a and b are null but c is not null.
coalesce(null, null, 4) therefore returns 4.
import org.apache.spark.sql.functions.{coalesce, lit}
import spark.implicits._  // requires an active SparkSession named spark

val df = Seq(
  ("1", "ll"),
  ("2", "ll"),
  (null, "lk")
).toDF("id", "name")

df.select(coalesce($"id", lit(9))).show()   // DataFrame (Column) API expression
df.selectExpr("coalesce(id, '9')").show()   // SQL expression string
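The same first-non-null behaviour can also be checked with a plain SQL statement; a small sketch, assuming the same SparkSession spark is in scope:

// Matches the coalesce(null, null, 4) example above: the first non-null argument, 4, is returned.
spark.sql("SELECT coalesce(null, null, 4) AS result").show()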


