Spark Operators: RDD Action Operations – first/count/reduce/collect/collectAsMap
Source: Internet · Editor: 程序博客网 · Date: 2024/05/29 16:47
first
def first(): T

first returns the first element of the RDD, without sorting.
scala> var rdd1 = sc.makeRDD(Array(("A","1"),("B","2"),("C","3")),2)
rdd1: org.apache.spark.rdd.RDD[(String, String)] = ParallelCollectionRDD[33] at makeRDD at <console>:21

scala> rdd1.first
res14: (String, String) = (A,1)

scala> var rdd1 = sc.makeRDD(Seq(10, 4, 2, 12, 3))
rdd1: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at makeRDD at <console>:21

scala> rdd1.first
res8: Int = 10
count
def count(): Long

count returns the number of elements in the RDD.
scala> var rdd1 = sc.makeRDD(Array(("A","1"),("B","2"),("C","3")),2)
rdd1: org.apache.spark.rdd.RDD[(String, String)] = ParallelCollectionRDD[34] at makeRDD at <console>:21

scala> rdd1.count
res15: Long = 3
reduce
def reduce(f: (T, T) ⇒ T): T

reduce combines the elements of the RDD pairwise using the binary function f and returns the result. Because partial results from different partitions are merged in no fixed order, f should be commutative and associative.
scala> var rdd1 = sc.makeRDD(1 to 10,2)
rdd1: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[36] at makeRDD at <console>:21

scala> rdd1.reduce(_ + _)
res18: Int = 55

scala> var rdd2 = sc.makeRDD(Array(("A",0),("A",2),("B",1),("B",2),("C",1)))
rdd2: org.apache.spark.rdd.RDD[(String, Int)] = ParallelCollectionRDD[38] at makeRDD at <console>:21

scala> rdd2.reduce((x,y) => {
     |   (x._1 + y._1, x._2 + y._2)
     | })
res21: (String, Int) = (CBBAA,6)
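Note that the String half of the pair result above came out as "CBBAA" rather than in input order, because the merge order across partitions is not fixed. The semantics can be sketched with plain Scala collections, no SparkContext needed (a local reduce folds left-to-right, so here the result is deterministic):

```scala
// Plain-Scala sketch of reduce semantics (no Spark required).
// With an associative, commutative function such as +, the result
// does not depend on the order in which elements are combined.
val nums = (1 to 10).toList
val sum = nums.reduce(_ + _)  // 55, matching the RDD example

// Pair data: both components are combined in a single pass.
// Locally this folds left-to-right, so the String part is "AABBC";
// on an RDD the combination order across partitions is not fixed,
// which is why the REPL session printed "CBBAA" instead.
val pairs = List(("A", 0), ("A", 2), ("B", 1), ("B", 2), ("C", 1))
val merged = pairs.reduce((x, y) => (x._1 + y._1, x._2 + y._2))
// merged == ("AABBC", 6)
```

The Int component (6) is the same either way; only the non-commutative String concatenation is order-sensitive.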
collect
def collect(): Array[T]
def collect[U: ClassTag](f: PartialFunction[T, U]): RDD[U]
collect converts an RDD into an Array on the driver.

scala> var rdd1 = sc.makeRDD(1 to 10,2)
rdd1: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[36] at makeRDD at <console>:21

scala> rdd1.collect
res23: Array[Int] = Array(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

scala> val one: PartialFunction[Int, String] = { case 1 => "one"; case _ => "other" }
one: PartialFunction[Int,String] = <function1>

scala> val data = sc.parallelize(List(2,3,1))
data: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[11] at parallelize at <console>:12

scala> data.collect(one).collect
res4: Array[String] = Array(other, other, one)
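The second collect overload takes a partial function: elements for which it is undefined are dropped, and the rest are transformed. Scala's standard collections have a collect with the same meaning, which makes the behavior easy to check locally (a minimal sketch, not Spark itself):

```scala
// collect(pf) filters and maps in one step: elements where the
// partial function is undefined are silently dropped.
val onlyOne: PartialFunction[Int, String] = { case 1 => "one" }
val data = List(2, 3, 1)
val named = data.collect(onlyOne)  // List("one"); 2 and 3 are dropped

// With a catch-all case, every element is kept, as in the RDD example.
val all = data.collect { case 1 => "one"; case _ => "other" }
// all == List("other", "other", "one")
```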
collectAsMap
def collectAsMap(): Map[K, V]

collectAsMap is available on pair RDDs (via PairRDDFunctions); it returns the key/value pairs to the driver as a Map. Unlike collect, it keeps only one value per key: when a key occurs multiple times, later values overwrite earlier ones.

scala> val data = sc.parallelize(List((1, "www"), (1, "iteblog"), (1, "com"), (2, "bbs"), (2, "iteblog"), (2, "com"), (3, "good")))
data: org.apache.spark.rdd.RDD[(Int, String)] = ParallelCollectionRDD[26] at parallelize at <console>:12

scala> data.collectAsMap
res28: scala.collection.Map[Int,String] = Map(2 -> com, 1 -> com, 3 -> good)
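In the output above both keys 1 and 2 map to "com", the last value seen, because collectAsMap keeps only one value per key. Scala's toMap resolves duplicate keys the same way, so the behavior can be sketched without Spark:

```scala
// Plain-Scala analogue of collectAsMap's duplicate-key behavior:
// toMap keeps the last value seen for each key.
val data = List((1, "www"), (1, "iteblog"), (1, "com"),
                (2, "bbs"), (2, "iteblog"), (2, "com"),
                (3, "good"))
val asMap = data.toMap
// asMap == Map(1 -> "com", 2 -> "com", 3 -> "good")
```

If you need all values per key rather than the last one, use groupByKey or reduceByKey before collecting instead.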