Spark RDD implicit conversion to PairRDDFunctions (without it, reduceByKey and groupByKey are unavailable)


Add the import: import org.apache.spark.SparkContext._

Note that this is different from import org.apache.spark.SparkContext: the plain import only brings in the SparkContext class itself, while the ._ form imports the members of its companion object, including the implicit conversions you need for pair-RDD operations.
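For example, a minimal standalone program might look like the sketch below (the object name, master URL, and sample data are made up for illustration):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._ // brings the RDD -> PairRDDFunctions implicits into scope

// Minimal standalone sketch: word count over a tiny in-memory collection.
object PairImplicitDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("PairImplicitDemo").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _) // compiles only because the implicit conversion is in scope

    counts.collect().foreach(println)
    sc.stop()
  }
}
```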



As explained on an English-language site:

> SparkContext provides an implicit conversion from RDD[T] to PairRDDFunctions[T] to make this transparent to users.
>
> To import those implicit conversions, use
>
>     import org.apache.spark.SparkContext._
>
> These conversions are automatically imported by Spark Shell, but you'll have to import them yourself in standalone programs.
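To make the mechanism a bit more concrete, here is a rough sketch that applies the same wrapping by hand (the object name, method name, and sample data are invented for illustration):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.rdd.{PairRDDFunctions, RDD}

object ManualConversionSketch {
  // Roughly what the implicit conversion does behind the scenes: wrap a
  // key-value RDD in PairRDDFunctions, the class that actually defines
  // reduceByKey, groupByKey, join, and friends.
  def countByHand(sc: SparkContext): RDD[(String, Int)] = {
    val pairs: RDD[(String, Int)] = sc.parallelize(Seq(("a", 1), ("b", 1), ("a", 1)))
    val wrapped = new PairRDDFunctions(pairs) // normally inserted implicitly for you
    wrapped.reduceByKey(_ + _)
  }
}
```

(For what it's worth, in Spark 1.3 and later these implicits were moved into the RDD companion object, so recent versions pick them up automatically and the explicit import is mainly needed on older versions.)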


Summary: when writing Spark programs, you never know where you will get stuck next!
