Spark countByKey Usage Explained


countByKey counts the number of values associated with each key. Note that it is only available on a pair RDD, i.e. an RDD of key-value tuples. Detailed code follows:

import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

private static void myCountByKey() {
    SparkConf conf = new SparkConf()
            .setMaster("local")
            .setAppName("myCountByKey");
    JavaSparkContext sc = new JavaSparkContext(conf);
    // Build a pair RDD of (class, student) tuples
    List<Tuple2<String, String>> studentList = Arrays.asList(
            new Tuple2<>("c1", "cai"), new Tuple2<>("c2", "niao"),
            new Tuple2<>("c1", "feng"), new Tuple2<>("c2", "jin"),
            new Tuple2<>("c2", "niao"));
    JavaPairRDD<String, String> studentRdd = sc.parallelizePairs(studentList);
    // countByKey is an action: it returns a Map<K, Long> to the driver
    Map<String, Long> studentCounts = studentRdd.countByKey();
    for (Map.Entry<String, Long> entry : studentCounts.entrySet()) {
        System.out.println("key:" + entry.getKey() + ",values:" + entry.getValue());
    }
    sc.close();
}

Output:

key:c2,values:3
key:c1,values:2
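Semantically, countByKey is just "group the pairs by key, then count each group." As a way to check your understanding without a Spark cluster, the same computation can be sketched in plain Java with the Streams API; `CountByKeyDemo` and its `countByKey` helper below are illustrative names for this sketch, not part of the Spark API:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class CountByKeyDemo {

    // Plain-Java equivalent of countByKey on a pair RDD:
    // group the (key, value) pairs by key and count each group.
    public static Map<String, Long> countByKey(List<Map.Entry<String, String>> pairs) {
        return pairs.stream()
                .collect(Collectors.groupingBy(Map.Entry::getKey, Collectors.counting()));
    }

    public static void main(String[] args) {
        // Same sample data as the Spark example above
        List<Map.Entry<String, String>> studentList = Arrays.asList(
                new SimpleEntry<>("c1", "cai"), new SimpleEntry<>("c2", "niao"),
                new SimpleEntry<>("c1", "feng"), new SimpleEntry<>("c2", "jin"),
                new SimpleEntry<>("c2", "niao"));
        countByKey(studentList)
                .forEach((k, v) -> System.out.println("key:" + k + ",values:" + v));
    }
}
```

The difference in practice: Spark's countByKey is an action that collects all counts into driver memory, so it suits a small number of distinct keys; for large key sets, a transformation such as `mapValues(v -> 1L).reduceByKey(Long::sum)` keeps the counts distributed.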
