Using UDFs with Shark on YARN

Shark UDFs are advertised as fully compatible with Hive; I gave them a try today and have not run into any problems so far.
Reference [1] solves this on Hadoop 1; here I solve the same problem on a Hadoop 2 (YARN) cluster.
Problem
This is the classic top-k-per-key problem; here is an example.
Source data

100 10
200 12
300 33
100 4
100 8
200 20
300 31
300 3
400 4
200 2

The desired top 2 per key is:

100     10
100     8
200     20
200     12
300     33
300     31
400     4
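
The query at the end of the post reads these rows from a table named rank with columns key and val. The original post does not show how that table was created, so the following is only a minimal sketch; the file name data.txt and the column types are my assumptions, not part of the original.

-- Hypothetical setup: a two-column table matching the query below.
-- key is kept as STRING (the UDF takes a String); val is INT so that
-- "sort by val desc" orders numerically.
CREATE TABLE rank (key STRING, val INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

-- Load the ten sample rows from a local tab-separated file (assumed name).
LOAD DATA LOCAL INPATH 'data.txt' INTO TABLE rank;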

Solution
1. UDF

2. distribute by and sort by

(The query at the end combines both.)

Writing the UDF

Source code

package com.example.hive.udf;

import org.apache.hadoop.hive.ql.exec.UDF;

public final class Rank extends UDF {
    // Running rank within the current run of identical keys.
    private int counter;
    // Key seen on the previous call; used to detect a key change.
    private String last_key;

    public int evaluate(final String key) {
        if (!key.equalsIgnoreCase(this.last_key)) {
            // New key: restart the rank at 1.
            this.counter = 1;
            this.last_key = key;
        }
        return this.counter++;
    }
}
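
To see what the UDF actually computes, here is a small, hypothetical driver that is not part of the original post; it feeds evaluate() the sample keys in the order that distribute by/sort by would deliver them, and it compiles against the same classpath the Makefile below uses.

package com.example.hive.udf;

// Hypothetical demo class, for illustration only.
public class RankDemo {
    public static void main(String[] args) {
        Rank rank = new Rank();
        // Keys arrive grouped, as distribute by/sort by would deliver them.
        String[] keys = {"100", "100", "100", "200", "200", "200", "300", "300", "300", "400"};
        for (String key : keys) {
            // Prints 1, 2, 3 for the 100s, then restarts at 1 for the 200s, and so on.
            System.out.println(key + "\t" + rank.evaluate(key));
        }
    }
}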
Write a Makefile so that a single make command compiles the source and produces the jar. I did not use the popular mvn here; a Makefile feels a bit more concise.

# Classpath for compiling the UDF: Hadoop 2.2.0 and Hive 0.11.0 jars.
UDF_CLASSPATH := $(addprefix ${HADOOP_HOME}/share/hadoop/mapreduce/,hadoop-mapreduce-client-core-2.2.0.jar \
                 hadoop-mapreduce-client-common-2.2.0.jar \
                 hadoop-mapreduce-client-jobclient-2.2.0.jar)
UDF_CLASSPATH += $(addprefix ${HADOOP_HOME}/share/hadoop/common/,hadoop-common-2.2.0.jar)
UDF_CLASSPATH += $(addprefix ${HIVE_HOME}/lib/,hive-exec-0.11.0.jar)
# Turn the space-separated jar list into a ':'-separated classpath.
UDF_CLASSPATH := $(shell echo ${UDF_CLASSPATH} | tr -s ' ' ':')

JAR := jar
JAVAC := javac
CLASSDIR := class
# Make sure the class output directory exists.
__mkdir := $(shell for i in ${CLASSDIR}; do [ -d $$i ] || mkdir -p $$i; done)

SRC := $(wildcard *.java)
TARGET := $(patsubst %.java,%.jar,${SRC})

.PHONY: all clean

all: ${TARGET}

# Static pattern rule: compile each .java and pack its classes into a jar.
${TARGET}:%.jar:%.java
	${JAVAC} -cp ${UDF_CLASSPATH} -d ${CLASSDIR} $<
	${JAR} -cf $@ -C ${CLASSDIR} .

clean:
	${RM} -r ${TARGET} ${CLASSDIR}

Run make to produce Rank.jar, start Shark, and execute the commands below in order (a commented copy of the query follows the transcript).

add jar Rank.jar;
create temporary function myrank as "com.example.hive.udf.Rank";
select key,val from (select key,val,myrank(key) r from (select key,val from rank distribute by key sort by key, val desc) t1) t2 where t2.r < 3;
OK
100     10
100     8
200     20
200     12
300     33
300     31
400     4
Time taken: 1.093 seconds
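
The one-line select above is dense, so here is the same statement laid out with comments; nothing in it is changed.

-- The same query, reformatted for readability.
select key, val
from (
    select key, val, myrank(key) r
    from (
        select key, val
        from rank
        distribute by key        -- send all rows of a key to the same reducer
        sort by key, val desc    -- group by key there, largest val first
    ) t1
) t2
where t2.r < 3;                  -- myrank numbers each key's rows 1, 2, 3, ...; keep the top 2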

As you can see, Shark's in-memory execution is quite fast.


References
[1] http://www.cnblogs.com/Torstan/p/3423859.html
