Using spring-data-hadoop: Maven Configuration in Detail


For the past few days I have been struggling with the spring-data-hadoop JAR dependencies and could never get the project to run; it finally works now. The pom.xml is as follows:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>winksi.com.cn</groupId>
  <artifactId>hadoopTest</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <!-- Extra repositories: the CDH artifacts are not in the official Maven repository,
       so Cloudera's repository URL has to be configured in order to download them -->
  <repositories>
    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
    <repository>
      <id>spring-repo</id>
      <url>http://repo.springsource.org/libs-milestone</url>
    </repository>
  </repositories>

  <properties>
    <java.version>1.6</java.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
    <!-- Spring -->
    <spring-framework.version>3.2.3.RELEASE</spring-framework.version>
    <!-- Logging -->
    <logback.version>1.0.13</logback.version>
    <slf4j.version>1.7.5</slf4j.version>
    <!-- Test -->
    <junit.version>4.11</junit.version>
    <!-- Hadoop -->
    <CDH.version>2.0.0-cdh4.5.0</CDH.version>
    <hadoop.core.version>2.0.0-mr1-cdh4.5.0</hadoop.core.version>
    <groovy.version>1.8.5</groovy.version>
  </properties>

  <dependencies>
    <!-- Spring and Transactions -->
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-context</artifactId>
      <version>${spring-framework.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-tx</artifactId>
      <version>${spring-framework.version}</version>
    </dependency>
    <!-- Spring JDBC -->
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-jdbc</artifactId>
      <version>${spring-framework.version}</version>
    </dependency>
    <!-- Logging with SLF4J & Logback -->
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>${slf4j.version}</version>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>ch.qos.logback</groupId>
      <artifactId>logback-classic</artifactId>
      <version>${logback.version}</version>
      <scope>runtime</scope>
    </dependency>
    <!-- Test artifacts -->
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-test</artifactId>
      <version>${spring-framework.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>${junit.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>com.alibaba</groupId>
      <artifactId>fastjson</artifactId>
      <version>1.1.37</version>
    </dependency>
    <!-- Hadoop / Hive / HBase -->
    <dependency>
      <groupId>org.springframework.data</groupId>
      <artifactId>spring-data-hadoop</artifactId>
      <version>1.0.2.RELEASE-cdh4</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>${CDH.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
      <version>${hadoop.core.version}</version>
    </dependency>
    <dependency>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy</artifactId>
      <version>${groovy.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${CDH.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-service</artifactId>
      <version>0.10.0-cdh4.5.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase</artifactId>
      <version>0.94.6-cdh4.5.0</version>
    </dependency>
  </dependencies>
</project>

The key additions are the Hadoop-related JARs at the end of the dependencies section (spring-data-hadoop, hadoop-hdfs, hadoop-core, groovy, hadoop-client, hive-service and hbase).

The Spring configuration file is:

<?xml version="1.0" encoding="UTF-8"?>
<beans:beans xmlns="http://www.springframework.org/schema/hadoop"
             xmlns:beans="http://www.springframework.org/schema/beans"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xmlns:context="http://www.springframework.org/schema/context"
             xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                 http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
                 http://www.springframework.org/schema/hadoop http://www.springframework.org/schema/hadoop/spring-hadoop.xsd"
             default-autowire="byName">

    <context:property-placeholder location="classpath:config.properties" />

    <configuration>
        fs.default.name=${hd.fs}
        mapred.job.tracker=${hd.mr}
    </configuration>

    <!-- This will throw an NPE at the end of running the app, which can be ignored.
         To avoid it, run against a stand-alone server started from the command line:
         hive -hiveconf fs.default.name=hdfs://localhost:9000 -hiveconf mapred.job.tracker=localhost:9001 -->
    <hive-server port="${hive.port}" auto-startup="true"/>

    <hive-client-factory host="${hive.host}" port="${hive.port}"/>

    <hive-template id="hiveTemplate"/>

</beans:beans>

The parameter configuration file (config.properties) is:

hd.fs=hdfs://172.16.1.50:8020
hd.mr=172.16.1.50:8021
hd.zk.port=2181
hd.zk.host=172.16.1.50
hive.host=172.16.1.50
hive.port=10000


Calling it from the program:

        // Load the Spring context defined above and fetch the HiveTemplate bean
        ApplicationContext ac = new ClassPathXmlApplicationContext("applicationContext.xml");
        HiveTemplate hiveTemplate = (HiveTemplate) ac.getBean("hiveTemplate");
        // Run a HiveQL statement; each result row comes back as a String
        List<String> list = hiveTemplate.query("show tables;");
        System.out.println(JSON.toJSONString(list));

The call succeeds.
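For reference, here is a minimal, self-contained sketch of the same call wrapped in a main method, plus one extra query. It assumes the applicationContext.xml and config.properties above are on the classpath; the HiveQueryDemo class name and the demo_log table are only illustrations and are not part of the original setup.

import java.util.List;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.data.hadoop.hive.HiveTemplate;

public class HiveQueryDemo {

    public static void main(String[] args) {
        // Bootstrap the same Spring XML context shown above
        ApplicationContext ac = new ClassPathXmlApplicationContext("applicationContext.xml");
        HiveTemplate hiveTemplate = ac.getBean("hiveTemplate", HiveTemplate.class);

        // "show databases" works against any Hive installation
        List<String> databases = hiveTemplate.query("show databases");
        System.out.println(databases);

        // demo_log is a hypothetical table used only for illustration;
        // it would have to exist in your Hive metastore
        List<String> rows = hiveTemplate.query("select * from demo_log limit 10");
        for (String row : rows) {
            System.out.println(row);
        }
    }
}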

