Hive 1.2.1 Local and Remote Mode: Installation, Configuration, and Common Errors

I. Runtime Environment

  • CentOS 6.5, 64-bit

  • JDK installed and configured correctly

  • Hadoop installed and configured correctly

  • MySQL installed correctly

II. Required Software

  • apache-hive-1.2.1-bin.tar.gz

  • mysql-connector-java-5.1.22-bin.jar

III. Installation and Configuration

  1. Extract apache-hive-1.2.1-bin.tar.gz and place the JDBC driver
    tar -zxvf apache-hive-1.2.1-bin.tar.gz

    Then move mysql-connector-java-5.1.22-bin.jar into the apache-hive-1.2.1-bin/lib/ directory (see the example below).
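
    A minimal sketch of this step, assuming both downloaded files sit in /hadoop (adjust the paths to your own layout):

    cd /hadoop
    tar -zxvf apache-hive-1.2.1-bin.tar.gz
    # put the MySQL JDBC driver on Hive's classpath
    mv mysql-connector-java-5.1.22-bin.jar apache-hive-1.2.1-bin/lib/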

  2. Configure environment variables

    vim /etc/profile

    export HIVE_HOME=/hadoop/apache-hive-1.2.1-bin
    export PATH=$PATH:$HIVE_HOME/bin
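
    The new variables only apply to shells started afterwards; to pick them up in the current shell, reload the profile:

    source /etc/profile
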
  3. Edit the Hive configuration files (apache-hive-1.2.1-bin/conf/)

    • hive-config.sh

      vim /hadoop/apache-hive-1.2.1-bin/bin/hive-config.sh

      export JAVA_HOME=/usr/java/jdk_1.7.0_71
      export HIVE_HOME=/hadoop/apache-hive-1.2.1-bin
      export HADOOP_HOME=/hadoop/hadoop-2.6.2

    • hive-env.sh

      cp hive-env.sh.template hive-env.sh
      vim hive-env.sh

      HADOOP_HOME=/hadoop/hadoop-2.6.2
      export HIVE_CONF_DIR=/hadoop/apache-hive-1.2.1-bin/conf
    • hive-site.xml

      cp hive-default.xml.template hive-site.xml
      vim hive-site.xml

      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://<remote host IP, or localhost>:3306/hive</value>
        <!-- this value is the only difference between local and remote mode -->
        <description>JDBC connect string for a JDBC metastore</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>database username</value>
        <description>Username to use against metastore database</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>database password</value>
        <description>password to use against metastore database</description>
      </property>

      <!-- If the following properties are not set, errors may occur at runtime. -->
      <property>
        <name>hive.exec.local.scratchdir</name>
        <value>a directory of your choice</value>
        <description>Local scratch space for Hive jobs</description>
      </property>
      <property>
        <name>hive.downloaded.resources.dir</name>
        <value>a directory of your choice</value>
        <description>Temporary local directory for added resources in the remote file system.</description>
      </property>
      <property>
        <name>hive.querylog.location</name>
        <value>a directory of your choice</value>
        <description>Location of Hive run time structured log file</description>
      </property>
      <property>
        <name>hive.server2.logging.operation.log.location</name>
        <value>a directory of your choice/operation_logs</value>
        <description>Top level directory where operation logs are stored if logging functionality is enabled</description>
      </property>
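
    To make the local/remote distinction concrete, the ConnectionURL typically takes one of the two forms sketched below. Here 192.168.1.100 is only a placeholder address, and createDatabaseIfNotExist=true is an optional MySQL JDBC parameter that creates the hive database on first connect:

      <!-- local mode: MySQL runs on the same host as Hive -->
      <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>

      <!-- remote mode: MySQL runs on another host, e.g. 192.168.1.100 -->
      <value>jdbc:mysql://192.168.1.100:3306/hive?createDatabaseIfNotExist=true</value>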

  4. Startup

  • Start Hadoop

    start-dfs.sh
    start-yarn.sh

  • Start Hive

    hive
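
  Before running the first query, it also helps to make sure the HDFS directories Hive uses by default exist and are group-writable. This follows the standard Hive setup guidance and assumes the default warehouse location:

    hdfs dfs -mkdir -p /tmp /user/hive/warehouse
    hdfs dfs -chmod g+w /tmp /user/hive/warehouse
    hive -e "show databases;"    # quick smoke test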

IV. Common Errors

Logging initialized using configuration in jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-common-1.1.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
        at jline.TerminalFactory.create(TerminalFactory.java:101)
        at jline.TerminalFactory.get(TerminalFactory.java:158)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
        at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

The cause is an older jline JAR under the Hadoop directory:

/hadoop-2.6.2/share/hadoop/yarn/lib: jline-0.9.94.jar

The fix:
Copy the newer jline JAR that ships with Hive into that Hadoop directory (/hadoop-2.6.2/share/hadoop/yarn/lib):

cd /hadoop-2.6.2/share/hadoop/yarn/lib
cp /hadoop/apache-hive-1.2.1-bin/lib/jline-2.12.jar ./
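
Alternatively, instead of copying the JAR, you can tell Hadoop to prefer the user (i.e. Hive) classpath so that Hive's newer jline wins; exporting the following variable before starting Hive is a commonly used workaround for this conflict:

export HADOOP_USER_CLASSPATH_FIRST=true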

V. Appendix: Configuring MySQL

  • Install MySQL

    [root@server1 ~]# yum install mysql mysql-server

    Answer yes to the prompts.

  • Register the MySQL service

    [root@server1 ~]# /sbin/chkconfig --add mysqld

  • Start MySQL

    [root@server1 ~]# service mysqld start

    Starting mysqld: [ OK ]

  • Log in to MySQL locally as root

    [root@server1 ~]# mysql -u root

    The welcome banner appears and you enter the MySQL monitor.

  • Create the hive database

    mysql > CREATE DATABASE hive;

  • Create the hive user

    mysql > CREATE USER 'hive' IDENTIFIED BY 'hive';

  • Grant the hive user the required access and read/write privileges

    mysql > GRANT ALL ON hive.* TO hive@localhost;
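
    If the metastore database runs on a different host than Hive (the remote mode described above), the hive user must also be allowed to connect from that host. A common way to do this on MySQL 5.x, assuming the same 'hive' password as above, is:

    mysql > GRANT ALL ON hive.* TO 'hive'@'%' IDENTIFIED BY 'hive';
    mysql > FLUSH PRIVILEGES;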
