Hive Installation and Configuration


Hive tutorial reference: http://www.yiibai.com/hive/

Reference documents:

http://blog.csdn.net/it_taojingzhan/article/details/51789739

http://blog.csdn.net/blackenn/article/details/52234420

http://blog.csdn.net/wtq1993/article/details/53088968

 

Hive downloads:

http://mirror.bit.edu.cn/apache/hive/

http://apache.communilink.net/hive/hive-2.0.0/apache-hive-2.0.0-bin.tar.gz

http://ftp.nchu.edu.tw/Unix/Database/MySQL/Downloads/Connector-J/mysql-connector-java-5.1.36.tar.gz


 

1.1. Installation and Configuration

1.1.1. Installation Environment

CentOS6.5

mysql-server-5.1.71.tar.gz

apache-hive-2.1.1-bin.tar.gz

hadoop-2.6.2.tar.gz

jdk-8u65-linux-x64.tar.gz

 

1.1.2. MySQL Installation

1. Install MySQL (mysql-server-5.1.71) via yum:

[root@localhost65 local]# yum -y install mysql*

[root@localhost65 local]# /usr/bin/mysqladmin -u root password 'root'    # set the MySQL root password

 

2. Log in to MySQL. The default account is root, with an empty password:

[root@localhost65 local]# mysql -uroot -p

Enter password:

Welcome to the MySQL monitor.  Commands end with ; or \g.

Your MySQL connection id is 91

Server version: 5.1.71 Source distribution

 

Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

 

mysql>

 

3. Create the database

mysql> create database hive;

 

4. Create a user and grant privileges

Create the hive user and grant privileges, allowing it to log in from localhost with password 'hive'. It is best to use the server's IP address here rather than localhost.

mysql> CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive';

mysql> grant all privileges on *.* to 'hive'@'localhost' identified by 'hive';

5. Delete a user

 mysql> DELETE FROM user WHERE User='test' AND Host='localhost';

 mysql> flush privileges;

 mysql> drop database testDB;    -- drop that user's database

Drop an account and its privileges:
        > drop user username@'%';

        > drop user username@localhost;

6. Change a specific user's password

  mysql> update mysql.user set password=password('newpassword') where User='test' and Host='localhost';

  mysql>flush privileges;

 

 

1.1.3. Download and Extract Hive

1. For Hadoop installation, refer to the hadoop-2.6 installation above.

Download and extract hive-2.1.1 to /usr/local:

[root@localhost65 download]# ls
apache-hive-2.1.1-bin.tar.gz   hadoop-2.6.2.tar.gz         hbase-1.3.1-bin.tar.gz
apache-kylin-1.6.0-bin.tar.gz  hadoop-native-64-2.6.0.tar  jdk-8u65-linux-x64.tar.gz
[root@localhost65 download]# tar -zxf apache-hive-2.1.1-bin.tar.gz -C /usr/local/
[root@localhost65 download]# cd /usr/local/
[root@localhost65 local]# ls
apache-hive-2.1.1-bin   bin  games         hbase        include      lib    libexec  share  VMwareTools-9.6.1-1378637.tar.gz
apache-kylin-1.6.0-bin  etc  hadoop-2.6.2  hbase-1.3.1  jdk1.8.0_65  lib64  sbin     src    vmware-tools-distrib
[root@localhost65 local]#

 

 

1.1.4. Set Hive Environment Variables

Edit the /etc/profile file and add the following:

[root@localhost65 local]# cd apache-hive-2.1.1-bin/
[root@localhost65 apache-hive-2.1.1-bin]# pwd
/usr/local/apache-hive-2.1.1-bin
[root@localhost65 apache-hive-2.1.1-bin]# vim /etc/profile
......
export JAVA_HOME=/usr/local/jdk1.8.0_65
export CLASSPATH=.:$JAVA_HOME/lib
export JRE_HOME=/usr/local/jdk1.8.0_65/jre

export HADOOP_HOME=/usr/local/hadoop-2.6.2
export HBASE_HOME=/usr/local/hbase-1.3.1

export HIVE_HOME=/usr/local/apache-hive-2.1.1-bin
export PATH=.:$HIVE_HOME/bin:$PATH

[root@localhost65 apache-hive-2.1.1-bin]#

Apply the environment variables:

[root@localhost65 apache-hive-2.1.1-bin]# source /etc/profile

[root@localhost65 apache-hive-2.1.1-bin]#
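The PATH layout produced by the exports above can be checked in any shell (a minimal sketch, using the install paths assumed throughout this guide):

```shell
# Reproduce the /etc/profile additions and inspect the resulting PATH.
export HIVE_HOME=/usr/local/apache-hive-2.1.1-bin
export PATH=.:$HIVE_HOME/bin:$PATH
# The first two PATH entries should now be "." and Hive's bin directory:
echo "$PATH" | cut -d: -f1-2
```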

 

 

1.1.5. Copy the Configuration Files

Before running Hive, create the configuration files from their templates with the following commands:

[root@localhost65 ~]#
[root@localhost65 ~]# cd /usr/local/apache-hive-2.1.1-bin/conf/
[root@localhost65 conf]# cp hive-env.sh.template hive-env.sh
[root@localhost65 conf]# cp hive-default.xml.template hive-site.xml
[root@localhost65 conf]# cp hive-log4j2.properties.template hive-log4j2.properties
[root@localhost65 conf]# cp hive-exec-log4j2.properties.template hive-exec-log4j2.properties
[root@localhost65 conf]# ls
beeline-log4j2.properties.template    hive-exec-log4j2.properties.template  llap-cli-log4j2.properties.template
hive-default.xml.template             hive-log4j2.properties                llap-daemon-log4j2.properties.template
hive-env.sh                           hive-log4j2.properties.template       parquet-logging.properties
hive-env.sh.template                  hive-site.xml
hive-exec-log4j2.properties           ivysettings.xml
[root@localhost65 conf]#

 

 

1.1.6. Edit hive-env.sh

Because Hive runs on Hadoop, the Hadoop installation path must be set in hive-env.sh:

[root@localhost65 conf]#
[root@localhost65 conf]# vim hive-env.sh
#   else
#     export HADOOP_OPTS="$HADOOP_OPTS -XX:NewRatio=12 -Xms10m -XX:MaxHeapFreeRatio=40 -XX:MinHeapFreeRatio=15 -XX:-UseGCOverheadLimit"
#   fi
# fi

# The heap size of the jvm started by hive shell script can be controlled via:
#
# export HADOOP_HEAPSIZE=1024
#
# Larger heap size may be required when running queries over large number of files or partitions.
# By default hive shell scripts use a heap size of 256 (MB). Larger heap size would also be
# appropriate for hive server (hwi etc).

# Hadoop installation path
# Set HADOOP_HOME to point to a specific hadoop install directory
# HADOOP_HOME=${bin}/../../hadoop
HADOOP_HOME=/usr/local/hadoop-2.6.2

# Hive installation path
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/usr/local/apache-hive-2.1.1-bin/conf

# Folder containing extra libraries required for hive compilation/execution can be controlled by:
# export HIVE_AUX_JARS_PATH=
"hive-env.sh" 55L, 2449C
[root@localhost65 conf]#

 

 

1.1.7. Edit hive-site.xml

Edit hive-site.xml and set the following properties:

[root@localhost65 conf]# vim hive-site.xml
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hive.exec.scratchdir</name>
    <value>/tmp/hive-${user.name}</value>
    <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
  </property>
  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/tmp/${user.name}</value>
    <description>Local scratch space for Hive jobs</description>
  </property>
  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/tmp/hive/resources</value>
    <description>Temporary local directory for added resources in the remote file system.</description>
  </property>
  <property>
    <name>hive.querylog.location</name>
    <value>/tmp/${user.name}</value>
    <description>Location of Hive run time structured log file</description>
  </property>
  <property>
    <name>hive.server2.logging.operation.log.location</name>
    <value>/tmp/${user.name}/operation_logs</value>
    <description>Top level directory where operation logs are stored if logging functionality is enabled</description>
  </property>
</configuration>

 

 

1.1.8. Configure the Hive Metastore Database Connection

By default, Hive keeps its metadata in an embedded Derby database, but production environments generally use MySQL to store the Hive metastore.

1. Place mysql-connector-java-5.1.40-bin.jar into $HIVE_HOME/lib.

2. Configure the MySQL connection in hive-site.xml:

[root@localhost65 conf]# vim hive-site.xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8&amp;useSSL=false</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>
[root@localhost65 conf]#

 

 

1.1.9. Create HDFS Directories for Hive

Before creating tables in Hive, use the following HDFS commands to create the /tmp and /user/hive/warehouse directories (the latter is the default value of the hive.metastore.warehouse.dir property in hive-site.xml) and grant group write permission on them.

start-dfs.sh
hdfs dfs -mkdir /tmp
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -chmod g+w /tmp
hdfs dfs -chmod g+w /user/hive/warehouse

 

[root@localhost65 ~]#
[root@localhost65 ~]# /usr/local/hadoop-2.6.2/sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost65]
localhost65: starting namenode, logging to /usr/local/hadoop-2.6.2/logs/hadoop-root-namenode-localhost65.out
localhost: starting datanode, logging to /usr/local/hadoop-2.6.2/logs/hadoop-root-datanode-localhost65.out
Starting secondary namenodes [localhost65]
localhost65: starting secondarynamenode, logging to /usr/local/hadoop-2.6.2/logs/hadoop-root-secondarynamenode-localhost65.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop-2.6.2/logs/yarn-root-resourcemanager-localhost65.out
localhost: starting nodemanager, logging to /usr/local/hadoop-2.6.2/logs/yarn-root-nodemanager-localhost65.out
[root@localhost65 ~]#
[root@localhost65 ~]# hdfs dfs -mkdir /tmp
[root@localhost65 ~]# hdfs dfs -mkdir -p /user/hive/warehouse
[root@localhost65 ~]# hdfs dfs -chmod g+w /tmp
[root@localhost65 ~]# hdfs dfs -chmod g+w /user/hive/warehouse
[root@localhost65 ~]#

 

1.1.10. Run Hive

HDFS must already be running before the hive command is run on the command line; it can be started with start-dfs.sh.

Starting with Hive 2.1, the schematool command must be run first to initialize the metastore schema.

$ schematool -dbType mysql -initSchema

Output:

[root@localhost65 apache-hive-2.1.1-bin]# ls
bin  conf  examples  hcatalog  jdbc  lib  LICENSE  NOTICE  README.txt  RELEASE_NOTES.txt  scripts
[root@localhost65 apache-hive-2.1.1-bin]# pwd
/usr/local/apache-hive-2.1.1-bin
[root@localhost65 apache-hive-2.1.1-bin]#
[root@localhost65 apache-hive-2.1.1-bin]# bin/schematool -dbType mysql -initSchema
Metastore connection URL:        jdbc:mysql://192.168.3.65:3306/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :    com.mysql.jdbc.Driver
Metastore connection User:       hive
Starting metastore schema initialization to 2.1.0
Initialization script hive-schema-2.1.0.mysql.sql
Initialization script completed
schemaTool completed
# The output above indicates that the Hive metastore was initialized successfully.
[root@localhost65 apache-hive-2.1.1-bin]#

 

 

To use the Hive CLI (Hive command line interface), enter the following command in a terminal:

$ hive

Startup output:

[root@localhost65 bin]#

[root@localhost65 bin]# hive

Logging initialized using configuration in file:/usr/local/apache-hive-2.1.1-bin/conf/hive-log4j2.properties Async: true

Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.

hive> show tables;

OK

kylin_cal_dt

kylin_category_groupings

kylin_intermediate_kylin_sales_cube_desc_20120101000000_20170831000000

kylin_sales

student

Time taken: 1.131 seconds, Fetched: 5 row(s)

hive>

The above output indicates that Hive was installed successfully.
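As a further smoke test (the table name below is hypothetical, not one of the tables listed above), a table can be created and dropped in the Hive CLI:

```sql
-- Hypothetical smoke test in the Hive CLI: create, list, then drop a table.
CREATE TABLE smoke_test (id INT, name STRING);
SHOW TABLES;
DROP TABLE smoke_test;
```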

 

 

1.2. Hive Errors

1.2.1. Hive 2.3 MySQL Initialization Has No Effect

[root@localhost65 bin]# schematool -initSchema -dbType mysql --verbose
Metastore connection URL:        jdbc:derby:;databaseName=metastore_db;create=true
Metastore Connection Driver :    org.apache.derby.jdbc.EmbeddedDriver
Metastore connection User:       APP
Starting metastore schema initialization to 2.3.0
Initialization script hive-schema-2.3.0.mysql.sql
Connecting to jdbc:derby:;databaseName=metastore_db;create=true
Connected to: Apache Derby (version 10.10.2.0 - (1582446))
Driver: Apache Derby Embedded JDBC Driver (version 10.10.2.0 - (1582446))
Transaction isolation: TRANSACTION_READ_COMMITTED
0: jdbc:derby:> !autocommit on
Autocommit status: true
0: jdbc:derby:> /*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */
Error: Syntax error: Encountered "<EOF>" at line 1, column 64. (state=42X01,code=30000)

Closing: 0: jdbc:derby:;databaseName=metastore_db;create=true

org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
    at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:590)
    at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:563)
    at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1145)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: Schema script failed, errorcode 2
    at org.apache.hive.beeline.HiveSchemaTool.runBeeLine(HiveSchemaTool.java:980)
    at org.apache.hive.beeline.HiveSchemaTool.runBeeLine(HiveSchemaTool.java:959)
    at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:586)
    ... 8 more
*** schemaTool failed ***
[root@localhost65 bin]#

Hive 2.3's MySQL initialization having no effect (still using Derby, as above) was caused here by a problem with the installation package itself. Delete the extracted files and use a different version.

 

1.2.2.  org.apache.hadoop.hive.ql.exec.DDLTask

 

hive> show databases;  

FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.  

NestedThrowables:  

java.lang.reflect.InvocationTargetException  

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask  

Solution

Remove the build directory under HADOOP_HOME, then start Hive again.

 

 

1.2.3. Duplicate key name 'PCS_STATS_IDX'

No rows affected (0.004 seconds)
0: jdbc:mysql://192.168.3.65:3306/hive> CREATE INDEX PCS_STATS_IDX ON PART_COL_STATS (DB_NAME,TABLE_NAME,COLUMN_NAME,PARTITION_NAME) USING BTREE
Error: Duplicate key name 'PCS_STATS_IDX' (state=42000,code=1061)

Closing: 0: jdbc:mysql://192.168.3.65:3306/hive?createDatabaseIfNotExist=true&characterEncoding=UTF-8&useSSL=false

org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
    at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:291)
    at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:264)
    at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:505)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: Schema script failed, errorcode 2
    at org.apache.hive.beeline.HiveSchemaTool.runBeeLine(HiveSchemaTool.java:390)
    at org.apache.hive.beeline.HiveSchemaTool.runBeeLine(HiveSchemaTool.java:347)
    at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:287)
    ... 8 more
*** schemaTool failed ***
[root@localhost65 apache-hive-2.1.1-bin]#

 

Solution 1

"Error: Duplicate key name 'PCS_STATS_IDX'"

This occurs when the schema was initialized once before, or some tables failed to import, leaving residual tables and data in the hive database in MySQL. Drop the hive database in MySQL and recreate it (or drop the leftover tables in it), then run the schema initialization again.
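A minimal sketch of this fix (database and user names as configured earlier in this guide) would be:

```sql
-- Drop the half-initialized metastore database and recreate it empty,
-- then re-run: schematool -dbType mysql -initSchema
DROP DATABASE hive;
CREATE DATABASE hive;
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'localhost' IDENTIFIED BY 'hive';
FLUSH PRIVILEGES;
```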

 

Solution 2

Copy /usr/local/hive/script/metastore/upgrade/mysql/hive-schema-1.2.1000.mysql.sql from the Hive node to the database server that Hive connects to,

then execute that script in the hive database.

 

 

1.2.4. Error: Syntax error: Encountered

 

After installing MySQL, running the initialization command schematool -dbType mysql -initSchema produced the following exception: Error: Syntax error: Encountered "" at line 1, column 64. (state=42X01,code=30000)

Solution:

In the screenshot above (not reproduced here), the database driver shown is still Derby, so the database connection driver in hive-site.xml must be changed. Open hive-site.xml and locate the driver property.

Correct it accordingly. Note that mysql-connector-java-5.x.x-bin.jar must be present under Hive's lib directory. Also, the ConnectionDriverName property may appear more than once in this configuration file; delete the duplicates and keep only one.
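Since the screenshot is missing, the corrected hive-site.xml fragment (the same value configured in section 1.1.8) looks like:

```xml
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
```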

 

 

 

1.2.5.  org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

[root@localhost65 apache-hive-1.2.2-bin]# hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.6.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

 

Logging initialized using configuration in file:/usr/local/apache-hive-2.1.1-bin/conf/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
    ... 9 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
    ... 14 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
    ... 23 more
Caused by: MetaException(message:Version information not found in metastore. )
    at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:7753)
    at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:7731)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
    at com.sun.proxy.$Proxy21.verifySchema(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:565)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:626)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
    ... 28 more
[root@localhost65 apache-hive-1.2.2-bin]#
[root@localhost65 apache-hive-1.2.2-bin]#
[root@localhost65 apache-hive-1.2.2-bin]# ls
bin  conf  examples  hcatalog  lib  LICENSE  NOTICE  README.txt  RELEASE_NOTES.txt  scripts
[root@localhost65 apache-hive-1.2.2-bin]#

Cause:

    The metastore database had not been initialized. Initialize it by running schematool -dbType mysql -initSchema.

 

 

1.2.6.  SLF4J: Found binding in log4j-slf4j-impl-2.4.1.jar

[root@localhost65 apache-hive-2.1.1-bin]# bin/hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.6.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

 

Logging initialized using configuration in file:/usr/local/apache-hive-2.1.1-bin/conf/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
    at
......
The last packet sent successfully to the server was 0 milliseconds ago. The

Cause:

The logging jar bundled with Hive conflicts with the logging jar in Hadoop.

Solution:

Remove the logging jar from Hive (do not delete it; renaming it is enough), as follows:

[root@localhost65 apache-hive-2.1.1-bin]# mv /usr/local/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar /usr/local/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar.bak

[root@localhost65 apache-hive-2.1.1-bin]#

 

1.2.7. javax.jdo.JDOFatalDataStoreException: Unable to open a test connection

[root@localhost65 bin]# hive

Logging initialized using configuration in file:/usr/local/apache-hive-2.1.1-bin/conf/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
    ... 9 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
    ... 14 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
    ... 23 more
Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://192.168.3.65:3306/hive?createDatabaseIfNotExist=true&characterEncoding=UTF-8&useSSL=false, username = hive. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

......
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
    at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1074)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:343)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2137)
    ... 83 more
Caused by: java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at java.net.Socket.connect(Socket.java:538)
    at java.net.Socket.<init>(Socket.java:434)
    at java.net.Socket.<init>(Socket.java:244)
    at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:253)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:292)
    ... 84 more
------
......
[root@localhost65 bin]#

 

Cause:

    MySQL was not running when Hive tried to open its database connection.

Solution:

    Start the MySQL service:

[root@localhost65 bin]#

[root@localhost65 bin]# service mysqld restart

Stopping mysqld:                                           [  OK  ]

Starting mysqld:                                           [  OK  ]

[root@localhost65 bin]#

 

