java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected

Environment:
Linux: 
    Red Hat Enterprise Linux Server release 6.0 (Santiago)
Java: 
    java version "1.7.0_60"
    Java(TM) SE Runtime Environment (build 1.7.0_60-b19)
    Java HotSpot(TM) 64-Bit Server VM (build 24.60-b09, mixed mode)
Hadoop: 
    hadoop-2.6.0
Hive:
    apache-hive-1.2.0-bin


Prerequisite:
    Start the metastore service on gpmaster:
[hadoop@gpmaster ~]$ nohup hive --service metastore > metastore.log 2>&1 &
[1] 3937
[hadoop@gpmaster ~]$ netstat -nlt | grep 9083
tcp        0      0 0.0.0.0:9083                0.0.0.0:*                   LISTEN
The netstat output confirms that the metastore service is up and listening on port 9083.
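Since the CLI on gpseg will connect to this port over the network, it can also help to confirm from the client side that 9083 is reachable before launching hive. This check is not part of the original transcript; a quick sketch, assuming nc is installed on gpseg:

    nc -z 192.168.1.128 9083 && echo "metastore port reachable from gpseg"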


Test:
    The Hive client on gpseg (192.168.1.129) accesses the Hive metadata on the gpmaster (192.168.1.128) node through the metastore service.
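For this test to use the remote metastore, hive.metastore.uris on the client must point at gpmaster; otherwise the CLI falls back to an embedded metastore. The client's hive-site.xml is not shown in the original post, but a minimal sketch of the relevant property, assuming the default thrift port 9083 started above, would look like:

    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://192.168.1.128:9083</value>
    </property>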


Result:
[hadoop@gpseg apache-hive-1.2.0-bin]$ bin/hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]


Logging initialized using configuration in jar:file:/home/hadoop/apache-hive-1.2.0-bin/lib/hive-common-1.2.0.jar!/hive-log4j.properties
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
    at jline.TerminalFactory.create(TerminalFactory.java:101)
    at jline.TerminalFactory.get(TerminalFactory.java:158)
    at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
    at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
    at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
    at org.apache.hadoop.hive.cli.CliDriver.setupConsoleReader(CliDriver.java:787)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:721)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)


Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
    at jline.console.ConsoleReader.<init>(ConsoleReader.java:230)
    at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
    at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
    at org.apache.hadoop.hive.cli.CliDriver.setupConsoleReader(CliDriver.java:787)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:721)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Solution:

    First, check which jline versions ship with Hadoop and Hive:
    [hadoop@gpseg ~]$ find ./ -name jline*
    ./hadoop-2.6.0/share/hadoop/yarn/lib/jline-0.9.94.jar
    ./hadoop-2.6.0/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/jline-0.9.94.jar
    ./hadoop-2.6.0/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/jline-0.9.94.jar
    ./apache-hive-1.2.0-bin/lib/jline-2.12.jar
    Hive bundles jline-2.12 while Hadoop still ships jline-0.9.94. In jline 0.9.x, jline.Terminal is a class, but in jline 2.x it is an interface; the Hive CLI is compiled against the newer interface yet loads the old class from Hadoop's classpath first, which is exactly what the IncompatibleClassChangeError reports. Copy Hive's jline into Hadoop's yarn/lib directory so the newer version is also picked up:
    [hadoop@gpseg ~]$ cp apache-hive-1.2.0-bin/lib/jline-2.12.jar hadoop-2.6.0/share/hadoop/yarn/lib/
[hadoop@gpseg ~]$ cd apache-hive-1.2.0-bin/
[hadoop@gpseg apache-hive-1.2.0-bin]$ bin/hive
Logging initialized using configuration in jar:file:/home/hadoop/apache-hive-1.2.0-bin/lib/hive-common-1.2.0.jar!/hive-log4j.properties
hive (default)> show databases;
OK
database_name
default
hbase
hive
spark
Time taken: 5.399 seconds, Fetched: 4 row(s)
hive (default)> 
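For completeness, copying jline-2.12.jar is not the only way around this clash. Two alternatives commonly suggested for Hive 1.2 on older Hadoop distributions are removing the stale jline jar from Hadoop's yarn/lib, or telling Hadoop to put the user (Hive) classpath first. An untested sketch against the same directory layout as above:

    # move the old jline aside so only jline-2.12 remains on the classpath
    mv ~/hadoop-2.6.0/share/hadoop/yarn/lib/jline-0.9.94.jar ~/jline-0.9.94.jar.bak

    # or: let the user classpath take precedence over Hadoop's bundled jars
    export HADOOP_USER_CLASSPATH_FIRST=true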