Hive metadata on MySQL - 2
Source: Internet · Editor: 程序博客网 · Published: 2024/04/30 12:24
[root@name01 hdfs]# cd /data/hadoop/tmp/
[root@name01 tmp]# rm -rf *
[root@name01 tmp]# cd /data/hadoop/name/
[root@name01 name]# rm -rf *
[root@name01 name]# cd /data/hadoop/data/
[root@name01 data]# rm -rf *
[root@name01 data]# cd /data/hadoop/hdfs/
[root@name01 hdfs]# rm -rf *
[root@name01 hdfs]# jps
2489 Jps
[root@name01 hdfs]# hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
16/06/10 16:56:46 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = localhost/127.0.0.1
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.3.0-cdh5.1.0
STARTUP_MSG: classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.5-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/s
hare/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.5-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/loc
al/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.3.0-cdh5.1.0-tests.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop
/share/hadoop/hdfs/lib/commons-el-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.3.0-cdh5.1.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.8.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.5-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jer
sey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jline-0.9.94.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/share/
hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.5-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.3.0-cdh5.1.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.3.0-cdh5.1.0.jar:/usr/local/hadoop/contrib/capacity-scheduler/*.jar:/usr/local/hadoop/contrib/capacity-scheduler/*.jarSTARTUP_MSG: build = 
git://github.sf.cloudera.com/CDH/cdh.git -r 8e266e052e423af592871e2dfe09d54c03f6a0e8; compiled by 'jenkins' on 2014-07-12T13:48Z
STARTUP_MSG: java = 1.8.0_20
************************************************************/
16/06/10 16:56:47 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
16/06/10 16:56:47 INFO namenode.NameNode: createNameNode [-format]
16/06/10 16:56:48 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-bc2cc28d-ec72-424d-9fda-eeff873adaef
16/06/10 16:56:49 INFO namenode.FSNamesystem: fsLock is fair:true
16/06/10 16:56:49 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
16/06/10 16:56:49 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
16/06/10 16:56:49 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.ms is set to 0 ms.
16/06/10 16:56:49 INFO blockmanagement.BlockManager: The block deletion will start around 2016 Jun 10 16:56:49
16/06/10 16:56:49 INFO util.GSet: Computing capacity for map BlocksMap
16/06/10 16:56:49 INFO util.GSet: VM type = 32-bit
16/06/10 16:56:49 INFO util.GSet: 2.0% max memory 966.7 MB = 19.3 MB
16/06/10 16:56:49 INFO util.GSet: capacity = 2^22 = 4194304 entries
16/06/10 16:56:50 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
16/06/10 16:56:50 INFO blockmanagement.BlockManager: defaultReplication = 1
16/06/10 16:56:50 INFO blockmanagement.BlockManager: maxReplication = 512
16/06/10 16:56:50 INFO blockmanagement.BlockManager: minReplication = 1
16/06/10 16:56:50 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
16/06/10 16:56:50 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
16/06/10 16:56:50 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
16/06/10 16:56:50 INFO blockmanagement.BlockManager: encryptDataTransfer = false
16/06/10 16:56:50 INFO blockmanagement.BlockManager: maxNumBlocksToLog = 1000
16/06/10 16:56:50 INFO namenode.FSNamesystem: fsOwner = root (auth:SIMPLE)
16/06/10 16:56:50 INFO namenode.FSNamesystem: supergroup = supergroup
16/06/10 16:56:50 INFO namenode.FSNamesystem: isPermissionEnabled = true
16/06/10 16:56:50 INFO namenode.FSNamesystem: HA Enabled: false
16/06/10 16:56:50 INFO namenode.FSNamesystem: Append Enabled: true
16/06/10 16:56:50 INFO util.GSet: Computing capacity for map INodeMap
16/06/10 16:56:50 INFO util.GSet: VM type = 32-bit
16/06/10 16:56:50 INFO util.GSet: 1.0% max memory 966.7 MB = 9.7 MB
16/06/10 16:56:50 INFO util.GSet: capacity = 2^21 = 2097152 entries
16/06/10 16:56:50 INFO namenode.NameNode: Caching file names occuring more than 10 times
16/06/10 16:56:50 INFO util.GSet: Computing capacity for map cachedBlocks
16/06/10 16:56:50 INFO util.GSet: VM type = 32-bit
16/06/10 16:56:50 INFO util.GSet: 0.25% max memory 966.7 MB = 2.4 MB
16/06/10 16:56:50 INFO util.GSet: capacity = 2^19 = 524288 entries
16/06/10 16:56:50 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
16/06/10 16:56:50 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
16/06/10 16:56:50 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
16/06/10 16:56:50 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
16/06/10 16:56:50 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
16/06/10 16:56:50 INFO util.GSet: Computing capacity for map NameNodeRetryCache
16/06/10 16:56:50 INFO util.GSet: VM type = 32-bit
16/06/10 16:56:50 INFO util.GSet: 0.029999999329447746% max memory 966.7 MB = 297.0 KB
16/06/10 16:56:50 INFO util.GSet: capacity = 2^16 = 65536 entries
16/06/10 16:56:50 INFO namenode.AclConfigFlag: ACLs enabled? false
16/06/10 16:56:51 INFO namenode.FSImage: Allocated new BlockPoolId: BP-2118425470-127.0.0.1-1465603011046
16/06/10 16:56:51 INFO common.Storage: Storage directory /data/hadoop/name has been successfully formatted.
16/06/10 16:56:51 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
16/06/10 16:56:51 INFO util.ExitUtil: Exiting with status 0
16/06/10 16:56:51 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at localhost/127.0.0.1
************************************************************/
[root@name01 hdfs]# start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
16/06/10 16:57:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [name01]
name01: starting namenode, logging to /usr/local/hadoop/logs/hadoop-root-namenode-name01.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-root-datanode-name01.out
Starting secondary namenodes [name01]
name01: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-root-secondarynamenode-name01.out
16/06/10 16:57:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-root-resourcemanager-name01.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-root-nodemanager-name01.out
[root@name01 hdfs]# jps
2913 SecondaryNameNode
3412 Jps
3061 ResourceManager
2742 DataNode
2650 NameNode
3151 NodeManager
[root@name01 hdfs]# hive
16/06/10 16:58:09 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
16/06/10 16:58:09 INFO Configuration.deprecation: mapred.max.split.size is deprecated.
Instead, use mapreduce.input.fileinputformat.split.maxsize
16/06/10 16:58:09 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
16/06/10 16:58:09 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
16/06/10 16:58:09 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
16/06/10 16:58:09 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
16/06/10 16:58:09 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
16/06/10 16:58:09 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
Logging initialized using configuration in file:/usr/local/hive/conf/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive> show tables;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
hive> show databases;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
hive> show databases;
OK
default
Time taken: 1.955 seconds, Fetched: 1 row(s)
hive> show databases;
OK
default
Time taken: 0.075 seconds, Fetched: 1 row(s)
hive> create database hivedb;
OK
Time taken: 0.673 seconds
hive> show databases;
OK
default
hivedb
Time taken: 0.054 seconds, Fetched: 2 row(s)

Show the current database in the prompt and switch databases:

hive> set hive.cli.print.current.db=true;
hive (default)> use hivedb;
OK
Time taken: 0.077 seconds
hive (hivedb)>

To move the metastore off the embedded store and onto MySQL, the following properties were configured in hive-site.xml:

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepwd</value>
  <description>password to use against metastore database</description>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value></value>
  <description>Thrift uri for the remote metastore. Used by metastore client to connect to remote metastore.</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hivedb?createDatabaseIfNotExist=true</value>
  <!-- hivedb is the name of the metastore database to be created in MySQL -->
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <!-- the mysql-connector-java-5.1.10-bin.jar version does not need to match the MySQL server version -->
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.PersistenceManagerFactoryClass</name>
  <value>org.datanucleus.api.jdo.JDOPersistenceManagerFactory</value>
  <description>class implementing the jdo persistence</description>
</property>

After restarting the Hive CLI with this configuration:

hive (default)> create database mytest;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:javax.jdo.JDOException: Exception thrown when executing query
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:252)
    at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:458)
    at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:477)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
    at com.sun.proxy.$Proxy9.getDatabase(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_database(HiveMetaStore.java:632)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:599)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
    at com.sun.proxy.$Proxy10.create_database(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:472)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy11.createDatabase(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:228)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:3432)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:227)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1485)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1263)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1091)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:921)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:790)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:623)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.MySQLSyntaxErrorException: Table 'hivedb.DBS' doesn't exist
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:936)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2985)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
    at com.mysql.jdbc.Connection.execSQL(Connection.java:3283)
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1332)
    at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1467)
    at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:172)
    at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
    at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:637)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:243)
    at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:458)
    at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:477)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
    at com.sun.proxy.$Proxy9.getDatabase(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_database(HiveMetaStore.java:632)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:599)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
    at com.sun.proxy.$Proxy10.create_database(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:472)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy11.createDatabase(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:228)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:3432)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:227)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1485)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1263)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1091)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:921)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:790)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:623)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212))
hive (default)> show databases;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:Got exception: org.apache.hadoop.hive.metastore.api.MetaException javax.jdo.JDOException: Exception thrown when executing query
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:230)
    at org.apache.hadoop.hive.metastore.ObjectStore.getDatabases(ObjectStore.java:571)
    at org.apache.hadoop.hive.metastore.ObjectStore.getAllDatabases(ObjectStore.java:586)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
    at com.sun.proxy.$Proxy9.getAllDatabases(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_all_databases(HiveMetaStore.java:848)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
    at com.sun.proxy.$Proxy10.get_all_databases(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:745)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy11.getAllDatabases(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1075)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showDatabases(DDLTask.java:2151)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:328)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1485)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1263)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1091)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:921)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:790)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:623)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.MySQLSyntaxErrorException: Table 'hivedb.DBS' doesn't exist
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:936)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2985)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
    at com.mysql.jdbc.Connection.execSQL(Connection.java:3283)
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1332)
    at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1467)
    at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:172)
    at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
    at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:637)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
    at org.datanucleus.store.query.Query.execute(Query.java:1654)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
    at org.apache.hadoop.hive.metastore.ObjectStore.getDatabases(ObjectStore.java:571)
    at org.apache.hadoop.hive.metastore.ObjectStore.getAllDatabases(ObjectStore.java:586)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
    at com.sun.proxy.$Proxy9.getAllDatabases(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_all_databases(HiveMetaStore.java:848)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
    at com.sun.proxy.$Proxy10.get_all_databases(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:745)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy11.getAllDatabases(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1075)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showDatabases(DDLTask.java:2151)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:328)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1485)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1263)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1091)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:921)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:790)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:623)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212))
hive (default)> show databases;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException [... same MetaException / MySQLSyntaxErrorException (Table 'hivedb.DBS' doesn't exist) stack trace as above ...]
hive (default)> show database;
NoViableAltException(67@[609:1: ddlStatement : ( createDatabaseStatement | switchDatabaseStatement | dropDatabaseStatement | createTableStatement | dropTableStatement | truncateTableStatement | alterStatement | descStatement | showStatement | metastoreCheck | createViewStatement | dropViewStatement | createFunctionStatement | createMacroStatement | createIndexStatement | dropIndexStatement | dropFunctionStatement | dropMacroStatement | analyzeStatement | lockStatement | unlockStatement | createRoleStatement | dropRoleStatement | grantPrivileges | revokePrivileges | showGrants | showRoleGrants | showRoles | grantRole | revokeRole | setRole | showCurrentRole );])
        at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
        at org.antlr.runtime.DFA.predict(DFA.java:116)
        at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2030)
        at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1344)
        at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:983)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:190)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:434)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:352)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:995)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1038)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:921)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:790)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:623)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
FAILED: ParseException line 1:5 cannot recognize input near 'show' 'database' '<EOF>' in ddl statement
hive (default)> show databases;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException [... same MetaException / MySQLSyntaxErrorException (Table 'hivedb.DBS' doesn't exist) stack trace as above ...]
hive (default)> exit;
[root@name01 hdfs]# 
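The nested exception pinpoints the real problem: the MySQL database `hivedb` exists and is reachable, but the metastore schema tables (`DBS`, `TBLS`, ...) were never created in it, so every metastore call fails. In this Hive version (0.12 / CDH 5.1) DataNucleus can create those tables automatically on first use. A minimal hive-site.xml sketch of the relevant switches — treat the property values as assumptions to verify against your own Hive version, not a definitive recipe:

```xml
<!-- hive-site.xml: allow DataNucleus to create the metastore tables
     in MySQL on first use. Sketch only; check these defaults for
     your Hive release before relying on them. -->
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.autoCreateTables</name>
  <value>true</value>
</property>
<property>
  <!-- schema verification must be off for auto-creation to run -->
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
```

Alternatively, Hive 0.12+ ships a schema tool that creates the tables explicitly (something like `schematool -dbType mysql -initSchema`, run once against an empty metastore database); either way, once the tables exist in `hivedb`, `show databases` stops throwing the MetaException.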
hive
16/06/10 18:08:28 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
16/06/10 18:08:28 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
16/06/10 18:08:28 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
16/06/10 18:08:28 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
16/06/10 18:08:28 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
16/06/10 18:08:28 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
16/06/10 18:08:28 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
16/06/10 18:08:28 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
Logging initialized using configuration in file:/usr/local/hive/conf/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive> show databases;
OK
default
Time taken: 3.547 seconds, Fetched: 1 row(s)
hive> set hive.cli.print.current.db=true;
hive (default)> create table hivetest(id int,name string);
OK
Time taken: 3.668 seconds
hive (default)> select * from hivetest;
OK
Time taken: 0.441 seconds
hive (default)> SELECT *FROM TBLS\G;
FAILED: ParseException line 1:17 character '\' not supported here
hive (default)> SELECT *FROM TBLG;
FAILED: SemanticException [Error 10001]: Line 1:13 Table not found 'TBLG'
hive (default)>
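The last two failures in the session are a usage mix-up rather than a bug: `TBLS` (like `DBS`) is a table in the MySQL metastore database, not a Hive table, and `\G` is a mysql-client statement terminator, so neither works from the Hive CLI — they belong in a mysql session against `hivedb`. The failure mode that drove this whole post — querying through a connection whose schema was never initialized — can be reproduced in miniature with SQLite (an illustrative analogy only, not Hive code; the table name `DBS` is borrowed from the error above):

```python
import sqlite3

# A fresh in-memory database plays the role of the empty MySQL
# database 'hivedb' that Hive's JDO layer was querying.
conn = sqlite3.connect(":memory:")

# Querying the metastore-style table before the schema exists fails,
# just like "Table 'hivedb.DBS' doesn't exist" in the log above.
try:
    conn.execute("SELECT NAME FROM DBS").fetchall()
except sqlite3.OperationalError as e:
    print("before init:", e)   # no such table: DBS

# "Schema initialization" (what DataNucleus auto-create or schematool
# does for the real metastore) creates the table first ...
conn.execute("CREATE TABLE DBS (DB_ID INTEGER PRIMARY KEY, NAME TEXT)")
conn.execute("INSERT INTO DBS (DB_ID, NAME) VALUES (1, 'default')")

# ... after which the same query succeeds, mirroring the working
# 'show databases' session above.
print("after init:", [row[0] for row in conn.execute("SELECT NAME FROM DBS")])
```

The point of the analogy: the error is raised by the backing database at query time, which is why Hive only fails when a statement actually touches the metastore, and why creating the schema (rather than changing anything in Hive itself) is the fix.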