Creating a dual table in Hive to make inserting hand-written records easy


When writing Hive SQL you occasionally run into special needs, such as inserting a few specific, hand-written records into a table:

hive> create table t_test(a string, b string, c string);
OK
Time taken: 0.068 seconds

hive> insert overwrite table t_test select 'a','b','c';
FailedPredicateException(regularBody,{$s.tree.getChild(1) !=null}?)
    at org.apache.hadoop.hive.ql.parse.HiveParser.regularBody(HiveParser.java:41238)
    at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpressionBody(HiveParser.java:40413)
    at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpression(HiveParser.java:40283)
    at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1590)
    at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1109)
    at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
    at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:396)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 1:48 Failed to recognize predicate '<EOF>'. Failed rule: 'regularBody' in statement
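The parse error happens because older Hive releases require every SELECT to have a FROM clause. As an aside, if you are on Hive 0.14 or later, the VALUES clause covers this case directly and no workaround is needed; a sketch:

```sql
-- Hive 0.14+ only: insert literal rows without needing a FROM clause
insert into table t_test values ('a', 'b', 'c');
```

On older versions this statement also fails to parse, which is where the dual-table trick below comes in.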


In this situation you need something like the dual table in an Oracle database.

Here is how to create a dual table:

1. Create the table

hive> create table dual (dummy string);
OK
Time taken: 0.223 seconds

2. Create a file in a local directory, e.g. dual.txt, containing exactly one row of data, such as 0001

3. Load the data into the dual table

hive> load data local inpath '/data/test/dual.txt' overwrite into table dual;
Loading data to table dev.dual
Table dev.dual stats: [numFiles=1, numRows=0, totalSize=5, rawDataSize=0]
OK
Time taken: 0.296 seconds

4. Query the table

hive> select * from dual;
OK
0001
Time taken: 0.097 seconds, Fetched: 1 row(s)

5. Test it

hive> select '1+1' from dual;
OK
1+1
Time taken: 0.061 seconds, Fetched: 1 row(s)

hive> select 1+6 from dual;
OK
7
Time taken: 0.057 seconds, Fetched: 1 row(s)
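Beyond constants and arithmetic, the single-row dual table is handy for trying out built-in functions before putting them into real queries; for example (a sketch, using standard Hive string functions):

```sql
-- exercise a couple of built-in functions against the one-row dual table
select concat('a', '-', 'b'), upper('hive') from dual;
```

Because dual holds exactly one row, each such query returns exactly one result row, just like Oracle's DUAL.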


hive> insert overwrite table t_test select 'a','b','c' from dual;
Query ID = shengping_20160926211651_7c78cc3e-abfd-4180-8afa-1d3826f0d680
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1474089635085_0001, Tracking URL = http://shengpingdeMacBook-Pro.local:8088/proxy/application_1474089635085_0001/
Kill Command = /Users/shengping/Applications/hadoop-2.7.2/bin/hadoop job  -kill job_1474089635085_0001
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2016-09-26 21:17:03,732 Stage-1 map = 0%,  reduce = 0%
2016-09-26 21:17:08,927 Stage-1 map = 100%,  reduce = 0%
Ended Job = job_1474089635085_0001
Stage-4 is selected by condition resolver.
Stage-3 is filtered out by condition resolver.
Stage-5 is filtered out by condition resolver.
Moving data to: hdfs://localhost:9000/user/hive/warehouse/dev.db/t_test/.hive-staging_hive_2016-09-26_21-16-51_952_6759085087134143013-1/-ext-10000
Loading data to table dev.t_test
Table dev.t_test stats: [numFiles=1, numRows=1, totalSize=6, rawDataSize=5]
MapReduce Jobs Launched: 
Stage-Stage-1: Map: 1   HDFS Read: 3152 HDFS Write: 72 SUCCESS
Total MapReduce CPU Time Spent: 0 msec
OK
Time taken: 18.314 seconds
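The table stats above already show numRows=1, and you can double-check that the insert through dual landed by querying t_test, which should now contain the single hand-written row:

```sql
-- verify the row inserted via the dual workaround
select * from t_test;
```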









