Currently the writer can only accept BytesRefArrayWritable


Monday at the office, and a pile of problems showed up that needed urgent fixing. The notes below record one of them.

Running a SQL statement in Hive hit the following error:

FATAL ExecReducer: java.lang.UnsupportedOperationException: Currently the writer can only accept BytesRefArrayWritable
at org.apache.hadoop.hive.ql.io.RCFile$Writer.append(RCFile.java:880)
at org.apache.hadoop.hive.ql.io.RCFileOutputFormat$2.write(RCFileOutputFormat.java:140)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:588)
at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.createForwardJoinObject(CommonJoinOperator.java:389)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:715)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:697)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:697)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:856)
at org.apache.hadoop.hive.ql.exec.JoinOperator.endGroup(JoinOperator.java:265)
at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:198)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:519)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:420)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

Some searching revealed the cause: we had created a table stored as RCFile, and earlier today we changed one of its default serialization settings, which is what triggered this error.

http://comments.gmane.org/gmane.comp.java.hadoop.hive.user/2849

Hive tracks this problem as an issue as well:

https://issues.apache.org/jira/browse/HIVE-3821

 

Resolution:

ALTER TABLE table_name SET SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' WITH SERDEPROPERTIES ('serialization.null.format'='-1')

The problem appeared after running the statement above.

Digging further, I found that combining SET SERDE with WITH SERDEPROPERTIES in one statement misbehaves when the table's storage format is RCFile: it replaces the SerDe class itself, and the RCFile writer then no longer receives the BytesRefArrayWritable rows it expects.

Setting the property on its own, without changing the SerDe class, works fine, as in the following statement:

ALTER TABLE table_name SET SERDEPROPERTIES('serialization.null.format'='-1')
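As a recovery sketch for a table that was already broken by the earlier statement (assumptions: `table_name` is a placeholder from the statements above, and the table was originally created with Hive's default RCFile SerDe, ColumnarSerDe; verify against your own table before running):

```sql
-- Check which SerDe the table currently uses
-- (look at the "SerDe Library" row of the output).
DESCRIBE FORMATTED table_name;

-- If the earlier ALTER switched the table to LazySimpleSerDe, point it back
-- at ColumnarSerDe, the SerDe RCFile tables normally use (assumption: the
-- table was created with Hive's default RCFile SerDe).
ALTER TABLE table_name SET SERDE 'org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe';

-- Then change only the null format, without touching the SerDe class.
ALTER TABLE table_name SET SERDEPROPERTIES ('serialization.null.format'='-1');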