Specified key was too long; max key length is 767 bytes

Error description: loading data into a partitioned Hive table fails with
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes

hive> load data local inpath "/root/partition_table.dat" into table partition_table partition(class="job1");
Loading data to table mydb.partition_table partition (class=job1)
Failed with exception MetaException(message:javax.jdo.JDODataStoreException: Error(s) were found while auto-creating/validating the datastore for classes. The errors are printed in the log, and are attached to this exception.
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
        at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
        at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
        at org.apache.hadoop.hive.metastore.ObjectStore.addPartition(ObjectStore.java:1431)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:98)
        at com.sun.proxy.$Proxy5.addPartition(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.append_partition_common(HiveMetaStore.java:1888)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.append_partition_with_environment_context(HiveMetaStore.java:1943)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
        at com.sun.proxy.$Proxy6.append_partition_with_environment_context(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.appendPartition(HiveMetaStoreClient.java:531)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.appendPartition(HiveMetaStoreClient.java:525)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
        at com.sun.proxy.$Proxy7.appendPartition(Unknown Source)
        at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:1730)
        at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1349)
        at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1269)
        at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:428)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
        at sun.reflect.GeneratedConstructorAccessor49.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.Util.getInstance(Util.java:386)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
        at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:813)
        at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:656)
        at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
        at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:760)
        at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:648)
        at org.datanucleus.store.rdbms.table.TableImpl.createConstraints(TableImpl.java:422)
        at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3459)
        at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190)
        at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
        at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
        at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
        at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
        at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
        at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045)
        at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365)
        at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827)
        at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571)
        at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513)
        at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:232)
        at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1414)
        at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2218)
        at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065)
        at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913)
        at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
        at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
        at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
        at org.apache.hadoop.hive.metastore.ObjectStore.addPartition(ObjectStore.java:1431)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:98)
        at com.sun.proxy.$Proxy5.addPartition(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.append_partition_common(HiveMetaStore.java:1888)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.append_partition_with_environment_context(HiveMetaStore.java:1943)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
        at com.sun.proxy.$Proxy6.append_partition_with_environment_context(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.appendPartition(HiveMetaStoreClient.java:531)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.appendPartition(HiveMetaStoreClient.java:525)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
        at com.sun.proxy.$Proxy7.appendPartition(Unknown Source)
        at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:1730)
        at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1349)
        at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1269)
        at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:428)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136))
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

============================== Method that did NOT solve my problem: start ==============================

This fix, found online, did not solve my problem.

Suggested fix: edit the MySQL configuration file (my.ini) and set:

default-character-set=utf8
character-set-server=utf8

After editing, restart the MySQL service: service mysql restart

Use SHOW VARIABLES LIKE 'character%'; to check the settings.

Before the change:

mysql> SHOW VARIABLES LIKE 'character%';
+--------------------------+---------------------------------------------------------+
| Variable_name            | Value                                                   |
+--------------------------+---------------------------------------------------------+
| character_set_client     | gbk                                                     |
| character_set_connection | gbk                                                     |
| character_set_database   | gbk                                                     |
| character_set_filesystem | binary                                                  |
| character_set_results    | gbk                                                     |
| character_set_server     | gbk                                                     |
| character_set_system     | utf8                                                    |
| character_sets_dir       | D:\Program Files\MySQL\MySQL Server 5.5\share\charsets\ |
+--------------------------+---------------------------------------------------------+
8 rows in set (0.00 sec)

After the change:

mysql> SHOW VARIABLES LIKE 'character%';
+--------------------------+---------------------------------------------------------+
| Variable_name            | Value                                                   |
+--------------------------+---------------------------------------------------------+
| character_set_client     | utf8                                                    |
| character_set_connection | utf8                                                    |
| character_set_database   | utf8                                                    |
| character_set_filesystem | binary                                                  |
| character_set_results    | utf8                                                    |
| character_set_server     | utf8                                                    |
| character_set_system     | utf8                                                    |
| character_sets_dir       | D:\Program Files\MySQL\MySQL Server 5.5\share\charsets\ |
+--------------------------+---------------------------------------------------------+
8 rows in set (0.00 sec)

============================== Method that did NOT solve my problem: stop ==============================

============================== The following method solved my problem: start ==============================

Fix: on the machine where MySQL is running, execute:

alter database hive character set latin1;

Executed on the Windows MySQL instance:

mysql> alter database hive character set latin1;
Query OK, 1 row affected (0.02 sec)

The Hive statement that loads data into the partitioned table now succeeds:

hive> load data local inpath "/root/aa.dat" into table t5 partition(class="job1");
Loading data to table default.t5 partition (class=job1)
Partition default.t5{class=job1} stats: [numFiles=2, numRows=0, totalSize=36, rawDataSize=0]
OK
Time taken: 3.931 seconds

============================== The following method solved my problem: stop ==============================
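A quick sanity check and the arithmetic behind the limit, assuming the metastore database is the hive database referenced in the ALTER DATABASE statement above (MySQL's utf8 charset uses up to 3 bytes per character, latin1 uses 1):

mysql> SHOW CREATE DATABASE hive;                      -- shows the database's default character set
mysql> SHOW VARIABLES LIKE 'character_set_server';     -- server-wide default, as checked earlier

-- InnoDB in this MySQL 5.5-era setup limits a single index key to 767 bytes:
--   utf8   : an index on, say, VARCHAR(256) needs 256 * 3 = 768 bytes  -> "Specified key was too long"
--   latin1 : the same VARCHAR(256) index needs 256 * 1 = 256 bytes     -> index is created without complaint
-- With latin1, columns up to VARCHAR(767) can be indexed in full.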
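One caveat worth keeping in mind: ALTER DATABASE ... CHARACTER SET only changes the default used for tables created afterwards; it does not convert tables that already exist. That was enough here because the nested stack trace shows DataNucleus failing inside TableImpl.createIndices while auto-creating the partition metadata tables, so those tables did not exist yet. If some metastore tables had already been created under utf8, they would have to be converted explicitly, roughly along these lines (the table name below is only an illustration, not verified against this installation; check your own metastore schema and back it up first):

mysql> ALTER TABLE PARTITION_KEY_VALS CONVERT TO CHARACTER SET latin1;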
