org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive Intercepting System.exit(1)

Source: Internet | Editor: 程序博客网 | Date: 2024/06/11 05:43

Running a Sqoop job from the HUE Job Designer invokes the command:

sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password mysql-password --table t1 --hive-import

The job fails with the error below:

Sqoop command arguments :
  import
  --connect
  jdbc:mysql://192.168.7.74:3306/test
  --username
  test
  --password
  test
  --table
  user_info
  --hive-import
=================================================================

>>> Invoking Sqoop command line now >>>

3323 [main] WARN  org.apache.sqoop.tool.SqoopTool  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
3367 [main] INFO  org.apache.sqoop.Sqoop  - Running Sqoop version: 1.4.5-cdh5.2.0
3389 [main] WARN  org.apache.sqoop.tool.BaseSqoopTool  - Setting your password on the command-line is insecure. Consider using -P instead.
3390 [main] INFO  org.apache.sqoop.tool.BaseSqoopTool  - Using Hive-specific delimiters for output. You can override
3390 [main] INFO  org.apache.sqoop.tool.BaseSqoopTool  - delimiters with --fields-terminated-by, etc.
3431 [main] WARN  org.apache.sqoop.ConnFactory  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
3580 [main] INFO  org.apache.sqoop.manager.SqlManager  - Using default fetchSize of 1000
3586 [main] INFO  org.apache.sqoop.tool.CodeGenTool  - Beginning code generation
4036 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `user_info` AS t LIMIT 1
4084 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `user_info` AS t LIMIT 1
4087 [main] INFO  org.apache.sqoop.orm.CompilationManager  - HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.2.0-1.cdh5.2.0.p0.36/lib/hadoop-mapreduce
6090 [main] INFO  org.apache.sqoop.orm.CompilationManager  - Writing jar file: /tmp/sqoop-yarn/compile/f9a1056980029d03e32f75e1b231f4b5/user_info.jar
6112 [main] WARN  org.apache.sqoop.manager.MySQLManager  - It looks like you are importing from mysql.
6112 [main] WARN  org.apache.sqoop.manager.MySQLManager  - This transfer can be faster! Use the --direct
6112 [main] WARN  org.apache.sqoop.manager.MySQLManager  - option to exercise a MySQL-specific fast path.
6112 [main] INFO  org.apache.sqoop.manager.MySQLManager  - Setting zero DATETIME behavior to convertToNull (mysql)
6116 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Beginning import of user_info
6155 [main] WARN  org.apache.sqoop.mapreduce.JobBase  - SQOOP_HOME is unset. May not be able to find all job dependencies.
6839 [main] INFO  org.apache.sqoop.mapreduce.db.DBInputFormat  - Using read commited transaction isolation
6840 [main] INFO  org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat  - BoundingValsQuery: SELECT MIN(`uid`), MAX(`uid`) FROM `user_info`
Heart beat
33737 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Transferred 14 bytes in 27.57 seconds (0.5078 bytes/sec)
33747 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Retrieved 1 records.
33770 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `user_info` AS t LIMIT 1
33782 [main] INFO  org.apache.sqoop.hive.HiveImport  - Loading uploaded data into Hive
Intercepting System.exit(1)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Oozie Launcher failed, finishing Hadoop job gracefully
Oozie Launcher, uploading action data to HDFS sequence file: hdfs://rhel072:8020/user/admin/oozie-oozi/0000003-141029091352918-oozie-oozi-W/mysqlTableData2hive--sqoop/action-data.seq
Oozie Launcher ends
Analysis shows the job is interrupted at the step that loads the uploaded data into Hive: the MapReduce import from MySQL completes (1 record retrieved), but the subsequent Hive load exits with code 1. Testing confirms that reading from Hive works, while writing to Hive does not. Can anyone advise?
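A note on a common cause (this is an assumption, not confirmed by the log above): when a Sqoop action runs under the Oozie launcher, the launcher JVM often cannot find the Hive client configuration (hive-site.xml), so the final "Loading uploaded data into Hive" step fails even though the HDFS import succeeded. A typical workaround is to ship hive-site.xml with the workflow action. A minimal sketch of the relevant part of an Oozie workflow.xml, with hypothetical names (mysqlTableData2hive matches the action name in the log; the hive-site.xml path is an assumption):

    <action name="mysqlTableData2hive">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect jdbc:mysql://192.168.7.74:3306/test --username test --password test --table user_info --hive-import</command>
            <!-- Assumption: hive-site.xml has been copied into the workflow's lib/ directory on HDFS
                 so the launcher can locate the Hive metastore configuration. -->
            <file>hive-site.xml#hive-site.xml</file>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>

In HUE's Job Designer the equivalent is adding hive-site.xml under the action's "Files" field. The log's other warnings ($SQOOP_CONF_DIR and SQOOP_HOME unset) point the same way: the launcher environment is missing configuration that a command-line sqoop run would pick up.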
