Sqoop Error Troubleshooting

./sqoop import --connect jdbc:mysql://localhost:3306/xxxx --username dba --password 123456 --direct --table ehm_hosts --target-dir /data/ehm_hosts -m 1


Running this produced the following error:

 

java.net.ConnectException
MESSAGE: Connection refused

STACKTRACE:

java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
        at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
        at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
        at java.net.Socket.connect(Socket.java:529)
        at java.net.Socket.connect(Socket.java:478)
        at java.net.Socket.<init>(Socket.java:375)
        at java.net.Socket.<init>(Socket.java:218)
        at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:173)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:267)
        at com.mysql.jdbc.Connection.createNewIO(Connection.java:2739)
        at com.mysql.jdbc.Connection.<init>(Connection.java:1553)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:266)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:278)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:187)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:162)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:723)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

** END NESTED EXCEPTION **

Last packet sent to the server was 22 ms ago.
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:193)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:162)
        ... 9 more
Caused by: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:

** BEGIN NESTED EXCEPTION **

[identical java.net.ConnectException: Connection refused stack trace repeated]

** END NESTED EXCEPTION **

Last packet sent to the server was 22 ms ago.
        at com.mysql.jdbc.Connection.createNewIO(Connection.java:2814)
        at com.mysql.jdbc.Connection.<init>(Connection.java:1553)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:266)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:278)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:187)
        ... 10 more


 

The fix: change the connection string to use the database host's actual IP address instead of localhost:

./sqoop import --connect jdbc:mysql://192.168.205.101:3306/xxxx --username dba --password 123456 --direct --table ehm_hosts --target-dir /data/ehm_hosts -m 1


Problem solved. With "localhost" in the JDBC URL, each Sqoop map task tries to connect to MySQL on whichever cluster node it happens to run on, not on the machine where MySQL actually lives, so the connection is refused; using the database server's real IP lets every node reach it.
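Before re-running an import, it can help to confirm that the database port is actually reachable from the node in question. Below is a minimal TCP-probe sketch; the function name and the host/port values are illustrative, not part of Sqoop:

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection handles DNS resolution and the TCP handshake.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example, using the host/port from the --connect URL above:
#   is_reachable("192.168.205.101", 3306)  -> True only if mysqld is listening
```

Running this on a task-tracker node against the address from the --connect URL distinguishes a network/bind-address problem from a Sqoop configuration problem.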

 

Next, another problem appeared:

13/06/26 00:37:18 INFO mapred.JobClient: Task Id : attempt_201306250027_0021_m_000000_0, Status : FAILED
java.io.IOException: Cannot run program "mysqldump": java.io.IOException: error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
        at java.lang.Runtime.exec(Runtime.java:593)
        at java.lang.Runtime.exec(Runtime.java:466)
        at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:403)
        at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:47)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
        at java.lang.ProcessImpl.start(ProcessImpl.java:65)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
        ... 12 more


Solution:

Check the MapReduce task logs:

attempt_201306250027_0021_m_000000_0    task_201306250027_0021_m_000000    slave1    FAILED
java.io.IOException: Cannot run program "mysqldump": java.io.IOException: error=2, No such file or directory
        (followed by the same stack trace as above)


The failing map task ran on slave1. With --direct, Sqoop shells out to mysqldump on whichever node executes the map task, so mysqldump must be installed on slave1. After installing it there, the import ran successfully:

Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: $HADOOP_HOME is deprecated.
13/06/26 00:51:23 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/06/26 00:51:23 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/06/26 00:51:23 INFO tool.CodeGenTool: Beginning code generation
13/06/26 00:51:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `ehm_hosts` AS t LIMIT 1
13/06/26 00:51:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `ehm_hosts` AS t LIMIT 1
13/06/26 00:51:23 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr
Note: /tmp/sqoop-hadoop/compile/a067fc87107ca67800cb30e3e4bd56f9/ehm_hosts.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/06/26 00:51:27 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/a067fc87107ca67800cb30e3e4bd56f9/ehm_hosts.jar
13/06/26 00:51:27 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
13/06/26 00:51:27 INFO mapreduce.ImportJobBase: Beginning import of ehm_hosts
13/06/26 00:51:30 INFO mapred.JobClient: Running job: job_201306250027_0023
13/06/26 00:51:31 INFO mapred.JobClient:  map 0% reduce 0%
13/06/26 00:51:51 INFO mapred.JobClient:  map 100% reduce 0%
13/06/26 00:51:56 INFO mapred.JobClient: Job complete: job_201306250027_0023
13/06/26 00:51:56 INFO mapred.JobClient: Counters: 18
13/06/26 00:51:56 INFO mapred.JobClient:   Job Counters
13/06/26 00:51:56 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=19622
13/06/26 00:51:56 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/06/26 00:51:56 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/06/26 00:51:56 INFO mapred.JobClient:     Launched map tasks=1
13/06/26 00:51:56 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/06/26 00:51:56 INFO mapred.JobClient:   File Output Format Counters
13/06/26 00:51:56 INFO mapred.JobClient:     Bytes Written=332
13/06/26 00:51:56 INFO mapred.JobClient:   FileSystemCounters
13/06/26 00:51:56 INFO mapred.JobClient:     HDFS_BYTES_READ=87
13/06/26 00:51:56 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=31929
13/06/26 00:51:56 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=332
13/06/26 00:51:56 INFO mapred.JobClient:   File Input Format Counters
13/06/26 00:51:56 INFO mapred.JobClient:     Bytes Read=0
13/06/26 00:51:56 INFO mapred.JobClient:   Map-Reduce Framework
13/06/26 00:51:56 INFO mapred.JobClient:     Map input records=1
13/06/26 00:51:56 INFO mapred.JobClient:     Physical memory (bytes) snapshot=62537728
13/06/26 00:51:56 INFO mapred.JobClient:     Spilled Records=0
13/06/26 00:51:56 INFO mapred.JobClient:     CPU time spent (ms)=1360
13/06/26 00:51:56 INFO mapred.JobClient:     Total committed heap usage (bytes)=16252928
13/06/26 00:51:56 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=417800192
13/06/26 00:51:56 INFO mapred.JobClient:     Map output records=3
13/06/26 00:51:56 INFO mapred.JobClient:     SPLIT_RAW_BYTES=87
13/06/26 00:51:56 INFO mapreduce.ImportJobBase: Transferred 332 bytes in 28.6117 seconds (11.6037 bytes/sec)
13/06/26 00:51:56 INFO mapreduce.ImportJobBase: Retrieved 3 records.
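Missing-binary failures like the mysqldump one above can be caught before submitting a job by checking each task-tracker node's PATH up front. A small sketch (the helper name and binary list are illustrative; it only inspects the local machine, so it would need to run on every node that executes map tasks):

```python
import shutil

def missing_binaries(required):
    """Return the subset of required commands not found on this machine's PATH."""
    return [cmd for cmd in required if shutil.which(cmd) is None]

# e.g. run on each task-tracker node before a --direct import:
#   missing_binaries(["mysqldump"])  -> ["mysqldump"] if it is not installed
```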



One more problem:

Note: Recompile with -Xlint:deprecation for details.
13/06/26 00:49:52 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/2a8b237421375fb4414d761cf7c7c998/ehm_hosts.jar
13/06/26 00:49:52 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
13/06/26 00:49:52 INFO mapreduce.ImportJobBase: Beginning import of ehm_hosts
13/06/26 00:49:54 INFO mapred.JobClient: Cleaning up the staging area hdfs://master:9000/tmp/hadoop-hadoop/mapred/staging/hadoop/.staging/job_201306250027_0022
13/06/26 00:49:54 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /data/ehm_hosts already exists
13/06/26 00:49:54 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /data/ehm_hosts already exists
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:887)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:221)
        at org.apache.sqoop.manager.DirectMySQLManager.importTable(DirectMySQLManager.java:92)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)


 

Solution:

The target directory was already created on HDFS by a previous run, and Sqoop refuses to write into an existing output directory, so it must be deleted before re-importing. (Newer Sqoop releases can also do this automatically via the --delete-target-dir import option.)

hadoop fs -rmr /data/ehm_hosts

(On newer Hadoop versions -rmr is deprecated; use: hadoop fs -rm -r /data/ehm_hosts)
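When imports are re-run from a script, this cleanup step can be automated. A minimal sketch that builds the delete command (using the newer -rm -r form) with a guard against accidentally targeting the filesystem root; the helper name is hypothetical, and the resulting argument list would be handed to something like subprocess.run:

```python
def hdfs_rm_r_cmd(target_dir: str) -> list:
    """Build the 'hadoop fs -rm -r' argument list for a Sqoop target directory."""
    path = target_dir.strip()
    # Refuse obviously dangerous targets rather than deleting half of HDFS.
    if path in ("", "/") or not path.startswith("/"):
        raise ValueError("refusing to delete: %r" % target_dir)
    return ["hadoop", "fs", "-rm", "-r", path]

# e.g. subprocess.run(hdfs_rm_r_cmd("/data/ehm_hosts"), check=True)
```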