Sqoop errors when importing from MySQL to HDFS: java.lang.NoClassDefFoundError: org/json/JSONObject


Problem 1: sqoop import fails with java.lang.ClassNotFoundException: org.json.JSONObject

[root@hadoop1 lib]# sqoop import --connect jdbc:mysql://10.1.32.8:3306/test --username sqoop --password sqoop --table t1 -m 1
16/06/07 08:48:59 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
16/06/07 08:48:59 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/06/07 08:48:59 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/06/07 08:48:59 INFO tool.CodeGenTool: Beginning code generation
16/06/07 08:48:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `t1` AS t LIMIT 1
16/06/07 08:48:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `t1` AS t LIMIT 1
16/06/07 08:48:59 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/07751371c513f90a6377d7b482c4a910/t1.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/06/07 08:49:01 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/07751371c513f90a6377d7b482c4a910/t1.jar
16/06/07 08:49:01 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/06/07 08:49:01 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/06/07 08:49:01 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/06/07 08:49:01 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/06/07 08:49:01 INFO mapreduce.ImportJobBase: Beginning import of t1
Exception in thread "main" java.lang.NoClassDefFoundError: org/json/JSONObject
        at org.apache.sqoop.util.SqoopJsonUtil.getJsonStringforMap(SqoopJsonUtil.java:42)
        at org.apache.sqoop.SqoopOptions.writeProperties(SqoopOptions.java:742)
        at org.apache.sqoop.mapreduce.JobBase.putSqoopOptionsToConfiguration(JobBase.java:369)
        at org.apache.sqoop.mapreduce.JobBase.createJob(JobBase.java:355)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:249)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.ClassNotFoundException: org.json.JSONObject
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 15 more


Solution:

Sqoop's lib directory is missing the java-json.jar package, which provides the org.json.JSONObject class.

As the following Stack Overflow answer explains:

http://stackoverflow.com/questions/27504508/java-lang-noclassdeffounderror-org-json-jsonobject

 

The Exception itself says it all: java.lang.ClassNotFoundException: org.json.JSONObject

You have not added the necessary jar file, which contains the org.json.JSONObject class, to your classpath.

 

Download the java-json.jar package:

http://www.java2s.com/Code/Jar/j/Downloadjavajsonjar.htm

 

Copy java-json.jar into the ../sqoop/lib directory:

# cp java-json.jar /opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/sqoop/lib

[hdfs@hadoop1 lib]$ pwd
/opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/sqoop/lib
[hdfs@hadoop1 lib]$ ll java-json.jar
-rw-r--r-- 1 root root 84697 Oct 16  2013 java-json.jar
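
Before rerunning the import, you can confirm that the jar actually contains the class the stack trace complains about. A quick sanity check (assuming unzip is installed on the host); the listing should include org/json/JSONObject.class:

# List the jar's entries and filter for the missing class
[hdfs@hadoop1 lib]$ unzip -l java-json.jar | grep 'org/json/JSONObject'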



Problem 2: Writing to HDFS as the root user fails with "Permission denied: user=root"

[root@hadoop1 lib]# sqoop import --connect jdbc:mysql://10.1.32.8:3306/test --username sqoop --password sqoop --table t1 -m 1
16/06/07 08:49:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
16/06/07 08:49:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
... ...
16/06/07 08:49:52 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/06/07 08:49:52 INFO mapreduce.ImportJobBase: Beginning import of t1
16/06/07 08:49:53 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/06/07 08:49:53 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/06/07 08:49:53 INFO client.RMProxy: Connecting to ResourceManager at hadoop0.hadoop.com/10.1.32.239:8032
16/06/07 08:49:54 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:242)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:169)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6590)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6572)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6524)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4322)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4292)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4265)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:867)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:322)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:603)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
16/06/07 08:49:54 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:242)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:169)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6590)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6572)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6524)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4322)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4292)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4265)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:867)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:322)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:603)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:242)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:169)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6590)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6572)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6524)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4322)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4292)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4265)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:867)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:322)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:603)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
        at org.apache.hadoop.ipc.Client.call(Client.java:1471)
        at org.apache.hadoop.ipc.Client.call(Client.java:1408)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
        at com.sun.proxy.$Proxy15.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:544)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
        at com.sun.proxy.$Proxy16.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3082)
        ... 28 more


Solution:
The import was run as the root user, which has no write permission on HDFS: the /user directory is owned by hdfs:supergroup with mode drwxr-xr-x.
CDH creates an hdfs user at install time; switch to the hdfs user and rerun the sqoop import:
[root@hadoop1 ~]# su - hdfs
[hdfs@hadoop1 ~]$ pwd
/var/lib/hadoop-hdfs
[hdfs@hadoop1 ~]$ ls
t1.java
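
If you would rather keep running the import as root, another common fix (a sketch, not from the original post) is to create an HDFS home directory for root using the hdfs superuser and hand ownership to root; adjust the path if your cluster layout differs:

# Run on any node with the HDFS client configured
[root@hadoop1 ~]# sudo -u hdfs hdfs dfs -mkdir -p /user/root
[root@hadoop1 ~]# sudo -u hdfs hdfs dfs -chown root:root /user/root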


Problem 3: Other Hadoop nodes cannot connect to MySQL: "is not allowed to connect to this MySQL server"

[hdfs@hadoop1 ~]$ sqoop import --connect jdbc:mysql://10.1.32.8:3306/test --username sqoop --password sqoop --table t1 -m 1
16/06/07 08:51:52 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
......
16/06/07 08:52:00 INFO mapreduce.Job: The url to track the job: http://hadoop0.hadoop.com:8088/proxy/application_1464249387420_0001/
16/06/07 08:52:00 INFO mapreduce.Job: Running job: job_1464249387420_0001
16/06/07 08:52:08 INFO mapreduce.Job: Job job_1464249387420_0001 running in uber mode : false
16/06/07 08:52:08 INFO mapreduce.Job:  map 0% reduce 0%
16/06/07 08:52:14 INFO mapreduce.Job: Task Id : attempt_1464249387420_0001_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: null,  message from server: "Host 'hadoop4.hadoop.com' is not allowed to connect to this MySQL server"
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:749)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.RuntimeException: java.sql.SQLException: null,  message from server: "Host 'hadoop4.hadoop.com' is not allowed to connect to this MySQL server"
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
        ... 9 more
Caused by: java.sql.SQLException: null,  message from server: "Host 'hadoop4.hadoop.com' is not allowed to connect to this MySQL server"


Solution:
The MySQL server does not allow connections from the other cluster hosts; the map task here ran on hadoop4.hadoop.com, which had no access, and in general a task can land on any node. Grant the sqoop account access from any client host.
-- Log in to the MySQL server
[root@hadoop2 ~]# mysql -uroot -proot 
mysql> grant all privileges on *.* to 'sqoop'@'%' identified by 'sqoop' with grant option;
Query OK, 0 rows affected (0.00 sec)
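
Granting all privileges on *.* to 'sqoop'@'%' is convenient on a test cluster but very broad. A narrower grant (a suggested hardening, assuming the cluster sits on the 10.1.32.x subnet seen in the logs) limits the account to SELECT on the test database from the cluster hosts only:

-- Restrict the sqoop account to what the import actually needs
mysql> grant select on test.* to 'sqoop'@'10.1.32.%' identified by 'sqoop';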








