Importing text data into HBase: class com/google/common/collect/Multimap not found
Source: Internet · Editor: 程序博客网 · Date: 2024/05/21 09:38
I planned to import a text file into an HBase table, but running the import command failed because the class com/google/common/collect/Multimap could not be found:
[hadoop@hadoop1 lib]$ hadoop jar /home/hadoop/hbase-0.94.6/hbase-0.94.6.jar importtsv
Warning: $HADOOP_HOME is deprecated.
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/collect/Multimap
at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.ClassNotFoundException: com.google.common.collect.Multimap
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 6 more
The fix was to copy guava-11.0.2.jar from $HBASE_HOME/lib into $HADOOP_HOME/lib. Rerunning the same command then gets past class loading and reaches importtsv's argument check, which prints the usage text below (expected, since no arguments were supplied) instead of the NoClassDefFoundError:
[hadoop@hadoop1 lib]$ pwd
/home/hadoop/hadoop-1.0.4/lib
[hadoop@hadoop1 lib]$ cp /home/hadoop/hbase-0.94.6/lib/guava-11.0.2.jar .
[hadoop@hadoop1 lib]$ hadoop jar /home/hadoop/hbase-0.94.6/hbase-0.94.6.jar importtsv
Warning: $HADOOP_HOME is deprecated.
ERROR: Wrong number of arguments: 0
Usage: importtsv -Dimporttsv.columns=a,b,c <tablename> <inputdir>
Imports the given input directory of TSV data into the specified table.
The column names of the TSV data must be specified using the -Dimporttsv.columns
option. This option takes the form of comma-separated column names, where each
column name is either a simple column family, or a columnfamily:qualifier. The special
column name HBASE_ROW_KEY is used to designate that this column should be used
as the row key for each imported record. You must specify exactly one column
to be the row key, and you must specify a column name for every column that exists in the
input data. Another special column HBASE_TS_KEY designates that this column should be
used as timestamp for each record. Unlike HBASE_ROW_KEY, HBASE_TS_KEY is optional.
You must specify atmost one column as timestamp key for each imported record.
Record with invalid timestamps (blank, non-numeric) will be treated as bad record.
Note: if you use this option, then 'importtsv.timestamp' option will be ignored.
By default importtsv will load data directly into HBase. To instead generate
HFiles of data to prepare for a bulk data load, pass the option:
-Dimporttsv.bulk.output=/path/for/output
Note: if you do not use this option, then the target table must already exist in HBase
Other options that may be specified with -D include:
-Dimporttsv.skip.bad.lines=false - fail if encountering an invalid line
'-Dimporttsv.separator=|' - eg separate on pipes instead of tabs
-Dimporttsv.timestamp=currentTimeAsLong - use the specified timestamp for the import
-Dimporttsv.mapper.class=my.Mapper - A user-defined Mapper to use instead of org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
For performance consider the following options:
-Dmapred.map.tasks.speculative.execution=false
-Dmapred.reduce.tasks.speculative.execution=false
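Following the usage text above, a complete invocation would look roughly like the sketch below. The table name `mytable`, column family `cf`, qualifier `c1`, and the HDFS input path are hypothetical placeholders, not values from the original session:

```shell
# Hypothetical example: the first TSV field of each line becomes the row key,
# the second field is stored in column family cf under qualifier c1.
# Since -Dimporttsv.bulk.output is not given, table 'mytable' (with family
# 'cf') must already exist in HBase before running this.
hadoop jar /home/hadoop/hbase-0.94.6/hbase-0.94.6.jar importtsv \
  -Dimporttsv.columns=HBASE_ROW_KEY,cf:c1 \
  mytable /user/hadoop/tsv-input
```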
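Copying jars into $HADOOP_HOME/lib works, but it leaves a stale copy behind when either Hadoop or HBase is upgraded. An alternative sketch, using the same paths as this installation (adjust to yours), is to prepend the jar to HADOOP_CLASSPATH instead, so `hadoop jar` can resolve Guava without duplicating files:

```shell
# Put HBase's Guava jar on Hadoop's classpath instead of copying it.
# Subsequent `hadoop jar ... importtsv` runs in this shell will then
# resolve com.google.common.collect.Multimap from this jar.
export HADOOP_CLASSPATH=/home/hadoop/hbase-0.94.6/lib/guava-11.0.2.jar:$HADOOP_CLASSPATH
```

Note that HADOOP_CLASSPATH only affects the shell it is exported in; add it to hadoop-env.sh to make it permanent.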