Installing LZO and configuring it for Hadoop


I. Prerequisites

1. gcc: yum install lib* glibc* gcc* — you may need these packages if the lzo build fails.

2. ant: installation steps omitted; version 1.8.2 or later is recommended, with the environment variables set. ant is needed in step III.
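Before starting the build, it can save time to confirm both tools are actually on the PATH. A minimal pre-flight sketch:

```shell
# Pre-flight check: verify gcc and ant are available before building
# lzo and the hadoop-lzo native library.
for tool in gcc ant; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING - install it first"
  fi
done
```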

II. Installing LZO

wget http://www.oberhumer.com/opensource/lzo/download/lzo-2.06.tar.gz

tar -zxvf lzo-2.06.tar.gz

cd lzo-2.06

./configure --enable-shared

make

make install

# Edit /etc/ld.so.conf, add /usr/local/lib/, then run /sbin/ldconfig

or: cp /usr/local/lib/liblzo2.* /usr/lib64/
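The ld.so.conf edit can be scripted so that re-running the setup stays idempotent. A runnable sketch (a temp file stands in for /etc/ld.so.conf so it needs no root):

```shell
# Append /usr/local/lib to the linker config only if it is not
# already listed, so repeated runs do not duplicate the entry.
CONF=$(mktemp)                     # use /etc/ld.so.conf for real
LIBDIR=/usr/local/lib
grep -qxF "$LIBDIR" "$CONF" || echo "$LIBDIR" >> "$CONF"
grep -qxF "$LIBDIR" "$CONF" || echo "$LIBDIR" >> "$CONF"   # no-op on 2nd run
grep -cxF "$LIBDIR" "$CONF"        # prints 1: the entry appears once
# afterwards, refresh the linker cache: /sbin/ldconfig
```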

III. Installing the LZO encoder/decoder (hadoop-lzo):

wget https://download.github.com/kevinweil-hadoop-lzo-2ad6654.tar.gz

tar -zxvf kevinweil-hadoop-lzo-2ad6654.tar.gz

cd kevinweil-hadoop-lzo-2ad6654

ant compile-native tar

# Copy the native libraries and the jar into hadoop's corresponding directories, and distribute them to every node

cp lib/native/Linux-amd64-64/* /opt/hadoop/hadoop/lib/native/Linux-amd64-64/  # on a 32-bit system use the corresponding 32-bit directory

cp hadoop-lzo-0.4.10.jar .../hadoop/lib/
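The distribution step can be looped over the cluster. A hypothetical sketch (NODES and the paths are placeholders for your own cluster layout; the leading echo makes it a dry run):

```shell
# Dry-run sketch of pushing the native libs and jar to every node.
NODES="node1 node2 node3"
NATIVE_DIR=/opt/hadoop/hadoop/lib/native/Linux-amd64-64
JAR=/opt/hadoop/hadoop/lib/hadoop-lzo-0.4.10.jar
for node in $NODES; do
  # drop the leading "echo" to perform the actual copies
  echo scp -r "$NATIVE_DIR" "$node:$(dirname "$NATIVE_DIR")"
  echo scp "$JAR" "$node:$(dirname "$JAR")"
done
```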

Alternative installation method:

Find the release on GitHub: https://github.com/twitter/hadoop-lzo/tree/release-0.4.19

Unpack it and edit the pom file, mainly to set the hadoop version.

Then run mvn clean package -Dmaven.test.skip=true and you are done.
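The pom edit itself can be scripted. A hypothetical sketch (the property name hadoop.version and both version values are assumptions; check the actual pom for the real property):

```shell
# Switch the hadoop version property in a pom file before the maven
# build. A temp file stands in for the real pom.xml here.
POM=$(mktemp)
printf '%s\n' \
  '<properties>' \
  '  <hadoop.version>1.0.4</hadoop.version>' \
  '</properties>' > "$POM"
sed -i 's|<hadoop.version>[^<]*</hadoop.version>|<hadoop.version>2.2.0</hadoop.version>|' "$POM"
grep -o '<hadoop.version>[^<]*</hadoop.version>' "$POM"
```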

IV. Hadoop configuration

vi core-site.xml

<property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.BZip2Codec</value>
</property>
<property>
    <name>io.compression.codec.lzo.class</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>

vi mapred-site.xml

<property>
    <name>mapred.compress.map.output</name>
    <value>true</value>
</property>
<property>
    <name>mapred.map.output.compression.codec</name>
    <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
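As an optional end-to-end sanity check of the lzo toolchain itself, the sketch below round-trips a small file through compression. It assumes the lzop command-line tool, which is separate from the hadoop-lzo build and may not be installed; the check is skipped if it is absent.

```shell
# Compress and decompress a small file with lzop, then compare the
# result to the original to confirm lzo works on this machine.
if command -v lzop >/dev/null 2>&1; then
  f=$(mktemp)
  echo "hello lzo" > "$f"
  lzop -f -o "$f.lzo" "$f"        # compress
  lzop -d -f -o "$f.out" "$f.lzo" # decompress
  cmp "$f" "$f.out" && echo "lzo round-trip OK"
else
  echo "lzop not installed; skipping check"
fi
```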

The above was tested successfully on hadoop-0.20.203.0, but on 0.20.204.0 and 0.20.205.0 the following problems appear:

Error reported in hadoop-0.20.204.0:

2012-01-06 12:09:30,475 ERROR lzo.GPLNativeCodeLoader (GPLNativeCodeLoader.java:<clinit>(36)) - Could not load native gpl library
java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734)
        at java.lang.Runtime.loadLibrary0(Runtime.java:823)
        at java.lang.System.loadLibrary(System.java:1028)
        at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:32)
        at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:67)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:819)
        at org.apache.hadoop.hive.ql.io.RCFile$Reader.init(RCFile.java:1109)
        at org.apache.hadoop.hive.ql.io.RCFile$Reader.<init>(RCFile.java:983)
        at org.apache.hadoop.hive.ql.io.RCFile$Reader.<init>(RCFile.java:964)
        at org.apache.hadoop.hive.ql.io.RCFileRecordReader.<init>(RCFileRecordReader.java:52)
        at org.apache.hadoop.hive.ql.io.RCFileInputFormat.getRecordReader(RCFileInputFormat.java:57)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:306)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:320)
        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:133)
        at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1114)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
2012-01-06 12:09:30,476 ERROR lzo.LzoCodec (LzoCodec.java:<clinit>(77)) - Cannot load native-lzo without native-hadoop
2012-01-06 12:09:30,478 ERROR CliDriver (SessionState.java:printError(343)) - Failed with exception java.io.IOException:java.lang.RuntimeException: native-lzo library not available
java.io.IOException: java.lang.RuntimeException: native-lzo library not available
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:341)
        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:133)
        at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1114)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.RuntimeException: native-lzo library not available
        at com.hadoop.compression.lzo.LzoCodec.getDecompressorType(LzoCodec.java:180)
        at org.apache.hadoop.hive.ql.io.CodecPool.getDecompressor(CodecPool.java:122)
        at org.apache.hadoop.hive.ql.io.RCFile$Reader.init(RCFile.java:1117)
        at org.apache.hadoop.hive.ql.io.RCFile$Reader.<init>(RCFile.java:983)
        at org.apache.hadoop.hive.ql.io.RCFile$Reader.<init>(RCFile.java:964)
        at org.apache.hadoop.hive.ql.io.RCFileRecordReader.<init>(RCFileRecordReader.java:52)
        at org.apache.hadoop.hive.ql.io.RCFileInputFormat.getRecordReader(RCFileInputFormat.java:57)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:306)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:320)
        ... 10 more

Fix: add the following line to the $hadoop_home/bin/hadoop script:

JAVA_LIBRARY_PATH=$hadoop_home/lib/native/Linux-amd64-64

In hadoop-0.20.205, besides the error above, there is a second one: the lzo jar cannot be found. The way the classpath is assembled changed in this release, and jars under $hadoop_home/lib are no longer loaded automatically. Adding the following line to conf/hadoop-env.sh resolves it:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$hadoop_home/lib/hadoop-lzo.jar
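After editing hadoop-env.sh, it is worth confirming the jar really shows up on the classpath. A sketch (the classpath value below is a stand-in; in practice inspect the $HADOOP_CLASSPATH your daemons actually see):

```shell
# Check whether hadoop-lzo.jar appears anywhere on the classpath.
HADOOP_CLASSPATH="/opt/hadoop/lib/commons-logging.jar:/opt/hadoop/lib/hadoop-lzo.jar"
case ":$HADOOP_CLASSPATH:" in
  *hadoop-lzo*) echo "hadoop-lzo is on the classpath" ;;
  *)            echo "hadoop-lzo is MISSING from the classpath" ;;
esac
```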
