Building a Hadoop 2.6.4 Cluster from Source on Alibaba Cloud and Tencent Cloud
Source: Internet · Editor: 程序博客网 · Date: 2024/05/20 06:28
Environment preparation
Alibaba Cloud instance configuration: (the specification listing did not survive extraction)
Tencent Cloud instance configuration: (the specification listing did not survive extraction)
Create a hadoop user:
useradd hadoop
passwd hadoop
Install JDK 1.7:
Download: http://www.oracle.com/technetwork/java/javase/downloads/java-archive-downloads-javase7-521261.html#jdk-7u80-oth-JPR
wget http://download.oracle.com/otn/java/jdk/7u80-b15/jdk-7u80-linux-x64.tar.gz?AuthParam=1469844164_7ce09e1f99570835183215c3510e95e0
mv jdk-7u80-linux-x64.tar.gz\?AuthParam\=1469844164_7ce09e1f99570835183215c3510e95e0 jdk-7u80-linux-x64.tar.gz
Extract the JDK: tar zxf jdk-7u80-linux-x64.tar.gz -C /opt/
Configure the environment variables in /etc/profile.
Apply the changes: source /etc/profile
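The /etc/profile lines themselves were lost in extraction; a typical set, assuming the /opt/jdk1.7.0_80 install path used above (the same path is set in hadoop-env.sh later), would be:

```shell
# JDK entries for /etc/profile (install path from the tar -C /opt/ step above)
export JAVA_HOME=/opt/jdk1.7.0_80
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$JAVA_HOME/bin:$PATH
```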
Software required to build Hadoop 2.6.4:
yum install gcc cmake gcc-c++
Install Maven
wget http://www-eu.apache.org/dist/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
Maven installation reference: http://www.blogjava.net/caojianhua/archive/2011/04/02/347559.html
source /etc/profile
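The Maven setup listing did not survive extraction; a sketch of the usual steps, where the /usr/local install prefix is an assumption:

```shell
# Assumed install location: tar zxf apache-maven-3.3.9-bin.tar.gz -C /usr/local/
export MAVEN_HOME=/usr/local/apache-maven-3.3.9
export PATH=$MAVEN_HOME/bin:$PATH
```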
Install protobuf
Hadoop 2.6.4 requires protobuf 2.5.0:
wget https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
protoc --version
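The build commands above were lost in extraction; protobuf 2.5.0 builds with the standard autotools sequence (the /usr/local prefix is an assumption):

```shell
tar zxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local
make
make install        # run as root
cd ..
protoc --version    # should print: libprotoc 2.5.0
```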
Install Ant
wget http://www-eu.apache.org/dist//ant/binaries/apache-ant-1.9.7-bin.tar.gz
source /etc/profile
ant -version
Apache Ant(TM) version 1.9.7 compiled on April 9 2016
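The Ant environment-variable listing was lost; a sketch, again assuming a /usr/local install prefix:

```shell
# Assumed install location: tar zxf apache-ant-1.9.7-bin.tar.gz -C /usr/local/
export ANT_HOME=/usr/local/apache-ant-1.9.7
export PATH=$ANT_HOME/bin:$PATH
```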
Install FindBugs
http://findbugs.sourceforge.net/downloads.html
wget http://prdownloads.sourceforge.net/findbugs/findbugs-3.0.1.tar.gz?download
mv findbugs-3.0.1.tar.gz\?download findbugs-3.0.1.tar.gz
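The FindBugs setup listing was lost; the /usr/local/findbugs-3.0.1 path below matches the path that appears in the findbugs error later in this article:

```shell
# Assumed: tar zxf findbugs-3.0.1.tar.gz -C /usr/local/
export FINDBUGS_HOME=/usr/local/findbugs-3.0.1
export PATH=$FINDBUGS_HOME/bin:$PATH
```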
Build and install Hadoop:
Download Hadoop: http://hadoop.apache.org/releases.html
wget http://www-eu.apache.org/dist/hadoop/common/hadoop-2.6.4/hadoop-2.6.4-src.tar.gz
tar zxf hadoop-2.6.4-src.tar.gz
cd hadoop-2.6.4-src
more BUILDING.txt
Read BUILDING.txt to see how to build and install.
mvn clean package -Pdist,native,docs -DskipTests -Dtar
The build downloads many dependencies, so expect a long wait. When every Hadoop module reports SUCCESS, the build has completed.
If a dependency download stalls, re-run the command, or fetch the artifact from the repository shown in the error (https://repo.maven.apache.org/maven2) and place it in the indicated location.
Error 1: [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project hadoop-common: An Ant BuildException has occured: input file /home/hadoop/hadoop-2.6.4-src/hadoop-common-project/hadoop-common/target/findbugsXml.xml does not exist
[ERROR] around Ant part ...<xslt style="/usr/local/findbugs-3.0.1/src/xsl/default.xsl" in="/home/hadoop/hadoop-2.6.4-src/hadoop-common-project/hadoop-common/target/findbugsXml.xml" out="/home/hadoop/hadoop-2.6.4-src/hadoop-common-project/hadoop-common/target/site/findbugs.html"/>... @ 44:256 in /home/hadoop/hadoop-2.6.4-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
Fix 1:
Reference: http://www.itnose.net/detail/6143808.html
Drop the docs profile and re-run: mvn package -Pdist,native -DskipTests -Dtar
Error 2: [INFO] Executing tasks
main:
[mkdir] Created dir: /home/hadoop/hadoop-2.6.4-src/hadoop-common-project/hadoop-kms/downloads
[get] Getting: http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.41/bin/apache-tomcat-6.0.41.tar.gz
[get] To: /home/hadoop/hadoop-2.6.4-src/hadoop-common-project/hadoop-kms/downloads/apache-tomcat-6.0.41.tar.gz
Fix 2:
The build hangs here because the Tomcat tarball cannot be downloaded from this network.
Download it locally (through a proxy if necessary) and upload it to the path shown.
Error 3: [ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs) on project hadoop-hdfs: MavenReportException: Error while creating archive:
[ERROR] ExcludePrivateAnnotationsStandardDoclet
[ERROR] Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000f31a4000, 130400256, 0) failed; error='Cannot allocate memory' (errno=12)
[ERROR] #
[ERROR] # There is insufficient memory for the Java Runtime Environment to continue.
[ERROR] # Native memory allocation (malloc) failed to allocate 130400256 bytes for committing reserved memory.
[ERROR] # An error report file with more information is saved as:
[ERROR] # /home/hadoop/hadoop-2.6.4-src/hadoop-hdfs-project/hadoop-hdfs/target/hs_err_pid24729.log
[ERROR]
[ERROR] Error occurred during initialization of VM, try to reduce the Java heap size for the MAVEN_OPTS environnement variable using -Xms:<size> and -Xmx:<size>.
[ERROR] Or, try to reduce the Java heap size for the Javadoc goal using -Dminmemory=<size> and -Dmaxmemory=<size>.
Fix 3:
The JVM ran out of memory because the instance has no swap configured.
Add a 2 GB swap file:
dd if=/dev/zero of=/home/swap bs=512 count=4096000
bs=512 sets the block size to 512 bytes and count=4096000 the number of blocks, so this creates a zero-filled file of roughly 2 GB at /home/swap (adjust of= as needed).
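A quick check of the numbers confirms the file is about 2 GB, matching the stated goal of a 2 GB swap:

```shell
# Size implied by dd bs=512 count=4096000
bytes=$((512 * 4096000))
echo "$bytes bytes = $((bytes / 1024 / 1024)) MiB"   # 2097152000 bytes = 2000 MiB
```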
Check current memory and swap usage:
free -m
Format and enable the swap file:
mkswap /home/swap
swapon /home/swap
Verify that it is active:
swapon -s
To enable it automatically at boot, add to /etc/fstab:
vim /etc/fstab
/home/swap swap swap defaults 0 0
To disable the swap file later:
swapoff /home/swap
Error 4: main:
[mkdir] Created dir: /home/hadoop/hadoop-2.6.4-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads
[get] Getting: http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.41/bin/apache-tomcat-6.0.41.tar.gz
[get] To: /home/hadoop/hadoop-2.6.4-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/apache-tomcat-6.0.41.tar.gz
Fix 4:
Same network issue as above; reuse the tarball already downloaded for hadoop-kms: cp /home/hadoop/hadoop-2.6.4-src/hadoop-common-project/hadoop-kms/downloads/apache-tomcat-6.0.41.tar.gz /home/hadoop/hadoop-2.6.4-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/
The build now completes successfully.
The built distribution is under hadoop-dist/target/.
Copy it to the home directory: cp -r hadoop-2.6.4 ~/
Configuration
Set up passwordless SSH between the two machines:
ssh-keygen
Append each machine's public key to the other machine's authorized_keys: cat id_rsa_else.pub >> authorized_keys
chmod 600 authorized_keys
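The manual cat/chmod exchange above can also be done with ssh-copy-id, which appends the key and fixes permissions in one step; a sketch using the hostnames from the network plan below:

```shell
# Run on hadoop0; repeat the mirror-image steps on hadoop1.
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa   # generate a key pair non-interactively
ssh-copy-id hadoop@hadoop1                 # append our public key to hadoop1's authorized_keys
ssh hadoop@hadoop1 true                    # should now succeed without a password prompt
```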
Network plan:
hadoop1 123.206.33.182 slave
hadoop0 114.215.92.77 master
Configure /etc/hosts:
vim /etc/hosts
123.206.33.182 hadoop1 tx lizer_tx
114.215.92.77 hadoop0 ali lizer_ali
Configure environment variables:
vim /etc/profile
export HADOOP_HOME=/home/hadoop/hadoop-2.6.4
export PATH=$HADOOP_HOME/bin:$PATH
Hadoop configuration
The configuration files live under $HADOOP_HOME/etc/hadoop/.
Edit the following files:
vim hadoop-env.sh and set: export JAVA_HOME=/opt/jdk1.7.0_80
vim yarn-env.sh and set: export JAVA_HOME=/opt/jdk1.7.0_80
vim slaves (no master file ships in this layout) and add: hadoop1
vim core-site.xml
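The core-site.xml listing did not survive extraction. A minimal version consistent with this master/slave layout might look as follows; the port 9000 and the tmp directory are assumptions, while hadoop0 as master comes from the network plan above:

```xml
<configuration>
  <!-- NameNode RPC endpoint; hadoop0 is the master -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop0:9000</value>
  </property>
  <!-- Base directory for temporary files (assumed path) -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/tmp</value>
  </property>
</configuration>
```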
vim hdfs-site.xml
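The hdfs-site.xml listing was also lost. A minimal sketch for a cluster with a single DataNode (hadoop1); the storage paths and secondary-NameNode port are assumptions:

```xml
<configuration>
  <!-- One DataNode, so one replica -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <!-- Storage directories (assumed paths) -->
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/home/hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/home/hadoop/dfs/data</value>
  </property>
  <!-- SecondaryNameNode runs on the master, per the jps output later -->
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>hadoop0:50090</value>
  </property>
</configuration>
```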
cp mapred-site.xml.template mapred-site.xml
vim mapred-site.xml
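The mapred-site.xml listing was lost. A minimal sketch; mapreduce.framework.name=yarn is the standard setting, and the JobHistoryServer entries (default ports 10020/19888) are assumptions consistent with the JobHistoryServer visible in the jps output later:

```xml
<configuration>
  <!-- Run MapReduce on YARN -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <!-- JobHistoryServer on the master (assumed default ports) -->
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>hadoop0:10020</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>hadoop0:19888</value>
  </property>
</configuration>
```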
vim yarn-site.xml
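The yarn-site.xml listing was lost. A minimal sketch placing the ResourceManager on the master, matching the ResourceManager in the jps output later:

```xml
<configuration>
  <!-- ResourceManager runs on the master -->
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>hadoop0</value>
  </property>
  <!-- Shuffle service required by MapReduce on YARN -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
```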
vim master and add: hadoop0
scp -r /home/hadoop/hadoop-2.6.4/etc/hadoop/* tx:~/hadoop-2.6.4/etc/hadoop/
Starting and stopping Hadoop
Reference: http://my.oschina.net/penngo/blog/653049
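The start/stop command listing did not survive extraction. Given the daemons visible in the jps output below, the usual sequence from $HADOOP_HOME on the master would be:

```shell
bin/hdfs namenode -format                          # first run only
sbin/start-dfs.sh                                  # NameNode, SecondaryNameNode, DataNodes
sbin/start-yarn.sh                                 # ResourceManager, NodeManagers
sbin/mr-jobhistory-daemon.sh start historyserver   # JobHistoryServer

# To stop, reverse the order:
sbin/mr-jobhistory-daemon.sh stop historyserver
sbin/stop-yarn.sh
sbin/stop-dfs.sh
```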
[hadoop@lizer_ali hadoop-2.6.4]$ jps
3099 ResourceManager
3430 SecondaryNameNode
2879 NameNode
3470 Jps
3382 JobHistoryServer
[hadoop@lizer_tx ~]$ jps
9757 DataNode
9853 NodeManager
10064 Jps
Check the cluster report: bin/hadoop dfsadmin -report
YARN ResourceManager web UI: http://114.215.92.77:8088/cluster
HDFS NameNode web UI: http://114.215.92.77:50070/dfshealth.html#tab-overview
Create a directory in HDFS:
bin/hdfs dfs -mkdir -p input
References:
http://blog.csdn.net/u014595668/article/details/52079753
http://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-common/SingleCluster.html
http://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-common/ClusterSetup.html