Hadoop Installation

I. Hadoop Installation Steps
Step 1: Prepare the Linux environment
Step 2: Install the JDK
Step 3: Configure Hadoop
 
II. Preparing the Linux Environment
Option 1: Install a virtual machine on Windows and install Linux inside it.
Option 2: Rent a cloud host from a provider such as Alibaba Cloud or UnitedStack.
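Whichever option you choose, it is worth confirming the environment before going further. The following is a minimal sketch of quick checks, assuming a CentOS/RHEL 7 host like the one used in the transcripts below:

# Confirm the distribution and kernel version (the transcripts below assume CentOS/RHEL 7)
cat /etc/redhat-release
uname -r

# The single-node configs below use "localhost", so make sure it resolves
hostname
ping -c 1 localhost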
 
III. Installing the JDK and Setting Environment Variables
[root@localhost yum.repos.d]# yum install -y java-1.7.0-openjdk.x86_64
[root@localhost Packages]# yum install -y java-1.7.0-openjdk-devel.x86_64
[root@localhost ~]# vim /etc/profile
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.141-2.6.10.1.el7_3.x86_64
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
[root@localhost ~]# source /etc/profile
[root@localhost ~]# javac -v
javac: invalid flag: -v
Usage: javac <options> <source files>
use -help for a list of possible options
[root@localhost ~]# java -version
java version "1.7.0_141"
OpenJDK Runtime Environment (rhel-2.6.10.1.el7_3-x86_64 u141-b02)
OpenJDK 64-Bit Server VM (build 24.141-b02, mixed mode)
[root@localhost ~]# javac -version
javac 1.7.0_141
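The long JAVA_HOME path above is version-specific. If your OpenJDK build differs, a quick way to find the right path is to resolve the javac symlink (a sketch, assuming the alternatives-managed symlinks that the yum packages set up):

# Resolve the /usr/bin/javac symlink chain to the real install directory
readlink -f "$(which javac)"
# Example: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.141-2.6.10.1.el7_3.x86_64/bin/javac
# JAVA_HOME is that path with the trailing /bin/javac removed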
 
IV. Installing and Configuring Hadoop
1. Download Hadoop
wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
2. Extract the archive
[root@localhost ~]# mv hadoop-1.2.1.tar.gz /opt/
[root@localhost ~]# cd /opt
[root@localhost opt]# tar -zxvf hadoop-1.2.1.tar.gz
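The configuration files edited in the next step live in the conf/ directory of the extracted tree; a quick check:

# List the bundled configuration files
ls /opt/hadoop-1.2.1/conf/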
3. Edit the following four files under /opt/hadoop-1.2.1/conf: mapred-site.xml, hdfs-site.xml, core-site.xml, and hadoop-env.sh.
mapred-site.xml:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
 
<!-- Put site-specific property overrides in this file. -->
 
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
hdfs-site.xml:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
 
<!-- Put site-specific property overrides in this file. -->
 
<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/hadoop/data</value>
  </property>
</configuration>
core-site.xml:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
 
<!-- Put site-specific property overrides in this file. -->
 
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>

  <property>
    <name>hadoop.tmp.dir</name>
    <value>/hadoop</value>
  </property>

  <property>
    <name>dfs.name.dir</name>
    <value>/hadoop/name</value>
  </property>
</configuration>
hadoop-env.sh:
# The java implementation to use. Required.
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.141-2.6.10.1.el7_3.x86_64
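The /hadoop, /hadoop/data, and /hadoop/name paths used above are choices made for this walkthrough, not Hadoop defaults. Creating them before the first format avoids permission surprises; a minimal sketch, assuming everything runs as root as in the transcripts:

# Create the directories referenced by core-site.xml and hdfs-site.xml
mkdir -p /hadoop/data /hadoop/name
# If Hadoop will run as a non-root user, chown the tree to that user instead
ls -ld /hadoop /hadoop/data /hadoop/name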
4. Add environment variables
[root@localhost conf]# vim /etc/profile
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.141-2.6.10.1.el7_3.x86_64
export JRE_HOME=$JAVA_HOME/jre
export HADOOP_HOME=/opt/hadoop-1.2.1
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:$PATH
[root@localhost conf]# source /etc/profile
 
V. Testing
[root@localhost conf]# whereis hadoop
hadoop: /opt/hadoop-1.2.1/bin/hadoop
[root@localhost conf]# hadoop namenode -format
[root@localhost ~]# start-all.sh
(enter the password three times when prompted; see the passwordless SSH sketch after the transcript)
[root@localhost current]# jps
6262 DataNode
6621 TaskTracker
6406 SecondaryNameNode
6136 NameNode
6665 Jps
6489 JobTracker
[root@localhost current]# hadoop fs -ls /
Warning: $HADOOP_HOME is deprecated.
 
Found 1 items
drwxr-xr-x - root supergroup 0 2017-08-20 11:23 /hadoop
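start-all.sh asks for the password three times because it starts each daemon over SSH to localhost. Setting up passwordless SSH removes the prompts; a minimal sketch, assuming the root account used in the transcripts:

# Generate a key pair with an empty passphrase (skip if ~/.ssh/id_rsa already exists)
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# Authorize the key for SSH logins to localhost
ssh-copy-id root@localhost
# Verify: this should return without asking for a password
ssh localhost exit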
 
VI. Common Problems
1. How to view the logs
http://localhost:50070/logs
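Besides the web UI above, the daemon logs are written to $HADOOP_HOME/logs on disk by default; a quick sketch:

# Daemon logs live under $HADOOP_HOME/logs by default
ls /opt/hadoop-1.2.1/logs
# Follow the NameNode log (file names look like hadoop-<user>-namenode-<hostname>.log)
tail -f /opt/hadoop-1.2.1/logs/hadoop-root-namenode-*.log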
2. The DataNode fails to start
http://www.cnblogs.com/justinzhang/p/4255303.html
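One frequent cause on Hadoop 1.x is a namespaceID mismatch between the NameNode and DataNode after the NameNode has been reformatted. A minimal sketch of one way to check and recover, assuming the /hadoop/name and /hadoop/data directories configured above; note that clearing the data directory deletes HDFS block data, so only do this on a throwaway single-node setup:

# Stop the daemons before touching the storage directories
stop-all.sh
# Compare the namespaceIDs; the DataNode refuses to start if they differ
cat /hadoop/name/current/VERSION
cat /hadoop/data/current/VERSION
# On a fresh single-node install, clearing the DataNode directory and restarting is enough
rm -rf /hadoop/data/*
start-all.sh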