Hadoop 2.7.4 Pseudo-Distributed Installation (Following the Official Single-Node Guide)

Source: Internet · Editor: 程序博客网 · Date: 2024/06/06 00:02


1. Passwordless SSH login. Hadoop logs in to each node over SSH, so generate a key pair on every server and merge the public keys into authorized_keys (done here as root).
As the root user:
(1) CentOS does not enable passwordless SSH login by default. Uncomment these two lines in /etc/ssh/sshd_config (on every server):
RSAAuthentication yes    (remove the leading #)
PubkeyAuthentication yes    (remove the leading #)
(2) Run ssh-keygen -t rsa and press Enter at every prompt (no passphrase). This creates the .ssh folder under /root. Do this on every server.
(3) Merge the public key into the authorized_keys file: go to /root/.ssh and append it:
cat id_rsa.pub >> authorized_keys
chmod 644 authorized_keys
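Steps (2) and (3) can be rolled into one script. A minimal sketch, assuming OpenSSH's ssh-keygen is installed and you run it as the user that will start Hadoop; on a real cluster you would still copy each node's id_rsa.pub into every other node's authorized_keys:

```shell
# Sketch of steps (2)-(3): generate a key pair with no passphrase and
# authorize it for local login. Assumes OpenSSH is installed.
KEYDIR="$HOME/.ssh"
mkdir -p "$KEYDIR"
chmod 700 "$KEYDIR"
# -N "" = empty passphrase, -q = quiet; skip if a key already exists
[ -f "$KEYDIR/id_rsa" ] || ssh-keygen -t rsa -N "" -q -f "$KEYDIR/id_rsa"
# Append (merge) the public key into authorized_keys
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
chmod 644 "$KEYDIR/authorized_keys"
```

Afterwards, `ssh localhost` should log you in without a password prompt.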

2. Install the JDK. Hadoop 2.x requires at least JDK 7.
(1) Download "jdk-8u121-linux-x64.tar.gz" and upload it to the /usr/java directory (e.g. via the 'rz' command).
(2) Unpack it: tar -zxvf jdk-8u121-linux-x64.tar.gz
(3) Edit /etc/profile and add:
export JAVA_HOME=/usr/java/jdk1.8.0_121
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
(4) Apply the change: source /etc/profile
(5) Check with java -version. Done.
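Step (3) can be made idempotent so that rerunning it never duplicates the exports. A sketch, assuming the JDK was unpacked to /usr/java/jdk1.8.0_121 as above and that you have write access to /etc/profile:

```shell
# Append the JAVA_HOME exports to /etc/profile only if they are not
# already there (idempotent; safe to rerun).
PROFILE=/etc/profile
if ! grep -q 'JAVA_HOME=/usr/java/jdk1.8.0_121' "$PROFILE" 2>/dev/null; then
  cat >> "$PROFILE" <<'EOF'
export JAVA_HOME=/usr/java/jdk1.8.0_121
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
EOF
fi
```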
3. Install Hadoop
(1) Download "hadoop-2.7.4.tar.gz" and put it in the /home/hadoop directory.
(2) Unpack it: tar -xzvf hadoop-2.7.4.tar.gz
(3) vi /home/hadoop/hadoop-2.7.4/etc/hadoop/hadoop-env.sh and add:
  export JAVA_HOME=/usr/java/jdk1.8.0_121
(4) vi /etc/profile and add:
  export HADOOP_HOME=/home/hadoop/hadoop-2.7.4
  export PATH=.:$HADOOP_HOME/bin:$PATH
4. Run the Standalone example, copied verbatim from the official guide
#Standalone Operation
#By default, Hadoop is configured to run in a non-distributed mode, as a single Java process. This is useful for debugging.
#The following example copies the unpacked conf directory to use as input and then finds and displays every match of the given regular expression. Output is written to the given output directory.
Go into the /home/hadoop/hadoop-2.7.4 directory and run the following commands:
  $ mkdir input
  $ cp etc/hadoop/*.xml input
  $ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.4.jar grep input output 'dfs[a-z.]+'
  $ cat output/*
5. Configure core-site.xml
vi etc/hadoop/core-site.xml
Add:
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
6. Configure hdfs-site.xml
vi etc/hadoop/hdfs-site.xml
Add:
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
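A typo in either file will make the daemons fail at startup, so it can pay to confirm the edited files are well-formed XML first. A minimal sketch, assuming python3 is available; it is demonstrated here on a scratch copy of the core-site.xml fragment above, but in practice you would point it at $HADOOP_HOME/etc/hadoop/*.xml:

```shell
# Write the core-site.xml fragment to a scratch file so the check is
# self-contained.
cat > /tmp/core-site-check.xml <<'EOF'
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
EOF
# Parse the file and echo each property; a parse error means broken XML.
OUT=$(python3 - /tmp/core-site-check.xml <<'PY'
import sys
import xml.etree.ElementTree as ET
root = ET.parse(sys.argv[1]).getroot()
for prop in root.findall('property'):
    print(prop.findtext('name'), '=', prop.findtext('value'))
PY
)
echo "$OUT"   # prints "fs.defaultFS = hdfs://localhost:9000"
```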
7. Format the filesystem and start Hadoop
(1)$ bin/hdfs namenode -format
(2)$ sbin/start-dfs.sh
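Once start-dfs.sh returns, a few checks confirm the daemons are actually up. A sketch, run from the Hadoop directory on the machine itself; it assumes the default Hadoop 2.x ports and that jps and curl are installed:

```shell
jps                          # should list NameNode, DataNode, SecondaryNameNode
# The NameNode web UI listens on port 50070 by default in Hadoop 2.x
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:50070/
bin/hdfs dfs -mkdir -p /user/root    # smoke-test HDFS itself
sbin/stop-dfs.sh                     # shut the daemons down when finished
```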
