Installing Hadoop 2.7.4


1. Download Hadoop 2.7.4 from http://www.apache.org/dyn/closer.cgi/Hadoop/common/hadoop-2.7.4/hadoop-2.7.4.tar.gz

2. Set up passwordless SSH login

ssh-keygen -t rsa

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

ssh localhost
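The three commands above can be made non-interactive and safe to re-run; a minimal sketch, assuming OpenSSH and that `~/.ssh` may not exist yet:

```shell
# Create ~/.ssh with the permissions sshd expects
mkdir -p ~/.ssh && chmod 700 ~/.ssh

# Generate an RSA key pair only if one does not already exist
# (-N "" sets an empty passphrase so logins need no prompt)
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Authorize the key for logins to this machine; sshd ignores the
# file if it is group- or world-writable, hence the chmod
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

After this, `ssh localhost` should log in without a password prompt. On macOS, Remote Login must also be enabled in System Preferences for sshd to be running at all.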

3. Edit the Hadoop configuration files

a).core-site.xml

    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/Users/lxiong/myapp/Project/hadoop/temp/</value>
        <description>A base for other temporary directories.</description>
    </property>

b).hdfs-site.xml

    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/Users/lxiong/myapp/Project/hadoop/temp/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/Users/lxiong/myapp/Project/hadoop/temp/dfs/data</value>
    </property>

c).mapred-site.xml

    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>

d).yarn-site.xml

    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>

e).hadoop-env.sh

export JAVA_HOME=/Library/Java/JavaVirtualMachines/1.8.0.jdk/Contents/Home   # adjust to your own JDK install path
export HADOOP_HOME=/Users/lxiong/myapp/install/hadoop-2.7.4   # adjust to your own Hadoop install path
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib:$HADOOP_COMMON_LIB_NATIVE_DIR"
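On macOS, the JDK path for hadoop-env.sh can be derived instead of hard-coded; a sketch using the system's `java_home` helper (macOS-only, so this is an alternative to the literal path above, not a replacement on Linux):

```shell
# macOS: ask the system which JDK matches version 1.8 instead of
# hard-coding the /Library/Java/... path
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
```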


4. Set the system environment variables

export HADOOP_HOME=/Users/lxiong/myapp/install/hadoop-2.7.4   # adjust to your own Hadoop install path
export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib:$HADOOP_COMMON_LIB_NATIVE_DIR"

After configuring, `source` the profile file (e.g. ~/.bash_profile or ~/.bashrc) so the new variables take effect in the current shell.

5. Create three local directories matching the paths configured above

/Users/lxiong/myapp/Project/hadoop/temp/dfs/name

/Users/lxiong/myapp/Project/hadoop/temp/dfs/data

/Users/lxiong/myapp/Project/hadoop/temp
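All three can be created in one command, since `mkdir -p` also creates the missing parents (including the `temp` base directory). This sketch uses `$HOME` so the command is portable; substitute the exact paths from your core-site.xml and hdfs-site.xml:

```shell
# Base path corresponding to hadoop.tmp.dir above; adjust to your own layout
BASE="$HOME/myapp/Project/hadoop/temp"

# -p creates every missing parent, so this covers temp, dfs/name, and dfs/data
mkdir -p "$BASE/dfs/name" "$BASE/dfs/data"
```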

6. Format the HDFS filesystem: hdfs namenode -format

7. Start the daemons: sbin/start-all.sh (start-all.sh is deprecated in Hadoop 2.x; sbin/start-dfs.sh followed by sbin/start-yarn.sh does the same thing)

8. Visit localhost:50070 (HDFS NameNode web UI) and localhost:8088 (YARN ResourceManager web UI) to verify everything is running

9. Browse the HDFS filesystem, e.g.: hadoop fs -ls /

