Spark 2.2.0 Installation and Configuration


Dependencies:

Java 1.8
Scala 2.11.8
Hadoop 2.7.3
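
A quick sanity check that each dependency is installed and on the PATH (expected versions shown as comments):

java -version    # expect 1.8.0_...
scala -version   # expect Scala code runner version 2.11.8
hadoop version   # expect Hadoop 2.7.3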

Notes:

Host mappings:
192.168.238.100 node01
192.168.238.101 node02
192.168.238.102 node03
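
One way to apply these mappings, assuming root access, is to append them to /etc/hosts on each of the three machines:

cat >> /etc/hosts <<'EOF'
192.168.238.100 node01
192.168.238.101 node02
192.168.238.102 node03
EOF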

The master runs on node01; workers run on node02 and node03.
node01 must first be configured for passwordless SSH to node02 and node03 (a minimal setup is sketched below).
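
A minimal passwordless-SSH sketch, assuming the bingo user from the scp step later in this guide and that ssh-copy-id is available:

# On node01: generate a key pair (skip if one exists) and push it to the workers
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
ssh-copy-id bingo@node02
ssh-copy-id bingo@node03
# Verify: these should log in without a password prompt
ssh bingo@node02 hostname
ssh bingo@node03 hostname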

Modify the configuration

cd spark-2.2.0/conf/
spark-env.sh

cp spark-env.sh.template spark-env.sh
vi spark-env.sh

Append at the end of the file:

export JAVA_HOME=/export/servers/jdk1.8.0_102
export SCALA_HOME=/export/servers/scala-2.11.8
export HADOOP_HOME=/export/servers/hadoop-2.7.3
export HADOOP_CONF_DIR=/export/servers/hadoop-2.7.3/etc/hadoop
export SPARK_MASTER_IP=node01
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_MEMORY=1g

slaves

cp slaves.template slaves
vi slaves

Replace localhost with:

node02
node03

Copy the installation to node02 and node03:

scp -r /export/servers/spark-2.2.0/ bingo@node02:/export/servers/
scp -r /export/servers/spark-2.2.0/ bingo@node03:/export/servers/

Add the environment variables on each node:

vi /etc/profile

# Append:
export SPARK_HOME=/export/servers/spark-2.2.0
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin

Reload the environment on each node:
source /etc/profile
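
To confirm the variables took effect, a quick check (paths assume the layout above):

which spark-submit      # should print /export/servers/spark-2.2.0/bin/spark-submit
spark-submit --version  # should report version 2.2.0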

Starting and stopping on node01:

Start the master:
start-master.sh

Start all workers:
start-slaves.sh

Stop:
stop-master.sh
stop-slaves.sh

Web UI: http://node01:8080/
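
One way to verify the cluster, using the SparkPi example bundled with the stock Spark 2.2.0 distribution (the jar name below assumes a Scala 2.11 build, as used here):

# jps should list Master on node01, Worker on node02/node03
jps

# Submit the bundled SparkPi example to the standalone master:
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://node01:7077 \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.2.0.jar 100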