Spark Installation

scala-2.11.8.tgz
spark-2.0.0-bin-hadoop2.7.tgz

Scala download page
http://www.scala-lang.org/download/2.11.8.html

Spark download page
http://spark.apache.org/downloads.html

#cd /home/server/env/spark
wget http://downloads.lightbend.com/scala/2.11.8/scala-2.11.8.tgz
wget http://d3kbcqa49mib13.cloudfront.net/spark-2.0.0-bin-hadoop2.7.tgz

tar zxvf scala-2.11.8.tgz
tar zxvf spark-2.0.0-bin-hadoop2.7.tgz
#mv scala-2.11.8 scala
#mv spark-2.0.0-bin-hadoop2.7 spark

Configure environment variables in /etc/profile
export JAVA_HOME=/home/server/env/jdk/jdk1.7.0_79
export SCALA_HOME=/home/server/env/spark/scala
export SPARK_HOME=/home/server/env/spark/spark
export PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$SPARK_HOME/sbin
Make the variables take effect:
source /etc/profile
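A quick sanity check that the PATH was assembled correctly can be scripted. The sketch below rebuilds the PATH exactly as the /etc/profile lines above do and only inspects strings, so it runs even before the software is installed; all paths are this guide's layout.

```shell
# Rebuild PATH the same way /etc/profile above does.
JAVA_HOME=/home/server/env/jdk/jdk1.7.0_79
SCALA_HOME=/home/server/env/spark/scala
SPARK_HOME=/home/server/env/spark/spark
PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$SPARK_HOME/sbin

# Verify every tool directory actually landed on PATH.
for d in "$JAVA_HOME/bin" "$SCALA_HOME/bin" "$SPARK_HOME/bin" "$SPARK_HOME/sbin"; do
    case ":$PATH:" in
        *":$d:"*) echo "on PATH: $d" ;;
        *)        echo "MISSING: $d" ;;
    esac
done
```

If any line prints MISSING, re-check the exports before continuing.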

Modify the Spark configuration
cd spark/conf
cp spark-env.sh.template spark-env.sh
cp slaves.template slaves


Edit spark-env.sh
Append at the end of the file:
export JAVA_HOME=/home/server/env/jdk/jdk1.7.0_79   # JDK used by the Spark daemons
export SCALA_HOME=/home/server/env/spark/scala
export SPARK_MASTER_IP=192.168.10.111               # address the master binds to
export SPARK_WORKER_MEMORY=15g                      # memory a worker may hand out to executors
export HADOOP_CONF_DIR=/home/server/env/spark/spark/conf   # Hadoop client config dir (here the Spark conf dir itself)


Edit slaves
Add the cluster machines:
#localhost
192.168.10.111
192.168.10.113
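The slaves edit above (drop localhost, list the worker IPs) can be sketched as a script. This works on a scratch copy in a temp directory so nothing real is touched; the template contents and file names are assumptions standing in for the real conf directory.

```shell
set -e
conf=$(mktemp -d)   # scratch stand-in for $SPARK_HOME/conf

# Minimal stand-in for the shipped slaves.template.
printf '# A Spark Worker starts on each machine listed below.\nlocalhost\n' > "$conf/slaves.template"

# Derive slaves from the template: drop the localhost entry,
# then append the cluster machines used in this guide.
grep -v '^localhost$' "$conf/slaves.template" > "$conf/slaves"
printf '192.168.10.111\n192.168.10.113\n' >> "$conf/slaves"

cat "$conf/slaves"
```

The same edit done by hand is just commenting out localhost and adding the two IPs, as shown above.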


Run the services as the Agl user
useradd Agl
Set up SSH key authentication for Agl
su - Agl
ssh-keygen -t rsa
ssh-copy-id -i ~/.ssh/id_rsa.pub 360xh02
ssh-copy-id -i ~/.ssh/id_rsa.pub 360xh04
Change the owner and group of the spark directory to Agl (e.g. chown -R Agl:Agl spark), so that it shows:
drwxr-xr-x.  4 Agl      Agl        58 Aug 29 15:47 spark

The .113 worker uses the same configuration as the master; just scp a copy over, for example:
scp -r /home/server/env/spark 192.168.10.113:/home/server/env/

With configuration done, test that Spark starts correctly:
cd /home/server/env/spark/spark/sbin
./start-all.sh

Check with jps: the master now shows a Master process, and the worker machines have started their Worker processes.