Spark 2.x Cluster Installation

Spark Installation (without Cloudera Manager)

Because the Spark version bundled with Cloudera Manager is 1.6, Spark 2.1.1 is installed separately here.
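Before installing, it can be worth confirming which Spark version the cluster currently ships. A quick check (assuming the CDH Spark client is on the PATH of the node you are on):

# Print the version of the Spark client currently installed (expected: 1.6.x from CDH)
spark-submit --version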
1. Scala environment

scp scala-2.11.11.tgz hd-26:/usr/local/
ssh hd-26 "cd /usr/local/; tar xf scala-2.11.11.tgz; rm -f scala-2.11.11.tgz; \
  ln -s scala-2.11.11 scala; \
  echo 'export SCALA_HOME=/usr/local/scala' >> /etc/profile; source /etc/profile"
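The commands above only cover hd-26. A short loop can install Scala on the remaining nodes the same way (a sketch, assuming the full node list is hd-26 through hd-30 and passwordless SSH is already configured):

# Distribute and install Scala on every node; adjust the host list to your cluster
for host in hd-26 hd-27 hd-28 hd-29 hd-30; do
  scp scala-2.11.11.tgz ${host}:/usr/local/
  ssh ${host} "cd /usr/local/; tar xf scala-2.11.11.tgz; rm -f scala-2.11.11.tgz; \
    ln -s scala-2.11.11 scala; \
    echo 'export SCALA_HOME=/usr/local/scala' >> /etc/profile"
done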
2. Spark
cp spark-2.1.1-bin-hadoop2.6.tgz /opt/soft
cd /opt/soft
tar xf spark-2.1.1-bin-hadoop2.6.tgz
cd ..
ln -s soft/spark-2.1.1-bin-hadoop2.6/ spark
cd spark/conf
cp spark-env.sh.template spark-env.sh
cp slaves.template slaves
echo -e "hd-26\nhd-27\nhd-28\nhd-30\n" >> slaves
echo "SPARK_EXECUTOR_CORES=2" >> spark-env.sh
echo "SPARK_EXECUTOR_MEMORY=2G" >> spark-env.sh
echo "SPARK_DRIVER_MEMORY=2G" >> spark-env.sh
echo "SPARK_MASTER_HOST=hd-29" >> spark-env.sh
echo "SPARK_MASTER_PORT=7077" >> spark-env.sh
echo "SPARK_WORKER_CORES=4" >> spark-env.sh
echo "SPARK_WORKER_MEMORY=2G" >> spark-env.sh
echo "SPARK_WORKER_PORT=7078" >> spark-env.sh
echo "JAVA_HOME=/usr/local/jdk1.8.0_77" >> spark-env.sh
echo "SPARK_HOME=/opt/spark" >> spark-env.sh
echo "HADOOP_CONF_DIR=/etc/hadoop/conf" >> spark-env.sh
echo "SCALA_HOME=/usr/local/scala" >> spark-env.sh
cd /opt/soft
scp -r spark-2.1.1-bin-hadoop2.6 hd-26:/opt/soft
ssh hd-26 "cd /opt; ln -s soft/spark-2.1.1-bin-hadoop2.6/ spark"
ssh hd-26 "echo 'SPARK_LOCAL_IP=hd-26' >> /opt/spark/conf/spark-env.sh"
ssh hd-27 "echo 'SPARK_LOCAL_IP=hd-27' >> /opt/spark/conf/spark-env.sh"
cd ../spark/sbin
./start-master.sh
./start-slaves.sh
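Once the master and workers are up, a quick way to verify the standalone cluster is to submit the bundled SparkPi example against the master URL. A sketch, assuming the examples jar sits in the standard spark-2.1.1-bin-hadoop2.6 layout:

# Submit the SparkPi example to the standalone master as a smoke test
/opt/spark/bin/spark-submit \
  --master spark://hd-29:7077 \
  --class org.apache.spark.examples.SparkPi \
  /opt/spark/examples/jars/spark-examples_2.11-2.1.1.jar 100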
3. Related web UIs
    • master: http://hd-29:8080 (default master web UI port)
    • worker on hd-28: http://hd-28:8081
    • other worker web UIs: http://${host}:8081
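Beyond the web pages, the daemons can also be checked from the shell (a sketch; jps ships with the JDK, and curl only confirms that the default UI ports respond):

# The master node should show a Master process, each worker a Worker process
ssh hd-29 jps | grep Master
ssh hd-28 jps | grep Worker
# Confirm the web UIs are reachable (8080 = master UI default, 8081 = worker UI default)
curl -s -o /dev/null -w "%{http_code}\n" http://hd-29:8080
curl -s -o /dev/null -w "%{http_code}\n" http://hd-28:8081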