Installing Standalone Spark on CentOS 6.7


1. System environment:
Linux: CentOS 6.7
JDK: jdk-1.8.0_131
Scala: Scala 2.12.2
Spark: Spark 2.1.1


2. Install the JDK
Install JDK 8 and set its environment variables.
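A minimal sketch of the /etc/profile entries, assuming the JDK is installed under /usr/java/jdk1.8.0_131 (the path used throughout this guide; adjust it to your own install):

export JAVA_HOME=/usr/java/jdk1.8.0_131
export PATH=$PATH:$JAVA_HOME/bin

After applying with source /etc/profile, the installation can be verified: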

[root@localhost local]# java -version
java version "1.8.0_131"
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)
[root@localhost local]# echo $JAVA_HOME
/usr/java/jdk1.8.0_131
[root@localhost local]#

3. Install Scala
3.1 Download and extract

[root@localhost ~]# wget https://downloads.lightbend.com/scala/2.12.2/scala-2.12.2.tgz
[root@localhost ~]# cp scala-2.12.2.tgz /usr/local/
[root@localhost ~]# cd /usr/local/
[root@localhost local]# tar -zxvf scala-2.12.2.tgz
[root@localhost local]# mv scala-2.12.2 scala

3.2 Configure environment variables

[root@localhost local]# vi /etc/profile

Add the following:

export SCALA_HOME=/usr/local/scala
export PATH=$PATH:$SCALA_HOME/bin

Apply the configuration:

[root@localhost local]# source /etc/profile

3.3 Verify the Scala installation

[root@localhost local]# scala -version
[root@localhost local]# scala
Welcome to Scala 2.12.2 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131).
Type in expressions for evaluation. Or try :help.

scala>
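At the scala> prompt, evaluating any expression confirms the toolchain end to end; this is the output the 2.12 REPL prints for a trivial input:

scala> 1 + 1
res0: Int = 2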

4. Install Spark
4.1 Download and extract
Spark download page: http://spark.apache.org/downloads.html

There are two ways to install Spark: download a prebuilt binary package, or build from source.
If you want to compile from scratch, download the source package. This is not recommended, however, because the build tools (Maven or sbt) need network access to the Maven repositories, which are blocked here, so compiling requires working around the firewall and is very cumbersome. After trying both Maven and sbt, I chose the prebuilt Spark package.


After selecting a Spark version, the Note section below the selector lists the range of Scala versions supported by that release; confirm that your Scala version meets the requirement before installing Spark. Note that the prebuilt Spark 2.1.1 package ships with its own Scala 2.11 runtime (hence the spark-examples_2.11-2.1.1.jar used below), so applications compiled against it must target Scala 2.11 even if a newer Scala is installed system-wide.
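The prebuilt package can also be fetched from the command line; the URL below assumes the Apache release archive layout for the 2.1.1/Hadoop 2.7 build used in the rest of this guide:

[root@localhost ~]# wget https://archive.apache.org/dist/spark/spark-2.1.1/spark-2.1.1-bin-hadoop2.7.tgz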

[root@localhost ~]# cp spark-2.1.1-bin-hadoop2.7.tgz /usr/local/
[root@localhost ~]# cd /usr/local/
[root@localhost local]# tar -zxvf spark-2.1.1-bin-hadoop2.7.tgz
[root@localhost local]# mv spark-2.1.1-bin-hadoop2.7 spark

4.2 Configure environment variables

[root@localhost conf]# vi /etc/profile
Add the following:

export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin

Apply the configuration:
[root@localhost conf]# source /etc/profile
4.3 Set the SPARK_EXAMPLES_JAR environment variable

[root@localhost local]# vi ~/.bash_profile
[root@localhost local]# vi /etc/profile

Add the following:

export SPARK_EXAMPLES_JAR=$SPARK_HOME/examples/jars/spark-examples_2.11-2.1.1.jar

Apply the configuration:
[root@localhost conf]# source /etc/profile
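This variable simply points at the examples jar bundled with the prebuilt package, which is handy for smoke tests; for instance, it can be passed straight to spark-submit (a hypothetical invocation, runnable once the setup below is complete):

[root@localhost spark]# bin/spark-submit --class org.apache.spark.examples.SparkPi $SPARK_EXAMPLES_JAR 10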

4.4 Modify the Spark configuration files
4.4.1 Configure spark.sh

[root@localhost local]# vi /etc/profile.d/spark.sh
Add the following:

export JAVA_HOME=/usr/java/jdk1.8.0_131
export SCALA_HOME=/usr/local/scala

Apply the configuration:
[root@localhost spark]# source /etc/profile.d/spark.sh
4.4.2 Configure spark-env.sh
[root@localhost conf]# cp /usr/local/spark/conf/spark-env.sh.template /usr/local/spark/conf/spark-env.sh
[root@localhost conf]# vi /usr/local/spark/conf/spark-env.sh

Add the following to spark-env.sh:

export SCALA_HOME=/usr/local/scala
export JAVA_HOME=/usr/java/jdk1.8.0_131
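The template file documents many more standalone-mode settings; if the defaults don't fit your machine, entries like the following can be added as well (illustrative values, not part of the original setup):

export SPARK_MASTER_HOST=localhost   # hostname the standalone master binds to
export SPARK_WORKER_MEMORY=1g        # memory available to each worker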

4.5 Start Spark

[root@localhost spark]# sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
[root@localhost spark]#
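To confirm the standalone cluster is actually up, jps should list a Master and a Worker process, and the master web UI is served on port 8080 by default. The bundled SparkPi example can also be run against the cluster; the master URL below assumes the default port 7077 on this host (the exact URL is shown at the top of the master log and in the web UI):

[root@localhost spark]# jps
[root@localhost spark]# bin/run-example --master spark://localhost:7077 SparkPi 10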

