Spark Installation

Source: Internet · Editor: 程序博客网 · Date: 2024/05/23

Prerequisite: Hadoop is already installed.

Installing Scala

1. Extract the archive

[root@yufan scala]# tar -zxvf scala-2.12.1.tgz
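The same `tar` flags work for any gzip'd tarball; a self-contained round-trip sketch (all paths here are illustrative demo paths, not the real Scala archive):

```shell
# Sketch: pack and unpack a gzip'd tarball with the same flags as above.
mkdir -p /tmp/scala-demo/pkg
echo "demo" > /tmp/scala-demo/pkg/file.txt
tar -C /tmp/scala-demo -czf /tmp/scala-demo/pkg.tgz pkg   # create (-c) a gzip'd (-z) archive
mkdir -p /tmp/scala-demo/out
tar -C /tmp/scala-demo/out -zxvf /tmp/scala-demo/pkg.tgz  # extract (-x), as in the step above
cat /tmp/scala-demo/out/pkg/file.txt
```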

2. Configure environment variables

Edit the /etc/profile file, adding SCALA_HOME and updating PATH, then run scala to verify the installation:

[root@yufan scala]# vim /etc/profile
[root@yufan scala]# source /etc/profile
[root@yufan scala]# scala
Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_92).
Type in expressions for evaluation. Or try :help.

scala>


export SCALA_HOME=/home/bigdata/scala/scala-2.12.1
export PATH=.:$SCALA_HOME/bin:$PATH
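Appending these lines every time the step is repeated duplicates them in /etc/profile; a hedged sketch of an idempotent append (using a scratch file in place of the real /etc/profile, so it is safe to run anywhere):

```shell
# Sketch: append the export lines only if they are not already present.
# /tmp/profile.demo stands in for /etc/profile.
profile=/tmp/profile.demo
: > "$profile"                                   # start from an empty file for the demo
add_once() {                                     # append $1 unless an identical line exists
    grep -qxF "$1" "$profile" || echo "$1" >> "$profile"
}
add_once 'export SCALA_HOME=/home/bigdata/scala/scala-2.12.1'
add_once 'export PATH=.:$SCALA_HOME/bin:$PATH'
add_once 'export SCALA_HOME=/home/bigdata/scala/scala-2.12.1'   # repeated call: no effect
cat "$profile"
```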


Installing Spark

3. Extract the archive

[root@yufan spark]# tar -zxvf spark-2.1.0-bin-without-hadoop.tgz 

4. Modify the configuration files

[root@yufan spark]# cd spark-2.1.0/conf/
[root@yufan conf]# cp spark-env.sh.template spark-env.sh
[root@yufan conf]# cp slaves.template slaves
[root@yufan conf]# ls
docker.properties.template  metrics.properties.template  spark-defaults.conf.template
fairscheduler.xml.template  slaves                       spark-env.sh
log4j.properties.template   slaves.template              spark-env.sh.template
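The copy-from-template pattern above generalizes to every file in conf/; a sketch that activates all *.template files at once (it builds its own demo directory rather than touching a real Spark tree):

```shell
# Sketch: copy every *.template to its active name, leaving existing configs untouched.
conf=/tmp/conf-demo
mkdir -p "$conf"
touch "$conf/spark-env.sh.template" "$conf/slaves.template" "$conf/log4j.properties.template"
cd "$conf"
for t in *.template; do
    f="${t%.template}"            # strip the .template suffix
    [ -e "$f" ] || cp "$t" "$f"   # skip files that were already activated
done
ls
```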
5. Edit spark-env.sh

If your Spark package is the "without hadoop" build (i.e. not built against a specific Hadoop version), you must also set the SPARK_DIST_CLASSPATH property to point Spark at the Hadoop/HDFS jars manually:

export JAVA_HOME=/home/bigdata/java/jdk1.8/
export SCALA_HOME=/home/bigdata/scala/scala-2.12.1
export HADOOP_HOME=/home/bigdata/hadoop/hadoop-2.2
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_DIST_CLASSPATH=$(/home/bigdata/hadoop/hadoop-2.2/bin/hadoop classpath)
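The last line simply captures the output of `hadoop classpath`, which prints a colon-separated list of Hadoop's config directory and jars. A sketch with a stub standing in for the real hadoop binary (the stub and its printed path are illustrative, not real output from this cluster):

```shell
# Sketch: SPARK_DIST_CLASSPATH is just the text printed by `hadoop classpath`.
# A stub script stands in for the real hadoop binary here.
stub=/tmp/hadoop-stub
cat > "$stub" <<'EOF'
#!/bin/sh
# Illustrative output; the real command lists the installed Hadoop jar paths.
echo "/home/bigdata/hadoop/hadoop-2.2/etc/hadoop:/home/bigdata/hadoop/hadoop-2.2/share/hadoop/common/*"
EOF
chmod +x "$stub"
export SPARK_DIST_CLASSPATH=$("$stub" classpath)
echo "$SPARK_DIST_CLASSPATH"
```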

6. Edit slaves

Add the worker nodes, i.e. every node except the master:

yufan1
yufan2


7. Configure environment variables in /etc/profile

export SPARK_HOME=/home/bigdata/spark/spark-2.1.0
export PATH=.:$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH


8. Use scp -r to copy the Scala and Spark directories to the other nodes
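This step can be scripted by looping over the slaves file from step 6; a dry-run sketch that only prints the scp commands it would execute (hostnames and paths are this tutorial's examples, and the slaves file is recreated in /tmp so the sketch is self-contained):

```shell
# Sketch: build the scp command for each worker listed in a slaves file (dry run).
slaves=/tmp/slaves.demo
printf '%s\n' yufan1 yufan2 > "$slaves"          # worker list as in step 6
cmds=$(while read -r host; do
    [ -n "$host" ] || continue                   # skip blank lines
    echo "scp -r /home/bigdata/spark/spark-2.1.0 root@$host:/home/bigdata/spark/"
done < "$slaves")
echo "$cmds"                                     # print instead of executing
```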


9. Start Spark with start-all.sh (invoke it via $SPARK_HOME/sbin/start-all.sh, since Hadoop ships a script with the same name)

10. Run spark-shell

[root@yufan sbin]# spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/08/01 11:27:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://192.168.10.30:4040
Spark context available as 'sc' (master = local[*], app id = local-1501558080783).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_92)
Type in expressions to have them evaluated.
Type :help for more information.

scala> var file = sc.textFile("hdfs://localhost:50040/10.txt");



