Configuring Spark 1.6.0 with Hadoop 2.6


The conf directory under the Spark installation holds Spark's configuration files.
1. Configure spark-env.sh

export JAVA_HOME=/usr/lib/java/jdk1.7.0_80
export SCALA_HOME=/usr/lib/scala/scala-2.10.4
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.6.0
export HADOOP_CONF_DIR=/usr/local/hadoop/hadoop-2.6.0/etc/hadoop
export SPARK_MASTER_IP=master
export SPARK_WORKER_MEMORY=4g
export SPARK_EXECUTOR_MEMORY=4g
export SPARK_DRIVER_MEMORY=4g
export SPARK_WORKER_CORES=8
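
spark-env.sh does not exist in a fresh binary distribution; Spark ships templates that are copied first. The path below matches the installation used in this guide, and the same template step applies to conf/slaves and conf/spark-defaults.conf in the next two steps:

cd /usr/local/spark/spark-1.6.0-bin-hadoop2.6
cp conf/spark-env.sh.template conf/spark-env.sh    # then append the exports above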

2. Configure slaves

slave01
slave02
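
Each line names one worker host; the master must be able to resolve these names and reach them over passwordless ssh. A minimal /etc/hosts sketch for every node, with placeholder IP addresses (substitute your own):

192.168.1.100   master
192.168.1.101   slave01
192.168.1.102   slave02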

3. Configure spark-defaults.conf

spark.executor.extraJavaOptions   -XX:+PrintGCDetails -DKey=value -Dnumbers="one two three"
spark.eventLog.enabled            true
spark.eventLog.dir                hdfs://master:9000/historyserverforSpark
spark.yarn.historyServer.address  master:18080
spark.history.fs.logDirectory     hdfs://master:9000/historyserverforSpark
#spark.default.parallelism        100
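
Like spark-env.sh, this file is created from its template. Note that spark.eventLog.dir (where applications write event logs) and spark.history.fs.logDirectory (where the history server reads them) must point to the same HDFS path, and the hdfs://master:9000 prefix must match the fs.defaultFS address in Hadoop's core-site.xml:

cp conf/spark-defaults.conf.template conf/spark-defaults.conf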

4. Configure ~/.bashrc and source it after editing. JAVA_HOME must also be exported here; the path below matches the one used in spark-env.sh.

export JAVA_HOME=/usr/lib/java/jdk1.7.0_80
export JRE_HOME=${JAVA_HOME}/jre
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.6.0
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib"
export SCALA_HOME=/usr/lib/scala/scala-2.10.4
export SPARK_HOME=/usr/local/spark/spark-1.6.0-bin-hadoop2.6
export IDEA_HOME=/usr/local/idea/idea-IC-135.1306
export CLASS_PATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${IDEA_HOME}/bin:${SPARK_HOME}/bin:${SCALA_HOME}/bin:${JAVA_HOME}/bin:${HADOOP_HOME}/bin:$PATH
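
After sourcing, a quick sanity check that each tool resolves from the PATH:

source ~/.bashrc
java -version       # expect 1.7.0_80
scala -version      # expect 2.10.4
hadoop version      # expect 2.6.0
echo $SPARK_HOME    # expect /usr/local/spark/spark-1.6.0-bin-hadoop2.6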

5. Copy the entire spark-1.6.0-bin-hadoop2.6 directory and the .bashrc file to every other node in the cluster, then source .bashrc on each node (a sketch follows).
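
A sketch of this step using scp, assuming passwordless ssh and the same directory layout on every node (the host names come from the slaves file):

for node in slave01 slave02; do
  scp -r /usr/local/spark/spark-1.6.0-bin-hadoop2.6 ${node}:/usr/local/spark/
  scp ~/.bashrc ${node}:~/
done

New login shells read .bashrc automatically; in any shell that is already open on a node, run source ~/.bashrc by hand.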
6. Create the directory for the history server, historyserverforSpark, on HDFS

hadoop dfs -mkdir /historyserverforSpark
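
On Hadoop 2.x the hadoop dfs form is deprecated in favor of the hdfs CLI; the equivalent command, plus a listing to confirm the directory exists:

hdfs dfs -mkdir /historyserverforSpark
hdfs dfs -ls /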

7. Start Spark and the history server

./start-all.sh
./start-history-server.sh
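
Both scripts live in $SPARK_HOME/sbin, and HDFS should already be running so the history server can reach its log directory. To verify the cluster, check the daemons with jps, browse the web UIs (default ports: http://master:8080 for the standalone master, http://master:18080 for the history server), and submit the bundled SparkPi example; the jar path below is where the 1.6.0 binary distribution places the examples:

jps    # master node: Master, HistoryServer; worker nodes: Worker

$SPARK_HOME/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://master:7077 \
  $SPARK_HOME/lib/spark-examples-1.6.0-hadoop2.6.0.jar 100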