Hadoop/Spark Installation in Practice (Series Part 6): Entering the New World of Spark

Source: Internet | Editor: 程序博客网 (Programmer Blog Network) | Date: 2024/05/20 19:48


Spark installation

1. Extract the Spark tarball

[root@localhost setup_tools]# tar  -zxvf  spark-1.0.0-bin-hadoop1.tgz

2. Move the directory to /usr/local
[root@localhost setup_tools]# mv  spark-1.0.0-bin-hadoop1 /usr/local
3. Configure environment variables
[root@localhost local]# vi /etc/profile
export SPARK_HOME=/usr/local/spark-1.0.0-bin-hadoop1
Apply the change:
[root@localhost local]# source /etc/profile
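For convenience, the same /etc/profile edit can also put Spark's launch scripts on the PATH — a sketch assuming the install location used above:

```shell
# /etc/profile additions (install path as used in this article)
export SPARK_HOME=/usr/local/spark-1.0.0-bin-hadoop1
# bin/ holds spark-shell and spark-submit; sbin/ holds start-all.sh etc.
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
```

With this in place, `spark-shell` can be launched from any directory, as step 8 does.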

4. Configure conf/spark-env.sh
[root@localhost conf]# pwd
/usr/local/spark-1.0.0-bin-hadoop1/conf

[root@localhost conf]# ls
fairscheduler.xml.template  log4j.properties.template
metrics.properties.template  slaves  spark-defaults.conf.template
spark-env.sh.template

[root@localhost conf]# vi spark-env.sh

export SCALA_HOME=/usr/local/scala-2.10.4
export JAVA_HOME=/usr/local/jdk1.7.0_79
export SPARK_MASTER_IP=192.168.2.100
export SPARK_WORKER_MEMORY=512m
export HADOOP_CONF_DIR=/usr/local/hadoop-1.2.1/conf
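The same spark-env.sh with a comment per line explaining what each setting controls (the values are this article's; adjust them to your own paths):

```shell
# conf/spark-env.sh -- values from this article
export SCALA_HOME=/usr/local/scala-2.10.4            # Scala runtime for Spark 1.0.0
export JAVA_HOME=/usr/local/jdk1.7.0_79              # JDK used by the Spark daemons
export SPARK_MASTER_IP=192.168.2.100                 # address the standalone Master binds to
export SPARK_WORKER_MEMORY=512m                      # total memory a Worker may give to executors
export HADOOP_CONF_DIR=/usr/local/hadoop-1.2.1/conf  # where Spark finds the HDFS configuration
```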

5. Edit conf/slaves

It already contains:
localhost
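localhost is all a single-node setup needs. On a real cluster, conf/slaves would instead list one Worker hostname per line; the Master's start-all.sh connects to each listed host over SSH and starts a Worker there. The hostnames below are hypothetical:

```
spark-worker1
spark-worker2
```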

6. Start Hadoop (the "Stop it first" messages below simply mean the Hadoop daemons were already running)
 

[root@localhost bin]# start-all.sh
Warning: $HADOOP_HOME is deprecated.

namenode running as process 26851. Stop it first.
localhost: datanode running as process 26949. Stop it first.
localhost: secondarynamenode running as process 27046. Stop it first.
jobtracker running as process 27124. Stop it first.
localhost: tasktracker running as process 27240. Stop it first.
[root@localhost bin]# pwd
/usr/local/hadoop-1.2.1/bin
[root@localhost bin]# jps
26949 DataNode
27046 SecondaryNameNode
26851 NameNode
27240 TaskTracker
29228 Jps
27124 JobTracker

7. Start Spark

[root@localhost sbin]# pwd
/usr/local/spark-1.0.0-bin-hadoop1/sbin
[root@localhost sbin]#

[root@localhost sbin]# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark-1.0.0-bin-hadoop1/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-1.0.0-bin-hadoop1/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
[root@localhost sbin]#

Verify:
[root@localhost sbin]# jps
29543 Jps
29459 Worker
26949 DataNode
29344 Master
27046 SecondaryNameNode
26851 NameNode
27240 TaskTracker
27124 JobTracker
[root@localhost sbin]#

8. Enter the new world of spark-shell!

[root@localhost sbin]# spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
15/09/12 07:38:25 INFO spark.SecurityManager: Changing view acls to: root
15/09/12 07:38:25 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root)
15/09/12 07:38:25 INFO spark.HttpServer: Starting HTTP Server
15/09/12 07:38:26 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/12 07:38:26 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:35813
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.0.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) Client VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
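From here, a quick sanity check inside the shell — a minimal sketch (spark-shell creates the `sc` SparkContext for you; this session is illustrative, not captured from the install above):

```scala
scala> val nums = sc.parallelize(1 to 100)   // distribute 1..100 as an RDD
scala> nums.reduce(_ + _)                    // sum it across the cluster
res0: Int = 5050
```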


9. Open the Spark Master web UI from a browser on the host machine (by default at http://192.168.2.100:8080); it reports:


URL: spark://192.168.2.100:7077
Workers: 1
Cores: 1 Total, 0 Used
Memory: 512.0 MB Total, 0.0 B Used
Applications: 0 Running, 0 Completed
Drivers: 0 Running, 0 Completed
Status: ALIVE

 

Workers

Id                                  Address       State  Cores       Memory
worker-20150912073713-spark0-60571  spark0:60571  ALIVE  1 (0 Used)  512.0 MB (0.0 B Used)