Running a Spark application in standalone-cluster mode (packaged with sbt)


Environment:

       Start one master and one slave on the same machine. Change into $SPARK_HOME.

       Start the master with ./sbin/start-master.sh

       Start the slave with ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://<sparkmaster>:7077


The Spark application:

       Create an application directory /home/snowman/test, and inside it create the source file src/main/scala/FirstProg.scala:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/snowman/kexueguairen.txt" // Should be some file on your system
    // Connect to the standalone master, pointing executors at the application jar
    val sc = new SparkContext("spark://sparkmaster:7077", "Simple App", "<$SPARK_HOME>",
      List("file:///home/snowman/test/target/scala-2.10/simple-project_2.10-1.0.jar"))
    val logData = sc.textFile(logFile, 2).cache()
    // Count the lines containing "a", "b", and "the" respectively
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    val numThes = logData.filter(line => line.contains("the")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
    println("Lines with the: %s".format(numThes))
  }
}
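
The four-argument SparkContext constructor used above (master URL, application name, Spark home, jar list) was the standard form at the time. As a minimal alternative sketch, Spark 0.9 also introduced SparkConf, and the same context can be built through it; the values below simply mirror those in the program:

import org.apache.spark.{SparkConf, SparkContext}

// Equivalent construction via SparkConf (added in Spark 0.9)
val conf = new SparkConf()
  .setMaster("spark://sparkmaster:7077")
  .setAppName("Simple App")
  .setSparkHome("<$SPARK_HOME>")
  .setJars(Seq("file:///home/snowman/test/target/scala-2.10/simple-project_2.10-1.0.jar"))
val sc = new SparkContext(conf)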

       Create the sbt build file firstapp.sbt:

name := "Simple Project"version := "1.0"scalaVersion := "2.10.3"libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

       Package with sbt. On the first run, launch sbt by itself first; that run downloads many dependencies and is slow. Then run:

       sbt package

       To repackage, run sbt clean and then sbt package.

       This generates target/scala-2.10/simple-project_2.10-1.0.jar
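
Before submitting to the cluster, the same filtering logic can be smoke-tested against a local in-process master. This sketch is an illustration only; the object name LocalCheck and the sample lines are invented:

import org.apache.spark.SparkContext

object LocalCheck {
  def main(args: Array[String]) {
    // "local[2]" runs Spark in-process with two worker threads; no cluster needed
    val sc = new SparkContext("local[2]", "Local Check")
    val data = sc.parallelize(Seq("a line", "b line", "the last line"))
    println(data.filter(_.contains("a")).count()) // prints 2: "a line" and "the last line"
    sc.stop()
  }
}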


       Running the application

         From $SPARK_HOME, run ./bin/spark-class org.apache.spark.deploy.Client launch spark://sparkmaster:7077 file:///home/snowman/test/target/scala-2.10/simple-project_2.10-1.0.jar SimpleApp
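
The launch subcommand takes the master URL, the jar URL, and the main class in that order; anything after the main class should be passed through to the program as application options (my reading of the 0.9 client, so treat it as an assumption). The sketch below shows how SimpleApp could read the input file from such an option instead of hard-coding it; the argument handling is hypothetical:

object SimpleApp {
  def main(args: Array[String]) {
    // Hypothetical: take the input file as the first application option,
    // falling back to the path hard-coded in the original program
    val logFile = if (args.nonEmpty) args(0) else "/home/snowman/kexueguairen.txt"
    // ... the rest of the program is unchanged ...
  }
}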


        Web monitoring

           The monitoring page at http://<spark_master_ip>:8080 shows the application that was just run, including its result.


            The program's standard output and standard error can be found in the stdout and stderr files under the work/driver-*** directory on the worker.


